Feb 20 06:37:22 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Feb 20 06:37:22 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 20 06:37:22 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 20 06:37:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 20 06:37:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 20 06:37:22 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 20 06:37:22 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 20 06:37:22 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 20 06:37:22 localhost kernel: signal: max sigframe size: 1776
Feb 20 06:37:22 localhost kernel: BIOS-provided physical RAM map:
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 20 06:37:22 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Feb 20 06:37:22 localhost kernel: NX (Execute Disable) protection: active
Feb 20 06:37:22 localhost kernel: SMBIOS 2.8 present.
Feb 20 06:37:22 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 20 06:37:22 localhost kernel: Hypervisor detected: KVM
Feb 20 06:37:22 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 20 06:37:22 localhost kernel: kvm-clock: using sched offset of 2856977710 cycles
Feb 20 06:37:22 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 20 06:37:22 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 20 06:37:22 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 20 06:37:22 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 20 06:37:22 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Feb 20 06:37:22 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 20 06:37:22 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 20 06:37:22 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 20 06:37:22 localhost kernel: Using GB pages for direct mapping
Feb 20 06:37:22 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Feb 20 06:37:22 localhost kernel: ACPI: Early table checksum verification disabled
Feb 20 06:37:22 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 20 06:37:22 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 20 06:37:22 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 20 06:37:22 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 20 06:37:22 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 20 06:37:22 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 20 06:37:22 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 20 06:37:22 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 20 06:37:22 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 20 06:37:22 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 20 06:37:22 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 20 06:37:22 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 20 06:37:22 localhost kernel: No NUMA configuration found
Feb 20 06:37:22 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Feb 20 06:37:22 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd5000-0x43fffffff]
Feb 20 06:37:22 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Feb 20 06:37:22 localhost kernel: Zone ranges:
Feb 20 06:37:22 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 20 06:37:22 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 20 06:37:22 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Feb 20 06:37:22 localhost kernel:   Device   empty
Feb 20 06:37:22 localhost kernel: Movable zone start for each node
Feb 20 06:37:22 localhost kernel: Early memory node ranges
Feb 20 06:37:22 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 20 06:37:22 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 20 06:37:22 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Feb 20 06:37:22 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Feb 20 06:37:22 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 20 06:37:22 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 20 06:37:22 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 20 06:37:22 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 20 06:37:22 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 20 06:37:22 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 20 06:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 20 06:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 20 06:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 20 06:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 20 06:37:22 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 20 06:37:22 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 20 06:37:22 localhost kernel: TSC deadline timer available
Feb 20 06:37:22 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 20 06:37:22 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 20 06:37:22 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 20 06:37:22 localhost kernel: Booting paravirtualized kernel on KVM
Feb 20 06:37:22 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 20 06:37:22 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 20 06:37:22 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Feb 20 06:37:22 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Feb 20 06:37:22 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 20 06:37:22 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 20 06:37:22 localhost kernel: Fallback order for Node 0: 0 
Feb 20 06:37:22 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Feb 20 06:37:22 localhost kernel: Policy zone: Normal
Feb 20 06:37:22 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 20 06:37:22 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Feb 20 06:37:22 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 20 06:37:22 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 20 06:37:22 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 20 06:37:22 localhost kernel: software IO TLB: area num 8.
Feb 20 06:37:22 localhost kernel: Memory: 2873456K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741260K reserved, 0K cma-reserved)
Feb 20 06:37:22 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Feb 20 06:37:22 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 20 06:37:22 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Feb 20 06:37:22 localhost kernel: ftrace: allocated 176 pages with 3 groups
Feb 20 06:37:22 localhost kernel: Dynamic Preempt: voluntary
Feb 20 06:37:22 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 20 06:37:22 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 20 06:37:22 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 20 06:37:22 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 20 06:37:22 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 20 06:37:22 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 20 06:37:22 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 20 06:37:22 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 20 06:37:22 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 20 06:37:22 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 20 06:37:22 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Feb 20 06:37:22 localhost kernel: Console: colour VGA+ 80x25
Feb 20 06:37:22 localhost kernel: printk: console [tty0] enabled
Feb 20 06:37:22 localhost kernel: printk: console [ttyS0] enabled
Feb 20 06:37:22 localhost kernel: ACPI: Core revision 20211217
Feb 20 06:37:22 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 20 06:37:22 localhost kernel: x2apic enabled
Feb 20 06:37:22 localhost kernel: Switched APIC routing to physical x2apic.
Feb 20 06:37:22 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 20 06:37:22 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 20 06:37:22 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 20 06:37:22 localhost kernel: LSM: Security Framework initializing
Feb 20 06:37:22 localhost kernel: Yama: becoming mindful.
Feb 20 06:37:22 localhost kernel: SELinux:  Initializing.
Feb 20 06:37:22 localhost kernel: LSM support for eBPF active
Feb 20 06:37:22 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 20 06:37:22 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 20 06:37:22 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 20 06:37:22 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 20 06:37:22 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 20 06:37:22 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 20 06:37:22 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 20 06:37:22 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 20 06:37:22 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 20 06:37:22 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 20 06:37:22 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 20 06:37:22 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 20 06:37:22 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 20 06:37:22 localhost kernel: Freeing SMP alternatives memory: 36K
Feb 20 06:37:22 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 20 06:37:22 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Feb 20 06:37:22 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 20 06:37:22 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 20 06:37:22 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 20 06:37:22 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 20 06:37:22 localhost kernel: ... version:                0
Feb 20 06:37:22 localhost kernel: ... bit width:              48
Feb 20 06:37:22 localhost kernel: ... generic registers:      6
Feb 20 06:37:22 localhost kernel: ... value mask:             0000ffffffffffff
Feb 20 06:37:22 localhost kernel: ... max period:             00007fffffffffff
Feb 20 06:37:22 localhost kernel: ... fixed-purpose events:   0
Feb 20 06:37:22 localhost kernel: ... event mask:             000000000000003f
Feb 20 06:37:22 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 20 06:37:22 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 20 06:37:22 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 20 06:37:22 localhost kernel: x86: Booting SMP configuration:
Feb 20 06:37:22 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 20 06:37:22 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 20 06:37:22 localhost kernel: smpboot: Max logical packages: 8
Feb 20 06:37:22 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 20 06:37:22 localhost kernel: node 0 deferred pages initialised in 23ms
Feb 20 06:37:22 localhost kernel: devtmpfs: initialized
Feb 20 06:37:22 localhost kernel: x86/mm: Memory block size: 128MB
Feb 20 06:37:22 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 20 06:37:22 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Feb 20 06:37:22 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 20 06:37:22 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 20 06:37:22 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Feb 20 06:37:22 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 20 06:37:22 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 20 06:37:22 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 20 06:37:22 localhost kernel: audit: type=2000 audit(1771569440.456:1): state=initialized audit_enabled=0 res=1
Feb 20 06:37:22 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 20 06:37:22 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 20 06:37:22 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 20 06:37:22 localhost kernel: cpuidle: using governor menu
Feb 20 06:37:22 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Feb 20 06:37:22 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 20 06:37:22 localhost kernel: PCI: Using configuration type 1 for base access
Feb 20 06:37:22 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 20 06:37:22 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 20 06:37:22 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Feb 20 06:37:22 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 20 06:37:22 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 20 06:37:22 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 20 06:37:22 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 20 06:37:22 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 20 06:37:22 localhost kernel: ACPI: Interpreter enabled
Feb 20 06:37:22 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 20 06:37:22 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 20 06:37:22 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 20 06:37:22 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 20 06:37:22 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 20 06:37:22 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 20 06:37:22 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [3] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [4] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [5] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [6] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [7] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [8] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [9] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [10] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [11] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [12] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [13] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [14] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [15] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [16] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [17] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [18] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [19] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [20] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [21] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [22] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [23] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [24] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [25] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [26] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [27] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [28] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [29] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [30] registered
Feb 20 06:37:22 localhost kernel: acpiphp: Slot [31] registered
Feb 20 06:37:22 localhost kernel: PCI host bridge to bus 0000:00
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 20 06:37:22 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Feb 20 06:37:22 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Feb 20 06:37:22 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Feb 20 06:37:22 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Feb 20 06:37:22 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 20 06:37:22 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Feb 20 06:37:22 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Feb 20 06:37:22 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 20 06:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 20 06:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 20 06:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 20 06:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 20 06:37:22 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 20 06:37:22 localhost kernel: iommu: Default domain type: Translated 
Feb 20 06:37:22 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Feb 20 06:37:22 localhost kernel: SCSI subsystem initialized
Feb 20 06:37:22 localhost kernel: ACPI: bus type USB registered
Feb 20 06:37:22 localhost kernel: usbcore: registered new interface driver usbfs
Feb 20 06:37:22 localhost kernel: usbcore: registered new interface driver hub
Feb 20 06:37:22 localhost kernel: usbcore: registered new device driver usb
Feb 20 06:37:22 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 20 06:37:22 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 20 06:37:22 localhost kernel: PTP clock support registered
Feb 20 06:37:22 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 20 06:37:22 localhost kernel: NetLabel: Initializing
Feb 20 06:37:22 localhost kernel: NetLabel:  domain hash size = 128
Feb 20 06:37:22 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 20 06:37:22 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 20 06:37:22 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 20 06:37:22 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 20 06:37:22 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 20 06:37:22 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 20 06:37:22 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 20 06:37:22 localhost kernel: vgaarb: loaded
Feb 20 06:37:22 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 20 06:37:22 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 20 06:37:22 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 20 06:37:22 localhost kernel: pnp: PnP ACPI init
Feb 20 06:37:22 localhost kernel: pnp 00:03: [dma 2]
Feb 20 06:37:22 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 20 06:37:22 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 20 06:37:22 localhost kernel: NET: Registered PF_INET protocol family
Feb 20 06:37:22 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 20 06:37:22 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Feb 20 06:37:22 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 20 06:37:22 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 20 06:37:22 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 20 06:37:22 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Feb 20 06:37:22 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Feb 20 06:37:22 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 20 06:37:22 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 20 06:37:22 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 20 06:37:22 localhost kernel: NET: Registered PF_XDP protocol family
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 20 06:37:22 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 20 06:37:22 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 20 06:37:22 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 20 06:37:22 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27140 usecs
Feb 20 06:37:22 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 20 06:37:22 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 20 06:37:22 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 20 06:37:22 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 20 06:37:22 localhost kernel: ACPI: bus type thunderbolt registered
Feb 20 06:37:22 localhost kernel: Initialise system trusted keyrings
Feb 20 06:37:22 localhost kernel: Key type blacklist registered
Feb 20 06:37:22 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Feb 20 06:37:22 localhost kernel: zbud: loaded
Feb 20 06:37:22 localhost kernel: integrity: Platform Keyring initialized
Feb 20 06:37:22 localhost kernel: NET: Registered PF_ALG protocol family
Feb 20 06:37:22 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 20 06:37:22 localhost kernel: Key type asymmetric registered
Feb 20 06:37:22 localhost kernel: Asymmetric key parser 'x509' registered
Feb 20 06:37:22 localhost kernel: Running certificate verification selftests
Feb 20 06:37:22 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 20 06:37:22 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 20 06:37:22 localhost kernel: io scheduler mq-deadline registered
Feb 20 06:37:22 localhost kernel: io scheduler kyber registered
Feb 20 06:37:22 localhost kernel: io scheduler bfq registered
Feb 20 06:37:22 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 20 06:37:22 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 20 06:37:22 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 20 06:37:22 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 20 06:37:22 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 20 06:37:22 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 20 06:37:22 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 20 06:37:22 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 20 06:37:22 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 20 06:37:22 localhost kernel: Non-volatile memory driver v1.3
Feb 20 06:37:22 localhost kernel: rdac: device handler registered
Feb 20 06:37:22 localhost kernel: hp_sw: device handler registered
Feb 20 06:37:22 localhost kernel: emc: device handler registered
Feb 20 06:37:22 localhost kernel: alua: device handler registered
Feb 20 06:37:22 localhost kernel: libphy: Fixed MDIO Bus: probed
Feb 20 06:37:22 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Feb 20 06:37:22 localhost kernel: ehci-pci: EHCI PCI platform driver
Feb 20 06:37:22 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Feb 20 06:37:22 localhost kernel: ohci-pci: OHCI PCI platform driver
Feb 20 06:37:22 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Feb 20 06:37:22 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 20 06:37:22 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 20 06:37:22 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 20 06:37:22 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 20 06:37:22 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 20 06:37:22 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 20 06:37:22 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 20 06:37:22 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Feb 20 06:37:22 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 20 06:37:22 localhost kernel: hub 1-0:1.0: USB hub found
Feb 20 06:37:22 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 20 06:37:22 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 20 06:37:22 localhost kernel: usbserial: USB Serial support registered for generic
Feb 20 06:37:22 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 20 06:37:22 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 20 06:37:22 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 20 06:37:22 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 20 06:37:22 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 20 06:37:22 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 20 06:37:22 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 20 06:37:22 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-20T06:37:21 UTC (1771569441)
Feb 20 06:37:22 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 20 06:37:22 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 20 06:37:22 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 20 06:37:22 localhost kernel: usbcore: registered new interface driver usbhid
Feb 20 06:37:22 localhost kernel: usbhid: USB HID core driver
Feb 20 06:37:22 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 20 06:37:22 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 20 06:37:22 localhost kernel: Initializing XFRM netlink socket
Feb 20 06:37:22 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 20 06:37:22 localhost kernel: Segment Routing with IPv6
Feb 20 06:37:22 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 20 06:37:22 localhost kernel: mpls_gso: MPLS GSO support
Feb 20 06:37:22 localhost kernel: IPI shorthand broadcast: enabled
Feb 20 06:37:22 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 20 06:37:22 localhost kernel: AES CTR mode by8 optimization enabled
Feb 20 06:37:22 localhost kernel: sched_clock: Marking stable (775586023, 178347764)->(1078203314, -124269527)
Feb 20 06:37:22 localhost kernel: registered taskstats version 1
Feb 20 06:37:22 localhost kernel: Loading compiled-in X.509 certificates
Feb 20 06:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 20 06:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 20 06:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 20 06:37:22 localhost kernel: zswap: loaded using pool lzo/zbud
Feb 20 06:37:22 localhost kernel: page_owner is disabled
Feb 20 06:37:22 localhost kernel: Key type big_key registered
Feb 20 06:37:22 localhost kernel: Freeing initrd memory: 74232K
Feb 20 06:37:22 localhost kernel: Key type encrypted registered
Feb 20 06:37:22 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 20 06:37:22 localhost kernel: Loading compiled-in module X.509 certificates
Feb 20 06:37:22 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 20 06:37:22 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 20 06:37:22 localhost kernel: ima: No architecture policies found
Feb 20 06:37:22 localhost kernel: evm: Initialising EVM extended attributes:
Feb 20 06:37:22 localhost kernel: evm: security.selinux
Feb 20 06:37:22 localhost kernel: evm: security.SMACK64 (disabled)
Feb 20 06:37:22 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 20 06:37:22 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 20 06:37:22 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 20 06:37:22 localhost kernel: evm: security.apparmor (disabled)
Feb 20 06:37:22 localhost kernel: evm: security.ima
Feb 20 06:37:22 localhost kernel: evm: security.capability
Feb 20 06:37:22 localhost kernel: evm: HMAC attrs: 0x1
Feb 20 06:37:22 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 20 06:37:22 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 20 06:37:22 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 20 06:37:22 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 20 06:37:22 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 20 06:37:22 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 20 06:37:22 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 20 06:37:22 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 20 06:37:22 localhost kernel: Freeing unused decrypted memory: 2036K
Feb 20 06:37:22 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Feb 20 06:37:22 localhost kernel: Write protecting the kernel read-only data: 26624k
Feb 20 06:37:22 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 20 06:37:22 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Feb 20 06:37:22 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 20 06:37:22 localhost kernel: Run /init as init process
Feb 20 06:37:22 localhost kernel:   with arguments:
Feb 20 06:37:22 localhost kernel:     /init
Feb 20 06:37:22 localhost kernel:   with environment:
Feb 20 06:37:22 localhost kernel:     HOME=/
Feb 20 06:37:22 localhost kernel:     TERM=linux
Feb 20 06:37:22 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Feb 20 06:37:22 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 20 06:37:22 localhost systemd[1]: Detected virtualization kvm.
Feb 20 06:37:22 localhost systemd[1]: Detected architecture x86-64.
Feb 20 06:37:22 localhost systemd[1]: Running in initrd.
Feb 20 06:37:22 localhost systemd[1]: No hostname configured, using default hostname.
Feb 20 06:37:22 localhost systemd[1]: Hostname set to <localhost>.
Feb 20 06:37:22 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 20 06:37:22 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 20 06:37:22 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 20 06:37:22 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 20 06:37:22 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 20 06:37:22 localhost systemd[1]: Reached target Local File Systems.
Feb 20 06:37:22 localhost systemd[1]: Reached target Path Units.
Feb 20 06:37:22 localhost systemd[1]: Reached target Slice Units.
Feb 20 06:37:22 localhost systemd[1]: Reached target Swaps.
Feb 20 06:37:22 localhost systemd[1]: Reached target Timer Units.
Feb 20 06:37:22 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 20 06:37:22 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 20 06:37:22 localhost systemd[1]: Listening on Journal Socket.
Feb 20 06:37:22 localhost systemd[1]: Listening on udev Control Socket.
Feb 20 06:37:22 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 20 06:37:22 localhost systemd[1]: Reached target Socket Units.
Feb 20 06:37:22 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 20 06:37:22 localhost systemd[1]: Starting Journal Service...
Feb 20 06:37:22 localhost systemd[1]: Starting Load Kernel Modules...
Feb 20 06:37:22 localhost systemd[1]: Starting Create System Users...
Feb 20 06:37:22 localhost systemd[1]: Starting Setup Virtual Console...
Feb 20 06:37:22 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 20 06:37:22 localhost systemd[1]: Finished Load Kernel Modules.
Feb 20 06:37:22 localhost systemd-journald[283]: Journal started
Feb 20 06:37:22 localhost systemd-journald[283]: Runtime Journal (/run/log/journal/f44a30b3674b4e65a07dfb3d71d4ae11) is 8.0M, max 314.7M, 306.7M free.
Feb 20 06:37:22 localhost systemd-modules-load[284]: Module 'msr' is built in
Feb 20 06:37:22 localhost systemd[1]: Started Journal Service.
Feb 20 06:37:22 localhost systemd[1]: Finished Setup Virtual Console.
Feb 20 06:37:22 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 20 06:37:22 localhost systemd[1]: Starting dracut cmdline hook...
Feb 20 06:37:22 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 20 06:37:22 localhost systemd-sysusers[285]: Creating group 'sgx' with GID 997.
Feb 20 06:37:22 localhost systemd-sysusers[285]: Creating group 'users' with GID 100.
Feb 20 06:37:22 localhost systemd-sysusers[285]: Creating group 'dbus' with GID 81.
Feb 20 06:37:22 localhost systemd-sysusers[285]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 20 06:37:22 localhost systemd[1]: Finished Create System Users.
Feb 20 06:37:22 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 20 06:37:22 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 20 06:37:22 localhost dracut-cmdline[288]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Feb 20 06:37:22 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 20 06:37:22 localhost dracut-cmdline[288]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 20 06:37:22 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 20 06:37:22 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 20 06:37:22 localhost systemd[1]: Finished dracut cmdline hook.
Feb 20 06:37:22 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 20 06:37:22 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 20 06:37:22 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 20 06:37:22 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Feb 20 06:37:22 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 20 06:37:22 localhost kernel: RPC: Registered udp transport module.
Feb 20 06:37:22 localhost kernel: RPC: Registered tcp transport module.
Feb 20 06:37:22 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 20 06:37:22 localhost rpc.statd[408]: Version 2.5.4 starting
Feb 20 06:37:22 localhost rpc.statd[408]: Initializing NSM state
Feb 20 06:37:22 localhost rpc.idmapd[413]: Setting log level to 0
Feb 20 06:37:22 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 20 06:37:22 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 20 06:37:22 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'.
Feb 20 06:37:22 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 20 06:37:22 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 20 06:37:22 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 20 06:37:22 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 20 06:37:22 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 20 06:37:22 localhost systemd[1]: Reached target System Initialization.
Feb 20 06:37:22 localhost systemd[1]: Reached target Basic System.
Feb 20 06:37:22 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 20 06:37:22 localhost systemd[1]: Reached target Network.
Feb 20 06:37:22 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 20 06:37:22 localhost systemd[1]: Starting dracut initqueue hook...
Feb 20 06:37:22 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Feb 20 06:37:22 localhost kernel: libata version 3.00 loaded.
Feb 20 06:37:22 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 20 06:37:22 localhost kernel: GPT:20971519 != 838860799
Feb 20 06:37:22 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 20 06:37:22 localhost kernel: GPT:20971519 != 838860799
Feb 20 06:37:22 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 20 06:37:22 localhost kernel:  vda: vda1 vda2 vda3 vda4
Feb 20 06:37:22 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 20 06:37:22 localhost systemd-udevd[453]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 06:37:22 localhost kernel: scsi host0: ata_piix
Feb 20 06:37:22 localhost kernel: scsi host1: ata_piix
Feb 20 06:37:22 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Feb 20 06:37:22 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Feb 20 06:37:22 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 20 06:37:22 localhost systemd[1]: Reached target Initrd Root Device.
Feb 20 06:37:22 localhost kernel: ata1: found unknown device (class 0)
Feb 20 06:37:22 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 20 06:37:22 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 20 06:37:23 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 20 06:37:23 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 20 06:37:23 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 20 06:37:23 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 20 06:37:23 localhost systemd[1]: Finished dracut initqueue hook.
Feb 20 06:37:23 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 20 06:37:23 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 20 06:37:23 localhost systemd[1]: Reached target Remote File Systems.
Feb 20 06:37:23 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 20 06:37:23 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 20 06:37:23 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Feb 20 06:37:23 localhost systemd-fsck[512]: /usr/sbin/fsck.xfs: XFS file system.
Feb 20 06:37:23 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 20 06:37:23 localhost systemd[1]: Mounting /sysroot...
Feb 20 06:37:23 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 20 06:37:23 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Feb 20 06:37:23 localhost kernel: XFS (vda4): Ending clean mount
Feb 20 06:37:23 localhost systemd[1]: Mounted /sysroot.
Feb 20 06:37:23 localhost systemd[1]: Reached target Initrd Root File System.
Feb 20 06:37:23 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 20 06:37:23 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 20 06:37:23 localhost systemd[1]: Reached target Initrd File Systems.
Feb 20 06:37:23 localhost systemd[1]: Reached target Initrd Default Target.
Feb 20 06:37:23 localhost systemd[1]: Starting dracut mount hook...
Feb 20 06:37:23 localhost systemd[1]: Finished dracut mount hook.
Feb 20 06:37:23 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 20 06:37:23 localhost rpc.idmapd[413]: exiting on signal 15
Feb 20 06:37:23 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 20 06:37:23 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 20 06:37:23 localhost systemd[1]: Stopped target Network.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Timer Units.
Feb 20 06:37:23 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 20 06:37:23 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Basic System.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Path Units.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Remote File Systems.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Slice Units.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Socket Units.
Feb 20 06:37:23 localhost systemd[1]: Stopped target System Initialization.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Local File Systems.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Swaps.
Feb 20 06:37:23 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut mount hook.
Feb 20 06:37:23 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 20 06:37:23 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 20 06:37:23 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 20 06:37:23 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 20 06:37:23 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 20 06:37:23 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 20 06:37:23 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 20 06:37:23 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 20 06:37:23 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 20 06:37:23 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 20 06:37:23 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 20 06:37:23 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 20 06:37:23 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 20 06:37:23 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Closed udev Control Socket.
Feb 20 06:37:23 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Closed udev Kernel Socket.
Feb 20 06:37:23 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 20 06:37:23 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 20 06:37:23 localhost systemd[1]: Starting Cleanup udev Database...
Feb 20 06:37:23 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 20 06:37:23 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 20 06:37:23 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Stopped Create System Users.
Feb 20 06:37:23 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 20 06:37:23 localhost systemd[1]: Finished Cleanup udev Database.
Feb 20 06:37:23 localhost systemd[1]: Reached target Switch Root.
Feb 20 06:37:23 localhost systemd[1]: Starting Switch Root...
Feb 20 06:37:23 localhost systemd[1]: Switching root.
Feb 20 06:37:23 localhost systemd-journald[283]: Journal stopped
Feb 20 06:37:24 localhost systemd-journald[283]: Received SIGTERM from PID 1 (systemd).
Feb 20 06:37:24 localhost kernel: audit: type=1404 audit(1771569444.052:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability open_perms=1
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 06:37:24 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 06:37:24 localhost kernel: audit: type=1403 audit(1771569444.185:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 20 06:37:24 localhost systemd[1]: Successfully loaded SELinux policy in 136.805ms.
Feb 20 06:37:24 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 35.458ms.
Feb 20 06:37:24 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 20 06:37:24 localhost systemd[1]: Detected virtualization kvm.
Feb 20 06:37:24 localhost systemd[1]: Detected architecture x86-64.
Feb 20 06:37:24 localhost systemd-rc-local-generator[582]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 06:37:24 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 06:37:24 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 20 06:37:24 localhost systemd[1]: Stopped Switch Root.
Feb 20 06:37:24 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 20 06:37:24 localhost systemd[1]: Created slice Slice /system/getty.
Feb 20 06:37:24 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 20 06:37:24 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 20 06:37:24 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 20 06:37:24 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Feb 20 06:37:24 localhost systemd[1]: Created slice User and Session Slice.
Feb 20 06:37:24 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 20 06:37:24 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 20 06:37:24 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 20 06:37:24 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 20 06:37:24 localhost systemd[1]: Stopped target Switch Root.
Feb 20 06:37:24 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 20 06:37:24 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 20 06:37:24 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 20 06:37:24 localhost systemd[1]: Reached target Path Units.
Feb 20 06:37:24 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 20 06:37:24 localhost systemd[1]: Reached target Slice Units.
Feb 20 06:37:24 localhost systemd[1]: Reached target Swaps.
Feb 20 06:37:24 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 20 06:37:24 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 20 06:37:24 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 20 06:37:24 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 20 06:37:24 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 20 06:37:24 localhost systemd[1]: Listening on udev Control Socket.
Feb 20 06:37:24 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 20 06:37:24 localhost systemd[1]: Mounting Huge Pages File System...
Feb 20 06:37:24 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 20 06:37:24 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 20 06:37:24 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 20 06:37:24 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 20 06:37:24 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 20 06:37:24 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 20 06:37:24 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 20 06:37:24 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 20 06:37:24 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 20 06:37:24 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 20 06:37:24 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 20 06:37:24 localhost systemd[1]: Stopped Journal Service.
Feb 20 06:37:24 localhost systemd[1]: Starting Journal Service...
Feb 20 06:37:24 localhost systemd[1]: Starting Load Kernel Modules...
Feb 20 06:37:24 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 20 06:37:24 localhost kernel: ACPI: bus type drm_connector registered
Feb 20 06:37:24 localhost kernel: fuse: init (API version 7.36)
Feb 20 06:37:24 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 20 06:37:24 localhost systemd-journald[618]: Journal started
Feb 20 06:37:24 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/01f46965e72fd8a157841feaa66c8d52) is 8.0M, max 314.7M, 306.7M free.
Feb 20 06:37:24 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 20 06:37:24 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 20 06:37:24 localhost systemd-modules-load[619]: Module 'msr' is built in
Feb 20 06:37:24 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 20 06:37:24 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 20 06:37:24 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 20 06:37:24 localhost systemd[1]: Started Journal Service.
Feb 20 06:37:24 localhost systemd[1]: Mounted Huge Pages File System.
Feb 20 06:37:24 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 20 06:37:24 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 20 06:37:24 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 20 06:37:24 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 20 06:37:24 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 20 06:37:24 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 20 06:37:24 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 20 06:37:24 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 20 06:37:24 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 20 06:37:24 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 20 06:37:24 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 20 06:37:24 localhost systemd[1]: Finished Load Kernel Modules.
Feb 20 06:37:24 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 20 06:37:24 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 20 06:37:24 localhost systemd[1]: Mounting FUSE Control File System...
Feb 20 06:37:24 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 20 06:37:24 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 20 06:37:24 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 20 06:37:24 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 20 06:37:24 localhost systemd[1]: Starting Load/Save Random Seed...
Feb 20 06:37:24 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 20 06:37:24 localhost systemd[1]: Starting Create System Users...
Feb 20 06:37:24 localhost systemd-journald[618]: Runtime Journal (/run/log/journal/01f46965e72fd8a157841feaa66c8d52) is 8.0M, max 314.7M, 306.7M free.
Feb 20 06:37:24 localhost systemd-journald[618]: Received client request to flush runtime journal.
Feb 20 06:37:24 localhost systemd[1]: Mounted FUSE Control File System.
Feb 20 06:37:24 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 20 06:37:24 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 20 06:37:24 localhost systemd[1]: Finished Load/Save Random Seed.
Feb 20 06:37:24 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 20 06:37:24 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 20 06:37:24 localhost systemd-sysusers[630]: Creating group 'sgx' with GID 989.
Feb 20 06:37:24 localhost systemd-sysusers[630]: Creating group 'systemd-oom' with GID 988.
Feb 20 06:37:24 localhost systemd-sysusers[630]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Feb 20 06:37:24 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 20 06:37:24 localhost systemd[1]: Finished Create System Users.
Feb 20 06:37:24 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 20 06:37:25 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 20 06:37:25 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 20 06:37:25 localhost systemd[1]: Set up automount EFI System Partition Automount.
Feb 20 06:37:25 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 20 06:37:25 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 20 06:37:25 localhost systemd-udevd[635]: Using default interface naming scheme 'rhel-9.0'.
Feb 20 06:37:25 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 20 06:37:25 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 20 06:37:25 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 20 06:37:25 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 20 06:37:25 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 20 06:37:25 localhost systemd-udevd[648]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 06:37:25 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 20 06:37:25 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 20 06:37:25 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Feb 20 06:37:25 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Feb 20 06:37:25 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Feb 20 06:37:25 localhost systemd[1]: Mounting /boot...
Feb 20 06:37:25 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Feb 20 06:37:25 localhost systemd-fsck[687]: fsck.fat 4.2 (2021-01-31)
Feb 20 06:37:25 localhost systemd-fsck[687]: /dev/vda2: 12 files, 1782/51145 clusters
Feb 20 06:37:25 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Feb 20 06:37:25 localhost kernel: XFS (vda3): Ending clean mount
Feb 20 06:37:25 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Feb 20 06:37:25 localhost systemd[1]: Mounted /boot.
Feb 20 06:37:25 localhost kernel: SVM: TSC scaling supported
Feb 20 06:37:25 localhost kernel: kvm: Nested Virtualization enabled
Feb 20 06:37:25 localhost kernel: SVM: kvm: Nested Paging enabled
Feb 20 06:37:25 localhost kernel: SVM: LBR virtualization supported
Feb 20 06:37:25 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 20 06:37:25 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 20 06:37:25 localhost kernel: Console: switching to colour dummy device 80x25
Feb 20 06:37:25 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 20 06:37:25 localhost kernel: [drm] features: -context_init
Feb 20 06:37:25 localhost kernel: [drm] number of scanouts: 1
Feb 20 06:37:25 localhost kernel: [drm] number of cap sets: 0
Feb 20 06:37:25 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Feb 20 06:37:25 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Feb 20 06:37:25 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 20 06:37:25 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 20 06:37:25 localhost systemd[1]: Mounting /boot/efi...
Feb 20 06:37:25 localhost systemd[1]: Mounted /boot/efi.
Feb 20 06:37:25 localhost systemd[1]: Reached target Local File Systems.
Feb 20 06:37:25 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 20 06:37:25 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 20 06:37:25 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 20 06:37:25 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 20 06:37:25 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 20 06:37:25 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 20 06:37:25 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 20 06:37:25 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 713 (bootctl)
Feb 20 06:37:25 localhost systemd[1]: Starting File System Check on /dev/vda2...
Feb 20 06:37:25 localhost systemd[1]: Finished File System Check on /dev/vda2.
Feb 20 06:37:26 localhost systemd[1]: Mounting EFI System Partition Automount...
Feb 20 06:37:26 localhost systemd[1]: Mounted EFI System Partition Automount.
Feb 20 06:37:26 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 20 06:37:26 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 20 06:37:26 localhost systemd[1]: Starting Security Auditing Service...
Feb 20 06:37:26 localhost systemd[1]: Starting RPC Bind...
Feb 20 06:37:26 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 20 06:37:26 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 20 06:37:26 localhost auditd[725]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Feb 20 06:37:26 localhost auditd[725]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Feb 20 06:37:26 localhost systemd[1]: Started RPC Bind.
Feb 20 06:37:26 localhost augenrules[730]: /sbin/augenrules: No change
Feb 20 06:37:26 localhost augenrules[740]: No rules
Feb 20 06:37:26 localhost augenrules[740]: enabled 1
Feb 20 06:37:26 localhost augenrules[740]: failure 1
Feb 20 06:37:26 localhost augenrules[740]: pid 725
Feb 20 06:37:26 localhost augenrules[740]: rate_limit 0
Feb 20 06:37:26 localhost augenrules[740]: backlog_limit 8192
Feb 20 06:37:26 localhost augenrules[740]: lost 0
Feb 20 06:37:26 localhost augenrules[740]: backlog 3
Feb 20 06:37:26 localhost augenrules[740]: backlog_wait_time 60000
Feb 20 06:37:26 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 20 06:37:26 localhost augenrules[740]: enabled 1
Feb 20 06:37:26 localhost augenrules[740]: failure 1
Feb 20 06:37:26 localhost augenrules[740]: pid 725
Feb 20 06:37:26 localhost augenrules[740]: rate_limit 0
Feb 20 06:37:26 localhost augenrules[740]: backlog_limit 8192
Feb 20 06:37:26 localhost augenrules[740]: lost 0
Feb 20 06:37:26 localhost augenrules[740]: backlog 0
Feb 20 06:37:26 localhost augenrules[740]: backlog_wait_time 60000
Feb 20 06:37:26 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 20 06:37:26 localhost augenrules[740]: enabled 1
Feb 20 06:37:26 localhost augenrules[740]: failure 1
Feb 20 06:37:26 localhost augenrules[740]: pid 725
Feb 20 06:37:26 localhost augenrules[740]: rate_limit 0
Feb 20 06:37:26 localhost augenrules[740]: backlog_limit 8192
Feb 20 06:37:26 localhost augenrules[740]: lost 0
Feb 20 06:37:26 localhost augenrules[740]: backlog 4
Feb 20 06:37:26 localhost augenrules[740]: backlog_wait_time 60000
Feb 20 06:37:26 localhost augenrules[740]: backlog_wait_time_actual 0
Feb 20 06:37:26 localhost systemd[1]: Started Security Auditing Service.
Feb 20 06:37:26 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 20 06:37:26 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 20 06:37:26 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 20 06:37:26 localhost systemd[1]: Starting Update is Completed...
Feb 20 06:37:26 localhost systemd[1]: Finished Update is Completed.
Feb 20 06:37:26 localhost systemd[1]: Reached target System Initialization.
Feb 20 06:37:26 localhost systemd[1]: Started dnf makecache --timer.
Feb 20 06:37:26 localhost systemd[1]: Started Daily rotation of log files.
Feb 20 06:37:26 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 20 06:37:26 localhost systemd[1]: Reached target Timer Units.
Feb 20 06:37:26 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 20 06:37:26 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 20 06:37:26 localhost systemd[1]: Reached target Socket Units.
Feb 20 06:37:26 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Feb 20 06:37:26 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 20 06:37:26 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 20 06:37:26 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 20 06:37:26 localhost systemd[1]: Reached target Basic System.
Feb 20 06:37:26 localhost systemd[1]: Starting NTP client/server...
Feb 20 06:37:26 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 20 06:37:26 localhost dbus-broker-lau[750]: Ready
Feb 20 06:37:26 localhost systemd[1]: Started irqbalance daemon.
Feb 20 06:37:26 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 20 06:37:26 localhost systemd[1]: Starting System Logging Service...
Feb 20 06:37:26 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 06:37:26 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 06:37:26 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 06:37:26 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 20 06:37:26 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 20 06:37:26 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 20 06:37:26 localhost systemd[1]: Starting User Login Management...
Feb 20 06:37:26 localhost rsyslogd[758]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="758" x-info="https://www.rsyslog.com"] start
Feb 20 06:37:26 localhost rsyslogd[758]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Feb 20 06:37:26 localhost systemd[1]: Started System Logging Service.
Feb 20 06:37:26 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 20 06:37:26 localhost chronyd[765]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 20 06:37:26 localhost chronyd[765]: Using right/UTC timezone to obtain leap second data
Feb 20 06:37:26 localhost chronyd[765]: Loaded seccomp filter (level 2)
Feb 20 06:37:26 localhost systemd[1]: Started NTP client/server.
Feb 20 06:37:26 localhost systemd-logind[759]: New seat seat0.
Feb 20 06:37:26 localhost systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 20 06:37:26 localhost systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 20 06:37:26 localhost systemd[1]: Started User Login Management.
Feb 20 06:37:26 localhost rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 06:37:26 localhost cloud-init[769]: Cloud-init v. 22.1-9.el9 running 'init-local' at Fri, 20 Feb 2026 06:37:26 +0000. Up 6.12 seconds.
Feb 20 06:37:27 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 20 06:37:27 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 20 06:37:27 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpjpj1ezo2.mount: Deactivated successfully.
Feb 20 06:37:27 localhost systemd[1]: Starting Hostname Service...
Feb 20 06:37:27 localhost systemd[1]: Started Hostname Service.
Feb 20 06:37:27 np0005625204.novalocal systemd-hostnamed[783]: Hostname set to <np0005625204.novalocal> (static)
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Reached target Preparation for Network.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Network Manager...
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4523] NetworkManager (version 1.42.2-1.el9) is starting... (boot:de550921-08e7-4d93-b7d2-b745d62af5c6)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4530] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4589] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Started Network Manager.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Reached target Network.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4667] manager[0x55f0caa47020]: monitoring kernel firmware directory '/lib/firmware'.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4716] hostname: hostname: using hostnamed
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4717] hostname: static hostname changed from (none) to "np0005625204.novalocal"
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4730] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Reached target NFS client services.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Reached target Remote File Systems.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4888] manager[0x55f0caa47020]: rfkill: Wi-Fi hardware radio set enabled
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4889] manager[0x55f0caa47020]: rfkill: WWAN hardware radio set enabled
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4952] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4953] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4966] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.4969] manager: Networking is enabled by state file
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5010] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5012] settings: Loaded settings plugin: keyfile (internal)
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5042] dhcp: init: Using DHCP client 'internal'
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5044] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5057] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5061] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5069] device (lo): Activation: starting connection 'lo' (b35a86af-6461-4196-bb8b-daceaa528560)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5077] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5080] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5114] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5117] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5119] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5120] device (eth0): carrier: link connected
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5122] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5126] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5131] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5135] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5136] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5138] manager: NetworkManager state is now CONNECTING
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5140] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5155] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5157] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5188] dhcp4 (eth0): state changed new lease, address=38.102.83.80
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5191] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5212] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5271] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5274] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5280] device (lo): Activation: successful, device activated.
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5286] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5288] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5291] manager: NetworkManager state is now CONNECTED_SITE
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5295] device (eth0): Activation: successful, device activated.
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5300] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 20 06:37:27 np0005625204.novalocal NetworkManager[788]: <info>  [1771569447.5305] manager: startup complete
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: Cloud-init v. 22.1-9.el9 running 'init' at Fri, 20 Feb 2026 06:37:27 +0000. Up 6.98 seconds.
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |  eth0  | True |         38.102.83.80         | 255.255.255.0 | global | fa:16:3e:f0:29:e2 |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |  eth0  | True | fe80::f816:3eff:fef0:29e2/64 |       .       |  link  | fa:16:3e:f0:29:e2 |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 20 06:37:27 np0005625204.novalocal cloud-init[936]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 20 06:37:27 np0005625204.novalocal systemd[1]: Starting Authorization Manager...
Feb 20 06:37:28 np0005625204.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Feb 20 06:37:28 np0005625204.novalocal polkitd[1035]: Started polkitd version 0.117
Feb 20 06:37:28 np0005625204.novalocal polkitd[1035]: Loading rules from directory /etc/polkit-1/rules.d
Feb 20 06:37:28 np0005625204.novalocal polkitd[1035]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 20 06:37:28 np0005625204.novalocal polkitd[1035]: Finished loading, compiling and executing 4 rules
Feb 20 06:37:28 np0005625204.novalocal systemd[1]: Started Authorization Manager.
Feb 20 06:37:28 np0005625204.novalocal polkitd[1035]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 20 06:37:29 np0005625204.novalocal useradd[1118]: new group: name=cloud-user, GID=1001
Feb 20 06:37:29 np0005625204.novalocal useradd[1118]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 20 06:37:29 np0005625204.novalocal useradd[1118]: add 'cloud-user' to group 'adm'
Feb 20 06:37:29 np0005625204.novalocal useradd[1118]: add 'cloud-user' to group 'systemd-journal'
Feb 20 06:37:29 np0005625204.novalocal useradd[1118]: add 'cloud-user' to shadow group 'adm'
Feb 20 06:37:29 np0005625204.novalocal useradd[1118]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Generating public/private rsa key pair.
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: The key fingerprint is:
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: SHA256:E91G8EdNxLVmgA2Vm1JmrWN5rMaiGWlWyNAnQy6MrH8 root@np0005625204.novalocal
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: The key's randomart image is:
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: +---[RSA 3072]----+
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |        o..o*o+==|
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |    . o..= * B..+|
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |     o o+.* B B+ |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |    .   .+ + Ooo |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |   .    S o + +  |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |    .    * . +   |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |     . Eo + o    |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |      .  o       |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |                 |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Generating public/private ecdsa key pair.
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: The key fingerprint is:
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: SHA256:aQlEOhbipyPb2qhzcpToy+ulg+XetNgyfycgUOryKOw root@np0005625204.novalocal
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: The key's randomart image is:
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: +---[ECDSA 256]---+
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |  . ..o          |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: | ... +           |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: | o. = .          |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |o  + . . o       |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |ooo.    S        |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |o==..  .         |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |**.o..           |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |BBX= .o .        |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |OEB+=. o         |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Generating public/private ed25519 key pair.
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: The key fingerprint is:
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: SHA256:/ksuBpRMcIozXwWEz9Nvt9HqjKrIKBmCAauXD8R0p04 root@np0005625204.novalocal
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: The key's randomart image is:
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: +--[ED25519 256]--+
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |    .++..        |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |. ..ooo.         |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |.++..B.o         |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |o o+E.B .        |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |oo +.. .S.   .   |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |= + . ..  o o .  |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |.+ o   ..... +   |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: |o  o..  o+ oo    |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: | .. o .o.o=oo    |
Feb 20 06:37:31 np0005625204.novalocal cloud-init[936]: +----[SHA256]-----+
Feb 20 06:37:31 np0005625204.novalocal sm-notify[1131]: Version 2.5.4 starting
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Feb 20 06:37:31 np0005625204.novalocal sshd[1132]: Server listening on 0.0.0.0 port 22.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 20 06:37:31 np0005625204.novalocal sshd[1132]: Server listening on :: port 22.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Reached target Network is Online.
Feb 20 06:37:31 np0005625204.novalocal sshd[1140]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Feb 20 06:37:31 np0005625204.novalocal sshd[1132]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting Permit User Sessions...
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Finished Permit User Sessions.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Started Command Scheduler.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Started Getty on tty1.
Feb 20 06:37:31 np0005625204.novalocal crond[1143]: (CRON) STARTUP (1.5.7)
Feb 20 06:37:31 np0005625204.novalocal crond[1143]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 20 06:37:31 np0005625204.novalocal crond[1143]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 57% if used.)
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 20 06:37:31 np0005625204.novalocal crond[1143]: (CRON) INFO (running with inotify support)
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Reached target Login Prompts.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Reached target Multi-User System.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 20 06:37:31 np0005625204.novalocal sshd[1154]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 20 06:37:31 np0005625204.novalocal sshd[1154]: Unable to negotiate with 38.102.83.114 port 38254: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 20 06:37:31 np0005625204.novalocal sshd[1171]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1184]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1184]: Unable to negotiate with 38.102.83.114 port 38272: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 20 06:37:31 np0005625204.novalocal kdumpctl[1134]: kdump: No kdump initial ramdisk found.
Feb 20 06:37:31 np0005625204.novalocal kdumpctl[1134]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Feb 20 06:37:31 np0005625204.novalocal sshd[1195]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1195]: Unable to negotiate with 38.102.83.114 port 38288: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 20 06:37:31 np0005625204.novalocal sshd[1140]: Connection closed by 38.102.83.114 port 38250 [preauth]
Feb 20 06:37:31 np0005625204.novalocal sshd[1204]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1218]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1171]: Connection closed by 38.102.83.114 port 38256 [preauth]
Feb 20 06:37:31 np0005625204.novalocal sshd[1236]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1236]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 06:37:31 np0005625204.novalocal sshd[1204]: Connection closed by 38.102.83.114 port 38290 [preauth]
Feb 20 06:37:31 np0005625204.novalocal cloud-init[1274]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Fri, 20 Feb 2026 06:37:31 +0000. Up 11.01 seconds.
Feb 20 06:37:31 np0005625204.novalocal sshd[1268]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:37:31 np0005625204.novalocal sshd[1268]: Unable to negotiate with 38.102.83.114 port 38316: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 20 06:37:31 np0005625204.novalocal sshd[1218]: Connection closed by 38.102.83.114 port 38304 [preauth]
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Feb 20 06:37:31 np0005625204.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Feb 20 06:37:32 np0005625204.novalocal dracut[1437]: dracut-057-21.git20230214.el9
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1438]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Fri, 20 Feb 2026 06:37:32 +0000. Up 11.35 seconds.
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1455]: #############################################################
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1456]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1460]: 256 SHA256:aQlEOhbipyPb2qhzcpToy+ulg+XetNgyfycgUOryKOw root@np0005625204.novalocal (ECDSA)
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1465]: 256 SHA256:/ksuBpRMcIozXwWEz9Nvt9HqjKrIKBmCAauXD8R0p04 root@np0005625204.novalocal (ED25519)
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1475]: 3072 SHA256:E91G8EdNxLVmgA2Vm1JmrWN5rMaiGWlWyNAnQy6MrH8 root@np0005625204.novalocal (RSA)
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1477]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1480]: #############################################################
Feb 20 06:37:32 np0005625204.novalocal cloud-init[1438]: Cloud-init v. 22.1-9.el9 finished at Fri, 20 Feb 2026 06:37:32 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.60 seconds
Feb 20 06:37:32 np0005625204.novalocal systemd[1]: Reloading Network Manager...
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 20 06:37:32 np0005625204.novalocal NetworkManager[788]: <info>  [1771569452.5003] audit: op="reload" arg="0" pid=1577 uid=0 result="success"
Feb 20 06:37:32 np0005625204.novalocal NetworkManager[788]: <info>  [1771569452.5013] config: signal: SIGHUP (no changes from disk)
Feb 20 06:37:32 np0005625204.novalocal systemd[1]: Reloaded Network Manager.
Feb 20 06:37:32 np0005625204.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Feb 20 06:37:32 np0005625204.novalocal systemd[1]: Reached target Cloud-init target.
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 20 06:37:32 np0005625204.novalocal chronyd[765]: Selected source 199.182.221.110 (2.rhel.pool.ntp.org)
Feb 20 06:37:32 np0005625204.novalocal chronyd[765]: System clock TAI offset set to 37 seconds
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 20 06:37:32 np0005625204.novalocal dracut[1440]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: memstrack is not available
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: memstrack is not available
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: *** Including module: systemd ***
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: *** Including module: systemd-initrd ***
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: *** Including module: i18n ***
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: No KEYMAP configured.
Feb 20 06:37:33 np0005625204.novalocal dracut[1440]: *** Including module: drm ***
Feb 20 06:37:34 np0005625204.novalocal chronyd[765]: Selected source 149.56.19.163 (2.rhel.pool.ntp.org)
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: prefixdevname ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: kernel-modules ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: kernel-modules-extra ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: qemu ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: fstab-sys ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: rootfs-block ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: terminfo ***
Feb 20 06:37:34 np0005625204.novalocal dracut[1440]: *** Including module: udev-rules ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: Skipping udev rule: 91-permissions.rules
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: virtiofs ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: dracut-systemd ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: usrmount ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: base ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: fs-lib ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: kdumpbase ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]:   microcode_ctl module: mangling fw_dir
Feb 20 06:37:35 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]: *** Including module: shutdown ***
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]: *** Including module: squash ***
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]: *** Including modules done ***
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]: *** Installing kernel module dependencies ***
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]: *** Installing kernel module dependencies done ***
Feb 20 06:37:36 np0005625204.novalocal dracut[1440]: *** Resolving executable dependencies ***
Feb 20 06:37:37 np0005625204.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Resolving executable dependencies done ***
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Hardlinking files ***
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Mode:           real
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Files:          1099
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Linked:         3 files
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Compared:       0 xattrs
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Compared:       373 files
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Saved:          61.04 KiB
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Duration:       0.035561 seconds
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Hardlinking files done ***
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Could not find 'strip'. Not stripping the initramfs.
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Generating early-microcode cpio image ***
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Constructing AuthenticAMD.bin ***
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Store current command line parameters ***
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: Stored kernel commandline:
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: No dracut internal kernel commandline stored in the initramfs
Feb 20 06:37:38 np0005625204.novalocal dracut[1440]: *** Install squash loader ***
Feb 20 06:37:39 np0005625204.novalocal dracut[1440]: *** Squashing the files inside the initramfs ***
Feb 20 06:37:40 np0005625204.novalocal dracut[1440]: *** Squashing the files inside the initramfs done ***
Feb 20 06:37:40 np0005625204.novalocal dracut[1440]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Feb 20 06:37:40 np0005625204.novalocal dracut[1440]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Feb 20 06:37:40 np0005625204.novalocal kdumpctl[1134]: kdump: kexec: loaded kdump kernel
Feb 20 06:37:40 np0005625204.novalocal kdumpctl[1134]: kdump: Starting kdump: [OK]
Feb 20 06:37:40 np0005625204.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 20 06:37:40 np0005625204.novalocal systemd[1]: Startup finished in 1.214s (kernel) + 2.066s (initrd) + 16.639s (userspace) = 19.920s.
Feb 20 06:37:57 np0005625204.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 06:38:17 np0005625204.novalocal sshd[4176]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:38:17 np0005625204.novalocal sshd[4176]: Accepted publickey for zuul from 38.102.83.114 port 52564 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 20 06:38:17 np0005625204.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 20 06:38:17 np0005625204.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 20 06:38:17 np0005625204.novalocal systemd-logind[759]: New session 1 of user zuul.
Feb 20 06:38:17 np0005625204.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 20 06:38:17 np0005625204.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Queued start job for default target Main User Target.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Created slice User Application Slice.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Reached target Paths.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Reached target Timers.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Starting D-Bus User Message Bus Socket...
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Starting Create User's Volatile Files and Directories...
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Finished Create User's Volatile Files and Directories.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Listening on D-Bus User Message Bus Socket.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Reached target Sockets.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Reached target Basic System.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Reached target Main User Target.
Feb 20 06:38:17 np0005625204.novalocal systemd[4180]: Startup finished in 114ms.
Feb 20 06:38:17 np0005625204.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 20 06:38:17 np0005625204.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 20 06:38:17 np0005625204.novalocal sshd[4176]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:38:18 np0005625204.novalocal python3[4232]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 06:38:27 np0005625204.novalocal python3[4250]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 06:38:33 np0005625204.novalocal python3[4303]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 06:38:34 np0005625204.novalocal python3[4333]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 20 06:38:37 np0005625204.novalocal python3[4349]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:38:38 np0005625204.novalocal python3[4363]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:39 np0005625204.novalocal python3[4422]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:38:40 np0005625204.novalocal python3[4463]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771569519.4242065-396-28194802573997/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa follow=False checksum=1ede725f5cdca64ff103c7e62f7bb7b42f0b9244 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:41 np0005625204.novalocal python3[4536]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:38:41 np0005625204.novalocal python3[4577]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771569521.188387-494-117951666179636/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa.pub follow=False checksum=d5896bb6dcd221ffe99ce3acccb68a5152af8369 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:43 np0005625204.novalocal python3[4605]: ansible-ping Invoked with data=pong
Feb 20 06:38:45 np0005625204.novalocal python3[4619]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 06:38:49 np0005625204.novalocal python3[4673]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 20 06:38:51 np0005625204.novalocal python3[4695]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:51 np0005625204.novalocal python3[4709]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:52 np0005625204.novalocal python3[4723]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:53 np0005625204.novalocal python3[4737]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:53 np0005625204.novalocal python3[4751]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:53 np0005625204.novalocal python3[4765]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:56 np0005625204.novalocal sudo[4779]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhbbopdlrfshlumrpkwypubdowoefroe ; /usr/bin/python3
Feb 20 06:38:56 np0005625204.novalocal sudo[4779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:38:56 np0005625204.novalocal python3[4781]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:56 np0005625204.novalocal sudo[4779]: pam_unix(sudo:session): session closed for user root
Feb 20 06:38:57 np0005625204.novalocal sudo[4827]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnlzsoggiozordxblegasdewxpnjmatw ; /usr/bin/python3
Feb 20 06:38:57 np0005625204.novalocal sudo[4827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:38:58 np0005625204.novalocal python3[4829]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:38:58 np0005625204.novalocal sudo[4827]: pam_unix(sudo:session): session closed for user root
Feb 20 06:38:58 np0005625204.novalocal sudo[4870]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgcbvlbqaxobjauqbvmkrssnsljfurda ; /usr/bin/python3
Feb 20 06:38:58 np0005625204.novalocal sudo[4870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:38:58 np0005625204.novalocal python3[4872]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1771569537.8461342-104-85037354026452/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:38:58 np0005625204.novalocal sudo[4870]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:06 np0005625204.novalocal python3[4900]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:06 np0005625204.novalocal python3[4914]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:06 np0005625204.novalocal python3[4928]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:06 np0005625204.novalocal python3[4942]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:07 np0005625204.novalocal python3[4956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:07 np0005625204.novalocal python3[4970]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:07 np0005625204.novalocal python3[4984]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:07 np0005625204.novalocal python3[4998]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:08 np0005625204.novalocal python3[5012]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:08 np0005625204.novalocal python3[5026]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:08 np0005625204.novalocal python3[5040]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:09 np0005625204.novalocal python3[5054]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:09 np0005625204.novalocal python3[5068]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICWBreHW95Wz2Toz5YwCGQwFcUG8oFYkienDh9tntmDc ralfieri@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:09 np0005625204.novalocal python3[5082]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:09 np0005625204.novalocal python3[5096]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:10 np0005625204.novalocal python3[5110]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:10 np0005625204.novalocal python3[5124]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:10 np0005625204.novalocal python3[5138]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:11 np0005625204.novalocal python3[5152]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:11 np0005625204.novalocal python3[5166]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:11 np0005625204.novalocal python3[5180]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:11 np0005625204.novalocal python3[5194]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:12 np0005625204.novalocal python3[5208]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:12 np0005625204.novalocal python3[5222]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:12 np0005625204.novalocal python3[5236]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:12 np0005625204.novalocal python3[5250]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:39:13 np0005625204.novalocal sudo[5264]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqqtadbhjhudppodabzzojvvsnneobze ; /usr/bin/python3
Feb 20 06:39:13 np0005625204.novalocal sudo[5264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:13 np0005625204.novalocal python3[5266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 20 06:39:13 np0005625204.novalocal systemd[1]: Starting Time & Date Service...
Feb 20 06:39:13 np0005625204.novalocal systemd[1]: Started Time & Date Service.
Feb 20 06:39:14 np0005625204.novalocal systemd-timedated[5268]: Changed time zone to 'UTC' (UTC).
Feb 20 06:39:14 np0005625204.novalocal sudo[5264]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:14 np0005625204.novalocal sudo[5285]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nsorclzwvylmqalqzgveiedqmktzzpfu ; /usr/bin/python3
Feb 20 06:39:14 np0005625204.novalocal sudo[5285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:14 np0005625204.novalocal python3[5287]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:14 np0005625204.novalocal sudo[5285]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:15 np0005625204.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:39:16 np0005625204.novalocal python3[5374]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1771569555.625111-497-82497591994618/source _original_basename=tmp78_25nce follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:17 np0005625204.novalocal python3[5434]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:39:17 np0005625204.novalocal python3[5475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771569557.1648777-588-22594415562177/source _original_basename=tmpc1tibo3d follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:19 np0005625204.novalocal sudo[5535]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afksckdujsthuptalkhjeknivaglprnw ; /usr/bin/python3
Feb 20 06:39:19 np0005625204.novalocal sudo[5535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:19 np0005625204.novalocal python3[5537]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:39:19 np0005625204.novalocal sudo[5535]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:19 np0005625204.novalocal sudo[5578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfdcovuvunxcxljsqkoqsejamhdhtmta ; /usr/bin/python3
Feb 20 06:39:19 np0005625204.novalocal sudo[5578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:19 np0005625204.novalocal python3[5580]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1771569559.2069945-732-235973663411204/source _original_basename=tmpyjse3d76 follow=False checksum=d65fea983e4ac4bc5449bcd3fb3aadcab86a1db0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:19 np0005625204.novalocal sudo[5578]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:20 np0005625204.novalocal python3[5608]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:39:21 np0005625204.novalocal python3[5624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:39:22 np0005625204.novalocal sudo[5672]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rukpdzqkaapehhqpkejkwawbagrgfmsa ; /usr/bin/python3
Feb 20 06:39:22 np0005625204.novalocal sudo[5672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:22 np0005625204.novalocal python3[5674]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:39:22 np0005625204.novalocal sudo[5672]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:22 np0005625204.novalocal sudo[5715]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqpsukskvmoqjcdottinqvsiqsmtmxqx ; /usr/bin/python3
Feb 20 06:39:22 np0005625204.novalocal sudo[5715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:22 np0005625204.novalocal python3[5717]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1771569562.132321-859-54902510195330/source _original_basename=tmp8m97n_kz follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:22 np0005625204.novalocal sudo[5715]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:23 np0005625204.novalocal sudo[5746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmsyfjxaisxknxhqwnijmohazbgxouyr ; /usr/bin/python3
Feb 20 06:39:23 np0005625204.novalocal sudo[5746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:23 np0005625204.novalocal python3[5748]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-ff2a-a63c-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:39:23 np0005625204.novalocal sudo[5746]: pam_unix(sudo:session): session closed for user root
Feb 20 06:39:25 np0005625204.novalocal python3[5766]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-ff2a-a63c-000000000024-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 20 06:39:26 np0005625204.novalocal python3[5785]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:44 np0005625204.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 20 06:39:46 np0005625204.novalocal sudo[5801]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbpynodcyocpombvphnbfidamkyqnqte ; /usr/bin/python3
Feb 20 06:39:46 np0005625204.novalocal sudo[5801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:39:46 np0005625204.novalocal python3[5803]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:39:46 np0005625204.novalocal sudo[5801]: pam_unix(sudo:session): session closed for user root
Feb 20 06:40:46 np0005625204.novalocal sshd[4189]: Received disconnect from 38.102.83.114 port 52564:11: disconnected by user
Feb 20 06:40:46 np0005625204.novalocal sshd[4189]: Disconnected from user zuul 38.102.83.114 port 52564
Feb 20 06:40:46 np0005625204.novalocal sshd[4176]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:40:46 np0005625204.novalocal systemd-logind[759]: Session 1 logged out. Waiting for processes to exit.
Feb 20 06:40:56 np0005625204.novalocal systemd[4180]: Starting Mark boot as successful...
Feb 20 06:40:56 np0005625204.novalocal systemd[4180]: Finished Mark boot as successful.
Feb 20 06:41:27 np0005625204.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Feb 20 06:41:27 np0005625204.novalocal systemd[1]: efi.mount: Deactivated successfully.
Feb 20 06:41:27 np0005625204.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Feb 20 06:43:31 np0005625204.novalocal sshd[5810]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:35 np0005625204.novalocal sshd[5810]: error: maximum authentication attempts exceeded for root from 45.15.225.137 port 38618 ssh2 [preauth]
Feb 20 06:43:35 np0005625204.novalocal sshd[5810]: Disconnecting authenticating user root 45.15.225.137 port 38618: Too many authentication failures [preauth]
Feb 20 06:43:35 np0005625204.novalocal sshd[5812]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:40 np0005625204.novalocal sshd[5812]: error: maximum authentication attempts exceeded for root from 45.15.225.137 port 38884 ssh2 [preauth]
Feb 20 06:43:40 np0005625204.novalocal sshd[5812]: Disconnecting authenticating user root 45.15.225.137 port 38884: Too many authentication failures [preauth]
Feb 20 06:43:40 np0005625204.novalocal sshd[5814]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:43 np0005625204.novalocal sshd[5814]: error: maximum authentication attempts exceeded for root from 45.15.225.137 port 39206 ssh2 [preauth]
Feb 20 06:43:43 np0005625204.novalocal sshd[5814]: Disconnecting authenticating user root 45.15.225.137 port 39206: Too many authentication failures [preauth]
Feb 20 06:43:43 np0005625204.novalocal sshd[5816]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:46 np0005625204.novalocal sshd[5816]: Received disconnect from 45.15.225.137 port 39430:11: disconnected by user [preauth]
Feb 20 06:43:46 np0005625204.novalocal sshd[5816]: Disconnected from authenticating user root 45.15.225.137 port 39430 [preauth]
Feb 20 06:43:46 np0005625204.novalocal sshd[5818]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Feb 20 06:43:47 np0005625204.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Feb 20 06:43:47 np0005625204.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2012] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 20 06:43:47 np0005625204.novalocal systemd-udevd[5821]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2156] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2191] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2196] device (eth1): carrier: link connected
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2200] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2207] policy: auto-activating connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8)
Feb 20 06:43:47 np0005625204.novalocal systemd[4180]: Created slice User Background Tasks Slice.
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2219] device (eth1): Activation: starting connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8)
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2222] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2229] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2236] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 20 06:43:47 np0005625204.novalocal NetworkManager[788]: <info>  [1771569827.2242] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 20 06:43:47 np0005625204.novalocal systemd[4180]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 06:43:47 np0005625204.novalocal systemd[4180]: Finished Cleanup of User's Temporary Files and Directories.
Feb 20 06:43:47 np0005625204.novalocal sshd[5824]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:47 np0005625204.novalocal sshd[5824]: Accepted publickey for zuul from 38.102.83.114 port 49156 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:43:47 np0005625204.novalocal systemd-logind[759]: New session 3 of user zuul.
Feb 20 06:43:47 np0005625204.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 20 06:43:47 np0005625204.novalocal sshd[5824]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:43:48 np0005625204.novalocal python3[5841]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-fb18-e746-000000000475-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:43:48 np0005625204.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Feb 20 06:43:49 np0005625204.novalocal sshd[5818]: Invalid user admin from 45.15.225.137 port 39616
Feb 20 06:43:50 np0005625204.novalocal sshd[5818]: error: maximum authentication attempts exceeded for invalid user admin from 45.15.225.137 port 39616 ssh2 [preauth]
Feb 20 06:43:50 np0005625204.novalocal sshd[5818]: Disconnecting invalid user admin 45.15.225.137 port 39616: Too many authentication failures [preauth]
Feb 20 06:43:50 np0005625204.novalocal sshd[5844]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:53 np0005625204.novalocal sshd[5844]: Invalid user admin from 45.15.225.137 port 39900
Feb 20 06:43:53 np0005625204.novalocal sshd[5844]: error: maximum authentication attempts exceeded for invalid user admin from 45.15.225.137 port 39900 ssh2 [preauth]
Feb 20 06:43:53 np0005625204.novalocal sshd[5844]: Disconnecting invalid user admin 45.15.225.137 port 39900: Too many authentication failures [preauth]
Feb 20 06:43:54 np0005625204.novalocal sshd[5846]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:43:56 np0005625204.novalocal sshd[5846]: Invalid user admin from 45.15.225.137 port 40136
Feb 20 06:43:57 np0005625204.novalocal sshd[5846]: Received disconnect from 45.15.225.137 port 40136:11: disconnected by user [preauth]
Feb 20 06:43:57 np0005625204.novalocal sshd[5846]: Disconnected from invalid user admin 45.15.225.137 port 40136 [preauth]
Feb 20 06:43:57 np0005625204.novalocal sshd[5848]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:01 np0005625204.novalocal sshd[5848]: Invalid user oracle from 45.15.225.137 port 40390
Feb 20 06:44:01 np0005625204.novalocal sudo[5895]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mljlgopwduhbazcsgffoenuazvefvdks ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 20 06:44:01 np0005625204.novalocal sudo[5895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:44:01 np0005625204.novalocal python3[5897]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:44:01 np0005625204.novalocal sudo[5895]: pam_unix(sudo:session): session closed for user root
Feb 20 06:44:01 np0005625204.novalocal sudo[5938]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pawoazfdtjsdtaiznheyytovcvjzcqjo ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 20 06:44:01 np0005625204.novalocal sudo[5938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:44:01 np0005625204.novalocal python3[5940]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771569841.1756814-537-139979349408384/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=3a3f85cb9d3a049d92e27c50f152d5abef64c350 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:44:01 np0005625204.novalocal sudo[5938]: pam_unix(sudo:session): session closed for user root
Feb 20 06:44:02 np0005625204.novalocal sshd[5848]: error: maximum authentication attempts exceeded for invalid user oracle from 45.15.225.137 port 40390 ssh2 [preauth]
Feb 20 06:44:02 np0005625204.novalocal sshd[5848]: Disconnecting invalid user oracle 45.15.225.137 port 40390: Too many authentication failures [preauth]
Feb 20 06:44:02 np0005625204.novalocal sudo[5968]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwoluzeikcpqqnsqgjdzawwmijqdjefn ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 20 06:44:02 np0005625204.novalocal sudo[5968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:44:02 np0005625204.novalocal sshd[5971]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:02 np0005625204.novalocal python3[5970]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Stopping Network Manager...
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.4880] caught SIGTERM, shutting down normally.
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5048] dhcp4 (eth0): canceled DHCP transaction
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5049] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5049] dhcp4 (eth0): state changed no lease
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5053] manager: NetworkManager state is now CONNECTING
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5148] dhcp4 (eth1): canceled DHCP transaction
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5149] dhcp4 (eth1): state changed no lease
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[788]: <info>  [1771569842.5215] exiting (success)
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Stopped Network Manager.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: NetworkManager.service: Consumed 2.215s CPU time.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Starting Network Manager...
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.5859] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:de550921-08e7-4d93-b7d2-b745d62af5c6)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.5862] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.5888] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Started Network Manager.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.5950] manager[0x55adec924090]: monitoring kernel firmware directory '/lib/firmware'.
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Starting Hostname Service...
Feb 20 06:44:02 np0005625204.novalocal sudo[5968]: pam_unix(sudo:session): session closed for user root
Feb 20 06:44:02 np0005625204.novalocal systemd[1]: Started Hostname Service.
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6671] hostname: hostname: using hostnamed
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6672] hostname: static hostname changed from (none) to "np0005625204.novalocal"
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6680] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6687] manager[0x55adec924090]: rfkill: Wi-Fi hardware radio set enabled
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6688] manager[0x55adec924090]: rfkill: WWAN hardware radio set enabled
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6728] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6729] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6729] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6730] manager: Networking is enabled by state file
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6746] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6747] settings: Loaded settings plugin: keyfile (internal)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6799] dhcp: init: Using DHCP client 'internal'
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6802] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6809] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6816] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6827] device (lo): Activation: starting connection 'lo' (b35a86af-6461-4196-bb8b-daceaa528560)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6835] device (eth0): carrier: link connected
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6841] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6846] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6847] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6855] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6864] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6871] device (eth1): carrier: link connected
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6877] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6884] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8) (indicated)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6884] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6890] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6899] device (eth1): Activation: starting connection 'Wired connection 1' (f176c6dc-cf7e-3130-a878-662e47281df8)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6922] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6925] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6928] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6930] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6934] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6936] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6940] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6943] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6949] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6952] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6964] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.6966] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7003] dhcp4 (eth0): state changed new lease, address=38.102.83.80
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7012] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7130] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7135] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7141] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7146] device (lo): Activation: successful, device activated.
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7186] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7188] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7193] manager: NetworkManager state is now CONNECTED_SITE
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7196] device (eth0): Activation: successful, device activated.
Feb 20 06:44:02 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569842.7201] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 20 06:44:02 np0005625204.novalocal python3[6042]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-fb18-e746-000000000136-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:44:05 np0005625204.novalocal sshd[5971]: Invalid user oracle from 45.15.225.137 port 40700
Feb 20 06:44:06 np0005625204.novalocal sshd[5971]: error: maximum authentication attempts exceeded for invalid user oracle from 45.15.225.137 port 40700 ssh2 [preauth]
Feb 20 06:44:06 np0005625204.novalocal sshd[5971]: Disconnecting invalid user oracle 45.15.225.137 port 40700: Too many authentication failures [preauth]
Feb 20 06:44:06 np0005625204.novalocal sshd[6057]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:08 np0005625204.novalocal sshd[6057]: Invalid user oracle from 45.15.225.137 port 40968
Feb 20 06:44:09 np0005625204.novalocal sshd[6057]: Received disconnect from 45.15.225.137 port 40968:11: disconnected by user [preauth]
Feb 20 06:44:09 np0005625204.novalocal sshd[6057]: Disconnected from invalid user oracle 45.15.225.137 port 40968 [preauth]
Feb 20 06:44:09 np0005625204.novalocal sshd[6059]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:12 np0005625204.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 06:44:13 np0005625204.novalocal sshd[6059]: Invalid user usuario from 45.15.225.137 port 41212
Feb 20 06:44:13 np0005625204.novalocal sshd[6059]: error: maximum authentication attempts exceeded for invalid user usuario from 45.15.225.137 port 41212 ssh2 [preauth]
Feb 20 06:44:13 np0005625204.novalocal sshd[6059]: Disconnecting invalid user usuario 45.15.225.137 port 41212: Too many authentication failures [preauth]
Feb 20 06:44:14 np0005625204.novalocal sshd[6061]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:16 np0005625204.novalocal sshd[6061]: Invalid user usuario from 45.15.225.137 port 41510
Feb 20 06:44:17 np0005625204.novalocal sshd[6061]: error: maximum authentication attempts exceeded for invalid user usuario from 45.15.225.137 port 41510 ssh2 [preauth]
Feb 20 06:44:17 np0005625204.novalocal sshd[6061]: Disconnecting invalid user usuario 45.15.225.137 port 41510: Too many authentication failures [preauth]
Feb 20 06:44:17 np0005625204.novalocal sshd[6063]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:20 np0005625204.novalocal sshd[6063]: Invalid user usuario from 45.15.225.137 port 41740
Feb 20 06:44:20 np0005625204.novalocal sshd[6063]: Received disconnect from 45.15.225.137 port 41740:11: disconnected by user [preauth]
Feb 20 06:44:20 np0005625204.novalocal sshd[6063]: Disconnected from invalid user usuario 45.15.225.137 port 41740 [preauth]
Feb 20 06:44:20 np0005625204.novalocal sshd[6065]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:23 np0005625204.novalocal sshd[6065]: Invalid user test from 45.15.225.137 port 41954
Feb 20 06:44:24 np0005625204.novalocal sshd[6065]: error: maximum authentication attempts exceeded for invalid user test from 45.15.225.137 port 41954 ssh2 [preauth]
Feb 20 06:44:24 np0005625204.novalocal sshd[6065]: Disconnecting invalid user test 45.15.225.137 port 41954: Too many authentication failures [preauth]
Feb 20 06:44:24 np0005625204.novalocal sshd[6067]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:27 np0005625204.novalocal sshd[6067]: Invalid user test from 45.15.225.137 port 42230
Feb 20 06:44:28 np0005625204.novalocal sshd[6067]: error: maximum authentication attempts exceeded for invalid user test from 45.15.225.137 port 42230 ssh2 [preauth]
Feb 20 06:44:28 np0005625204.novalocal sshd[6067]: Disconnecting invalid user test 45.15.225.137 port 42230: Too many authentication failures [preauth]
Feb 20 06:44:28 np0005625204.novalocal sshd[6069]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:30 np0005625204.novalocal sshd[6069]: Invalid user test from 45.15.225.137 port 42474
Feb 20 06:44:31 np0005625204.novalocal sshd[6069]: Received disconnect from 45.15.225.137 port 42474:11: disconnected by user [preauth]
Feb 20 06:44:31 np0005625204.novalocal sshd[6069]: Disconnected from invalid user test 45.15.225.137 port 42474 [preauth]
Feb 20 06:44:31 np0005625204.novalocal sshd[6071]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:32 np0005625204.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 06:44:33 np0005625204.novalocal sshd[6071]: Invalid user user from 45.15.225.137 port 42676
Feb 20 06:44:34 np0005625204.novalocal sshd[6071]: error: maximum authentication attempts exceeded for invalid user user from 45.15.225.137 port 42676 ssh2 [preauth]
Feb 20 06:44:34 np0005625204.novalocal sshd[6071]: Disconnecting invalid user user 45.15.225.137 port 42676: Too many authentication failures [preauth]
Feb 20 06:44:34 np0005625204.novalocal sshd[6076]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:37 np0005625204.novalocal sshd[6076]: Invalid user user from 45.15.225.137 port 42922
Feb 20 06:44:38 np0005625204.novalocal sshd[6076]: error: maximum authentication attempts exceeded for invalid user user from 45.15.225.137 port 42922 ssh2 [preauth]
Feb 20 06:44:38 np0005625204.novalocal sshd[6076]: Disconnecting invalid user user 45.15.225.137 port 42922: Too many authentication failures [preauth]
Feb 20 06:44:38 np0005625204.novalocal sshd[6078]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:41 np0005625204.novalocal sshd[6078]: Invalid user user from 45.15.225.137 port 43174
Feb 20 06:44:41 np0005625204.novalocal sshd[6078]: Received disconnect from 45.15.225.137 port 43174:11: disconnected by user [preauth]
Feb 20 06:44:41 np0005625204.novalocal sshd[6078]: Disconnected from invalid user user 45.15.225.137 port 43174 [preauth]
Feb 20 06:44:41 np0005625204.novalocal sshd[6080]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:45 np0005625204.novalocal sshd[6080]: Invalid user ftpuser from 45.15.225.137 port 43400
Feb 20 06:44:46 np0005625204.novalocal sshd[6080]: error: maximum authentication attempts exceeded for invalid user ftpuser from 45.15.225.137 port 43400 ssh2 [preauth]
Feb 20 06:44:46 np0005625204.novalocal sshd[6080]: Disconnecting invalid user ftpuser 45.15.225.137 port 43400: Too many authentication failures [preauth]
Feb 20 06:44:46 np0005625204.novalocal sshd[6082]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:47 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569887.7621] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:47 np0005625204.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 06:44:47 np0005625204.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 06:44:47 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569887.7860] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:47 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569887.7863] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 20 06:44:47 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569887.7873] device (eth1): Activation: successful, device activated.
Feb 20 06:44:47 np0005625204.novalocal NetworkManager[5988]: <info>  [1771569887.7881] manager: startup complete
Feb 20 06:44:47 np0005625204.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 20 06:44:51 np0005625204.novalocal sshd[6082]: Invalid user ftpuser from 45.15.225.137 port 43758
Feb 20 06:44:52 np0005625204.novalocal sshd[6082]: error: maximum authentication attempts exceeded for invalid user ftpuser from 45.15.225.137 port 43758 ssh2 [preauth]
Feb 20 06:44:52 np0005625204.novalocal sshd[6082]: Disconnecting invalid user ftpuser 45.15.225.137 port 43758: Too many authentication failures [preauth]
Feb 20 06:44:52 np0005625204.novalocal sshd[6097]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:55 np0005625204.novalocal sshd[6097]: Invalid user ftpuser from 45.15.225.137 port 44142
Feb 20 06:44:55 np0005625204.novalocal sshd[6097]: Received disconnect from 45.15.225.137 port 44142:11: disconnected by user [preauth]
Feb 20 06:44:55 np0005625204.novalocal sshd[6097]: Disconnected from invalid user ftpuser 45.15.225.137 port 44142 [preauth]
Feb 20 06:44:56 np0005625204.novalocal sshd[6099]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:44:57 np0005625204.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 06:44:58 np0005625204.novalocal sshd[6099]: Invalid user test1 from 45.15.225.137 port 44350
Feb 20 06:44:59 np0005625204.novalocal sshd[6099]: error: maximum authentication attempts exceeded for invalid user test1 from 45.15.225.137 port 44350 ssh2 [preauth]
Feb 20 06:44:59 np0005625204.novalocal sshd[6099]: Disconnecting invalid user test1 45.15.225.137 port 44350: Too many authentication failures [preauth]
Feb 20 06:44:59 np0005625204.novalocal sshd[6101]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:02 np0005625204.novalocal sshd[6101]: Invalid user test1 from 45.15.225.137 port 44596
Feb 20 06:45:02 np0005625204.novalocal sshd[5827]: Received disconnect from 38.102.83.114 port 49156:11: disconnected by user
Feb 20 06:45:02 np0005625204.novalocal sshd[5827]: Disconnected from user zuul 38.102.83.114 port 49156
Feb 20 06:45:02 np0005625204.novalocal sshd[5824]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:45:03 np0005625204.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 20 06:45:03 np0005625204.novalocal systemd[1]: session-3.scope: Consumed 1.537s CPU time.
Feb 20 06:45:03 np0005625204.novalocal systemd-logind[759]: Session 3 logged out. Waiting for processes to exit.
Feb 20 06:45:03 np0005625204.novalocal systemd-logind[759]: Removed session 3.
Feb 20 06:45:03 np0005625204.novalocal sshd[6101]: error: maximum authentication attempts exceeded for invalid user test1 from 45.15.225.137 port 44596 ssh2 [preauth]
Feb 20 06:45:03 np0005625204.novalocal sshd[6101]: Disconnecting invalid user test1 45.15.225.137 port 44596: Too many authentication failures [preauth]
Feb 20 06:45:03 np0005625204.novalocal sshd[6103]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:05 np0005625204.novalocal sshd[6103]: Invalid user test1 from 45.15.225.137 port 44860
Feb 20 06:45:06 np0005625204.novalocal sshd[6103]: Received disconnect from 45.15.225.137 port 44860:11: disconnected by user [preauth]
Feb 20 06:45:06 np0005625204.novalocal sshd[6103]: Disconnected from invalid user test1 45.15.225.137 port 44860 [preauth]
Feb 20 06:45:06 np0005625204.novalocal sshd[6105]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:09 np0005625204.novalocal sshd[6105]: Invalid user test2 from 45.15.225.137 port 45044
Feb 20 06:45:09 np0005625204.novalocal sshd[6105]: error: maximum authentication attempts exceeded for invalid user test2 from 45.15.225.137 port 45044 ssh2 [preauth]
Feb 20 06:45:09 np0005625204.novalocal sshd[6105]: Disconnecting invalid user test2 45.15.225.137 port 45044: Too many authentication failures [preauth]
Feb 20 06:45:10 np0005625204.novalocal sshd[6107]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:13 np0005625204.novalocal sshd[6107]: Invalid user test2 from 45.15.225.137 port 45326
Feb 20 06:45:14 np0005625204.novalocal sshd[6107]: error: maximum authentication attempts exceeded for invalid user test2 from 45.15.225.137 port 45326 ssh2 [preauth]
Feb 20 06:45:14 np0005625204.novalocal sshd[6107]: Disconnecting invalid user test2 45.15.225.137 port 45326: Too many authentication failures [preauth]
Feb 20 06:45:15 np0005625204.novalocal sshd[6109]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:17 np0005625204.novalocal sshd[6109]: Invalid user test2 from 45.15.225.137 port 45642
Feb 20 06:45:18 np0005625204.novalocal sshd[6109]: Received disconnect from 45.15.225.137 port 45642:11: disconnected by user [preauth]
Feb 20 06:45:18 np0005625204.novalocal sshd[6109]: Disconnected from invalid user test2 45.15.225.137 port 45642 [preauth]
Feb 20 06:45:18 np0005625204.novalocal sshd[6111]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:21 np0005625204.novalocal sshd[6111]: Invalid user ubuntu from 45.15.225.137 port 45860
Feb 20 06:45:22 np0005625204.novalocal sshd[6111]: error: maximum authentication attempts exceeded for invalid user ubuntu from 45.15.225.137 port 45860 ssh2 [preauth]
Feb 20 06:45:22 np0005625204.novalocal sshd[6111]: Disconnecting invalid user ubuntu 45.15.225.137 port 45860: Too many authentication failures [preauth]
Feb 20 06:45:22 np0005625204.novalocal sshd[6113]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:25 np0005625204.novalocal sshd[6115]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:25 np0005625204.novalocal sshd[6115]: Accepted publickey for zuul from 38.102.83.114 port 47484 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:45:25 np0005625204.novalocal systemd-logind[759]: New session 4 of user zuul.
Feb 20 06:45:25 np0005625204.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 20 06:45:25 np0005625204.novalocal sshd[6115]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:45:25 np0005625204.novalocal sudo[6164]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsoakocwqvupuhvgdsfibsscwdcvesxn ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 20 06:45:25 np0005625204.novalocal sudo[6164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:45:25 np0005625204.novalocal python3[6166]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:45:25 np0005625204.novalocal sudo[6164]: pam_unix(sudo:session): session closed for user root
Feb 20 06:45:25 np0005625204.novalocal sshd[6113]: Invalid user ubuntu from 45.15.225.137 port 46202
Feb 20 06:45:25 np0005625204.novalocal sudo[6207]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glnvmdfhmfmyqcxkvgrhnvsvjwmzsxjg ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 20 06:45:25 np0005625204.novalocal sudo[6207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:45:25 np0005625204.novalocal python3[6209]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771569925.270666-628-183704899478326/source _original_basename=tmpc235pqzn follow=False checksum=1adafc0c3cabf5458281c7d741082eddefa40194 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:45:25 np0005625204.novalocal sudo[6207]: pam_unix(sudo:session): session closed for user root
Feb 20 06:45:26 np0005625204.novalocal sshd[6113]: error: maximum authentication attempts exceeded for invalid user ubuntu from 45.15.225.137 port 46202 ssh2 [preauth]
Feb 20 06:45:26 np0005625204.novalocal sshd[6113]: Disconnecting invalid user ubuntu 45.15.225.137 port 46202: Too many authentication failures [preauth]
Feb 20 06:45:26 np0005625204.novalocal sshd[6224]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:28 np0005625204.novalocal sshd[6115]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:45:28 np0005625204.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 20 06:45:28 np0005625204.novalocal systemd-logind[759]: Session 4 logged out. Waiting for processes to exit.
Feb 20 06:45:28 np0005625204.novalocal systemd-logind[759]: Removed session 4.
Feb 20 06:45:29 np0005625204.novalocal sshd[6224]: Invalid user ubuntu from 45.15.225.137 port 46454
Feb 20 06:45:29 np0005625204.novalocal sshd[6224]: Received disconnect from 45.15.225.137 port 46454:11: disconnected by user [preauth]
Feb 20 06:45:29 np0005625204.novalocal sshd[6224]: Disconnected from invalid user ubuntu 45.15.225.137 port 46454 [preauth]
Feb 20 06:45:30 np0005625204.novalocal sshd[6226]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:32 np0005625204.novalocal sshd[6226]: Invalid user pi from 45.15.225.137 port 46672
Feb 20 06:45:33 np0005625204.novalocal sshd[6226]: Received disconnect from 45.15.225.137 port 46672:11: disconnected by user [preauth]
Feb 20 06:45:33 np0005625204.novalocal sshd[6226]: Disconnected from invalid user pi 45.15.225.137 port 46672 [preauth]
Feb 20 06:45:33 np0005625204.novalocal sshd[6228]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:45:36 np0005625204.novalocal sshd[6228]: Invalid user baikal from 45.15.225.137 port 46932
Feb 20 06:45:36 np0005625204.novalocal sshd[6228]: Received disconnect from 45.15.225.137 port 46932:11: disconnected by user [preauth]
Feb 20 06:45:36 np0005625204.novalocal sshd[6228]: Disconnected from invalid user baikal 45.15.225.137 port 46932 [preauth]
Feb 20 06:52:27 np0005625204.novalocal sshd[6233]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:52:27 np0005625204.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Feb 20 06:52:27 np0005625204.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 20 06:52:27 np0005625204.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Feb 20 06:52:27 np0005625204.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 20 06:52:27 np0005625204.novalocal sshd[6233]: Accepted publickey for zuul from 38.102.83.114 port 39478 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:52:27 np0005625204.novalocal systemd-logind[759]: New session 5 of user zuul.
Feb 20 06:52:27 np0005625204.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 20 06:52:27 np0005625204.novalocal sshd[6233]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:52:27 np0005625204.novalocal sudo[6254]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwefovwewscemxpypogdtsyxzposxych ; /usr/bin/python3
Feb 20 06:52:27 np0005625204.novalocal sudo[6254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:27 np0005625204.novalocal python3[6256]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-12ca-cc53-00000000219f-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:52:27 np0005625204.novalocal sudo[6254]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:29 np0005625204.novalocal sudo[6273]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmpjvsrlrjtqqmycaefamacaxpnjrphb ; /usr/bin/python3
Feb 20 06:52:29 np0005625204.novalocal sudo[6273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:29 np0005625204.novalocal python3[6275]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:52:29 np0005625204.novalocal sudo[6273]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:29 np0005625204.novalocal sudo[6289]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adowwwpwxdsnfzvjuqyoxlijsunajaem ; /usr/bin/python3
Feb 20 06:52:29 np0005625204.novalocal sudo[6289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:29 np0005625204.novalocal python3[6291]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:52:29 np0005625204.novalocal sudo[6289]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:29 np0005625204.novalocal sudo[6305]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeygigslitkqbfmcsdmiagdeyyccyzis ; /usr/bin/python3
Feb 20 06:52:29 np0005625204.novalocal sudo[6305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:29 np0005625204.novalocal python3[6307]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:52:29 np0005625204.novalocal sudo[6305]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:29 np0005625204.novalocal sudo[6321]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtceluusmsiieqqoofzyrynkvlmxfxyy ; /usr/bin/python3
Feb 20 06:52:29 np0005625204.novalocal sudo[6321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:30 np0005625204.novalocal python3[6323]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:52:30 np0005625204.novalocal sudo[6321]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:30 np0005625204.novalocal sudo[6337]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxubtwluixbdbcaprhjsbbtlosaoskcx ; /usr/bin/python3
Feb 20 06:52:30 np0005625204.novalocal sudo[6337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:30 np0005625204.novalocal python3[6339]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:52:30 np0005625204.novalocal sudo[6337]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:31 np0005625204.novalocal sudo[6385]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sivcnbhdqncbxpfimxlnjpbnlskdnhzw ; /usr/bin/python3
Feb 20 06:52:31 np0005625204.novalocal sudo[6385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:32 np0005625204.novalocal python3[6387]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:52:32 np0005625204.novalocal sudo[6385]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:32 np0005625204.novalocal sudo[6428]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsornipyszhlurmlnqhmlzzzhtnhcfpr ; /usr/bin/python3
Feb 20 06:52:32 np0005625204.novalocal sudo[6428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:32 np0005625204.novalocal python3[6430]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771570351.7884274-667-182459782895875/source _original_basename=tmp_jn8vaqc follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:52:32 np0005625204.novalocal sudo[6428]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:33 np0005625204.novalocal sudo[6458]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dweyczznfttieqisigasdrcqivwksgkh ; /usr/bin/python3
Feb 20 06:52:33 np0005625204.novalocal sudo[6458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:34 np0005625204.novalocal python3[6460]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 06:52:34 np0005625204.novalocal systemd[1]: Reloading.
Feb 20 06:52:34 np0005625204.novalocal systemd-rc-local-generator[6481]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 06:52:34 np0005625204.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 06:52:34 np0005625204.novalocal sudo[6458]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:35 np0005625204.novalocal sudo[6505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onfejhexxhlmoizbolykfsykwlqjamjf ; /usr/bin/python3
Feb 20 06:52:35 np0005625204.novalocal sudo[6505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:35 np0005625204.novalocal python3[6507]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 20 06:52:35 np0005625204.novalocal sudo[6505]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:36 np0005625204.novalocal sudo[6521]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwcmeesbxvbywoboepxizpzpwbeniaot ; /usr/bin/python3
Feb 20 06:52:36 np0005625204.novalocal sudo[6521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:36 np0005625204.novalocal python3[6523]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:52:36 np0005625204.novalocal sudo[6521]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:37 np0005625204.novalocal sudo[6539]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-peigggjqtutgmcykcpzbpuwnevjfctlo ; /usr/bin/python3
Feb 20 06:52:37 np0005625204.novalocal sudo[6539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:37 np0005625204.novalocal python3[6541]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:52:37 np0005625204.novalocal sudo[6539]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:37 np0005625204.novalocal sudo[6557]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mugfbzxndbxkwqgfusdakmqobofiyexj ; /usr/bin/python3
Feb 20 06:52:37 np0005625204.novalocal sudo[6557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:37 np0005625204.novalocal python3[6559]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:52:37 np0005625204.novalocal sudo[6557]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:37 np0005625204.novalocal sudo[6575]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avbttdkunivxydehqmlmfjxujaxzrzqp ; /usr/bin/python3
Feb 20 06:52:37 np0005625204.novalocal sudo[6575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:52:37 np0005625204.novalocal python3[6577]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:52:37 np0005625204.novalocal sudo[6575]: pam_unix(sudo:session): session closed for user root
Feb 20 06:52:48 np0005625204.novalocal python3[6594]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-12ca-cc53-0000000021a6-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:52:49 np0005625204.novalocal python3[6614]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 06:52:52 np0005625204.novalocal sshd[6233]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:52:52 np0005625204.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 20 06:52:52 np0005625204.novalocal systemd[1]: session-5.scope: Consumed 4.053s CPU time.
Feb 20 06:52:52 np0005625204.novalocal systemd-logind[759]: Session 5 logged out. Waiting for processes to exit.
Feb 20 06:52:52 np0005625204.novalocal systemd-logind[759]: Removed session 5.
Feb 20 06:53:53 np0005625204.novalocal sshd[6620]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:53:53 np0005625204.novalocal sshd[6620]: Accepted publickey for zuul from 38.102.83.114 port 46628 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:53:53 np0005625204.novalocal systemd-logind[759]: New session 6 of user zuul.
Feb 20 06:53:53 np0005625204.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 20 06:53:53 np0005625204.novalocal sshd[6620]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:53:53 np0005625204.novalocal sudo[6637]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzsxivwmosekkmucqzmgqbirnyiiiwte ; /usr/bin/python3
Feb 20 06:53:53 np0005625204.novalocal sudo[6637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:53:54 np0005625204.novalocal systemd[1]: Starting RHSM dbus service...
Feb 20 06:53:54 np0005625204.novalocal systemd[1]: Started RHSM dbus service.
Feb 20 06:53:54 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 06:53:54 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 06:53:54 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 06:53:54 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005625204.novalocal (430a9023-94d5-4ff5-8ad4-f0155783873a)
Feb 20 06:53:56 np0005625204.novalocal subscription-manager[6644]: Registered system with identity: 430a9023-94d5-4ff5-8ad4-f0155783873a
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.entcertlib:131] certs updated:
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]: Total updates: 1
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]: Found (local) serial# []
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]: Expected (UEP) serial# [5588354398145753591]
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]: Added (new)
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]:   [sn:5588354398145753591 ( Content Access,) @ /etc/pki/entitlement/5588354398145753591.pem]
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]: Deleted (rogue):
Feb 20 06:53:56 np0005625204.novalocal rhsm-service[6644]:   <NONE>
Feb 20 06:53:56 np0005625204.novalocal subscription-manager[6644]: Added subscription for 'Content Access' contract 'None'
Feb 20 06:53:56 np0005625204.novalocal subscription-manager[6644]: Added subscription for product ' Content Access'
Feb 20 06:53:57 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 06:53:57 np0005625204.novalocal rhsm-service[6644]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 20 06:53:57 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:53:58 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:53:58 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:53:58 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:53:58 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:53:59 np0005625204.novalocal sudo[6637]: pam_unix(sudo:session): session closed for user root
Feb 20 06:54:00 np0005625204.novalocal python3[6735]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-d2eb-5884-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 06:55:00 np0005625204.novalocal sshd[6623]: Received disconnect from 38.102.83.114 port 46628:11: disconnected by user
Feb 20 06:55:00 np0005625204.novalocal sshd[6623]: Disconnected from user zuul 38.102.83.114 port 46628
Feb 20 06:55:00 np0005625204.novalocal sshd[6620]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:55:00 np0005625204.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Feb 20 06:55:00 np0005625204.novalocal systemd[1]: session-6.scope: Consumed 1.567s CPU time.
Feb 20 06:55:00 np0005625204.novalocal systemd-logind[759]: Session 6 logged out. Waiting for processes to exit.
Feb 20 06:55:00 np0005625204.novalocal systemd-logind[759]: Removed session 6.
Feb 20 06:55:03 np0005625204.novalocal sshd[6740]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:55:03 np0005625204.novalocal sshd[6740]: Accepted publickey for zuul from 38.102.83.114 port 34018 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:55:03 np0005625204.novalocal systemd-logind[759]: New session 7 of user zuul.
Feb 20 06:55:03 np0005625204.novalocal systemd[1]: Started Session 7 of User zuul.
Feb 20 06:55:03 np0005625204.novalocal sshd[6740]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:55:03 np0005625204.novalocal sudo[6757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqxzbtqntomavyyrkugyphlrsqwhkqnr ; /usr/bin/python3
Feb 20 06:55:03 np0005625204.novalocal sudo[6757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:55:03 np0005625204.novalocal python3[6759]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 06:55:09 np0005625204.novalocal sshd[6766]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:55:09 np0005625204.novalocal sshd[6766]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 06:55:09 np0005625204.novalocal sshd[6766]: Connection closed by 119.148.49.82 port 53680
Feb 20 06:55:28 np0005625204.novalocal sshd[6779]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:55:32 np0005625204.novalocal setsebool[6836]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 20 06:55:32 np0005625204.novalocal setsebool[6836]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  Converting 406 SID table entries...
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 06:55:41 np0005625204.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 06:55:53 np0005625204.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Feb 20 06:55:53 np0005625204.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 06:55:53 np0005625204.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 20 06:55:53 np0005625204.novalocal systemd[1]: Reloading.
Feb 20 06:55:53 np0005625204.novalocal systemd-rc-local-generator[7693]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 06:55:53 np0005625204.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 06:55:53 np0005625204.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 06:55:55 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:55:55 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 06:55:55 np0005625204.novalocal sudo[6757]: pam_unix(sudo:session): session closed for user root
Feb 20 06:55:59 np0005625204.novalocal sudo[16003]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nravjiewphmmhnmupziqwjuaqejrjadm ; /usr/bin/python3
Feb 20 06:56:00 np0005625204.novalocal sudo[16003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:56:00 np0005625204.novalocal podman[16246]: 2026-02-20 06:56:00.28808241 +0000 UTC m=+0.103332790 system refresh
Feb 20 06:56:00 np0005625204.novalocal sudo[16003]: pam_unix(sudo:session): session closed for user root
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: Starting D-Bus User Message Bus...
Feb 20 06:56:01 np0005625204.novalocal dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 20 06:56:01 np0005625204.novalocal dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: Started D-Bus User Message Bus.
Feb 20 06:56:01 np0005625204.novalocal dbus-broker-lau[17399]: Ready
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: Created slice Slice /user.
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: podman-17269.scope: unit configures an IP firewall, but not running as root.
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: (This warning is only shown for the first unit using IP firewalling.)
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: Started podman-17269.scope.
Feb 20 06:56:01 np0005625204.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 06:56:01 np0005625204.novalocal systemd[4180]: Started podman-pause-6d03a9a2.scope.
Feb 20 06:56:02 np0005625204.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 06:56:02 np0005625204.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 20 06:56:02 np0005625204.novalocal systemd[1]: man-db-cache-update.service: Consumed 10.323s CPU time.
Feb 20 06:56:02 np0005625204.novalocal systemd[1]: run-r45b5324098c246c3ae3dacad90a0c586.service: Deactivated successfully.
Feb 20 06:56:03 np0005625204.novalocal sshd[6740]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:56:03 np0005625204.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Feb 20 06:56:03 np0005625204.novalocal systemd[1]: session-7.scope: Consumed 48.546s CPU time.
Feb 20 06:56:03 np0005625204.novalocal systemd-logind[759]: Session 7 logged out. Waiting for processes to exit.
Feb 20 06:56:03 np0005625204.novalocal systemd-logind[759]: Removed session 7.
Feb 20 06:56:19 np0005625204.novalocal sshd[18493]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:56:19 np0005625204.novalocal sshd[18492]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:56:19 np0005625204.novalocal sshd[18494]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:56:19 np0005625204.novalocal sshd[18496]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:56:19 np0005625204.novalocal sshd[18495]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:56:19 np0005625204.novalocal sshd[18492]: Unable to negotiate with 38.102.83.74 port 57558: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 20 06:56:19 np0005625204.novalocal sshd[18493]: Unable to negotiate with 38.102.83.74 port 57574: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 20 06:56:19 np0005625204.novalocal sshd[18494]: Connection closed by 38.102.83.74 port 57540 [preauth]
Feb 20 06:56:19 np0005625204.novalocal sshd[18496]: Connection closed by 38.102.83.74 port 57542 [preauth]
Feb 20 06:56:19 np0005625204.novalocal sshd[18495]: Unable to negotiate with 38.102.83.74 port 57578: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 20 06:56:23 np0005625204.novalocal sshd[18502]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:56:23 np0005625204.novalocal sshd[18502]: Accepted publickey for zuul from 38.102.83.114 port 53626 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:56:23 np0005625204.novalocal systemd-logind[759]: New session 8 of user zuul.
Feb 20 06:56:23 np0005625204.novalocal systemd[1]: Started Session 8 of User zuul.
Feb 20 06:56:23 np0005625204.novalocal sshd[18502]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:56:23 np0005625204.novalocal python3[18519]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHF6ws6TTGIgpcynk+zfDmAiKAngdz4qTSYI5OZYL/Nj9dQsVH9D0sSlKxQpeRN7puQyuA81owKWTQGJzf43DRQ= zuul@np0005625196.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:56:24 np0005625204.novalocal sudo[18533]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqstbcxzllwbnsywgkffnbufranhajer ; /usr/bin/python3
Feb 20 06:56:24 np0005625204.novalocal sudo[18533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:56:24 np0005625204.novalocal python3[18535]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHF6ws6TTGIgpcynk+zfDmAiKAngdz4qTSYI5OZYL/Nj9dQsVH9D0sSlKxQpeRN7puQyuA81owKWTQGJzf43DRQ= zuul@np0005625196.novalocal manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:56:24 np0005625204.novalocal sudo[18533]: pam_unix(sudo:session): session closed for user root
Feb 20 06:56:26 np0005625204.novalocal sshd[18502]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:56:26 np0005625204.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Feb 20 06:56:26 np0005625204.novalocal systemd-logind[759]: Session 8 logged out. Waiting for processes to exit.
Feb 20 06:56:26 np0005625204.novalocal systemd-logind[759]: Removed session 8.
Feb 20 06:57:28 np0005625204.novalocal sshd[6779]: fatal: Timeout before authentication for 124.70.128.34 port 36624
Feb 20 06:57:45 np0005625204.novalocal sshd[18538]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:57:45 np0005625204.novalocal sshd[18538]: Accepted publickey for zuul from 38.102.83.114 port 34802 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 06:57:45 np0005625204.novalocal systemd-logind[759]: New session 9 of user zuul.
Feb 20 06:57:45 np0005625204.novalocal systemd[1]: Started Session 9 of User zuul.
Feb 20 06:57:45 np0005625204.novalocal sshd[18538]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 06:57:45 np0005625204.novalocal sudo[18555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkqvfxbvutujkdnkkprnqtdaebhomglj ; /usr/bin/python3
Feb 20 06:57:45 np0005625204.novalocal sudo[18555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:45 np0005625204.novalocal python3[18557]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 06:57:45 np0005625204.novalocal sudo[18555]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:46 np0005625204.novalocal sudo[18571]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpplpmbgqfvluuqizuvhnplrethcorht ; /usr/bin/python3
Feb 20 06:57:46 np0005625204.novalocal sudo[18571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:46 np0005625204.novalocal python3[18573]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 20 06:57:46 np0005625204.novalocal sudo[18571]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:47 np0005625204.novalocal sudo[18621]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgzqvpbbpxstiglhcnsqhyiddnoioycu ; /usr/bin/python3
Feb 20 06:57:47 np0005625204.novalocal sudo[18621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:48 np0005625204.novalocal python3[18623]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:57:48 np0005625204.novalocal sudo[18621]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:48 np0005625204.novalocal sudo[18664]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uejfjobpehufpwerdrusyxfcszayhjjl ; /usr/bin/python3
Feb 20 06:57:48 np0005625204.novalocal sudo[18664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:48 np0005625204.novalocal python3[18666]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771570667.8219144-139-8684751804930/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa follow=False checksum=1ede725f5cdca64ff103c7e62f7bb7b42f0b9244 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:57:48 np0005625204.novalocal sudo[18664]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:49 np0005625204.novalocal sudo[18726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asfrpljeqyqoflimfvpfzgwezmaljnob ; /usr/bin/python3
Feb 20 06:57:49 np0005625204.novalocal sudo[18726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:49 np0005625204.novalocal python3[18728]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:57:49 np0005625204.novalocal sudo[18726]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:49 np0005625204.novalocal sudo[18769]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fegruziavhoswboaktyxrnyauhjmdmbp ; /usr/bin/python3
Feb 20 06:57:49 np0005625204.novalocal sudo[18769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:50 np0005625204.novalocal python3[18771]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771570669.45577-228-268540852689489/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=2f497d714b8e492188c98067e59a9994_id_rsa.pub follow=False checksum=d5896bb6dcd221ffe99ce3acccb68a5152af8369 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:57:50 np0005625204.novalocal sudo[18769]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:52 np0005625204.novalocal sudo[18799]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piknpvzavhkzuxfkuffqghffiujvlnvk ; /usr/bin/python3
Feb 20 06:57:52 np0005625204.novalocal sudo[18799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 06:57:52 np0005625204.novalocal python3[18801]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:57:52 np0005625204.novalocal sudo[18799]: pam_unix(sudo:session): session closed for user root
Feb 20 06:57:53 np0005625204.novalocal python3[18847]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:57:53 np0005625204.novalocal python3[18863]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmp40v7jvle recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:57:54 np0005625204.novalocal python3[18923]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:57:54 np0005625204.novalocal python3[18939]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmppwxjlel3 recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:57:56 np0005625204.novalocal python3[18999]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 06:57:56 np0005625204.novalocal python3[19015]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmpffh2s3k2 recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 06:57:57 np0005625204.novalocal sshd[18538]: pam_unix(sshd:session): session closed for user zuul
Feb 20 06:57:57 np0005625204.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Feb 20 06:57:57 np0005625204.novalocal systemd[1]: session-9.scope: Consumed 3.587s CPU time.
Feb 20 06:57:57 np0005625204.novalocal systemd-logind[759]: Session 9 logged out. Waiting for processes to exit.
Feb 20 06:57:57 np0005625204.novalocal systemd-logind[759]: Removed session 9.
Feb 20 06:58:12 np0005625204.novalocal sshd[19030]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:58:12 np0005625204.novalocal sshd[19031]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 06:58:13 np0005625204.novalocal sshd[19031]: error: kex_exchange_identification: read: Connection reset by peer
Feb 20 06:58:13 np0005625204.novalocal sshd[19031]: Connection reset by 176.120.22.52 port 9263
Feb 20 07:00:07 np0005625204.novalocal sshd[19033]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:00:07 np0005625204.novalocal sshd[19033]: Accepted publickey for zuul from 38.102.83.74 port 49632 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:00:07 np0005625204.novalocal systemd-logind[759]: New session 10 of user zuul.
Feb 20 07:00:07 np0005625204.novalocal systemd[1]: Started Session 10 of User zuul.
Feb 20 07:00:07 np0005625204.novalocal sshd[19033]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 07:00:07 np0005625204.novalocal python3[19079]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:01:01 np0005625204.novalocal CROND[19082]: (root) CMD (run-parts /etc/cron.hourly)
Feb 20 07:01:01 np0005625204.novalocal run-parts[19085]: (/etc/cron.hourly) starting 0anacron
Feb 20 07:01:01 np0005625204.novalocal anacron[19093]: Anacron started on 2026-02-20
Feb 20 07:01:01 np0005625204.novalocal anacron[19093]: Will run job `cron.daily' in 14 min.
Feb 20 07:01:01 np0005625204.novalocal anacron[19093]: Will run job `cron.weekly' in 34 min.
Feb 20 07:01:01 np0005625204.novalocal anacron[19093]: Will run job `cron.monthly' in 54 min.
Feb 20 07:01:01 np0005625204.novalocal anacron[19093]: Jobs will be executed sequentially
Feb 20 07:01:01 np0005625204.novalocal run-parts[19095]: (/etc/cron.hourly) finished 0anacron
Feb 20 07:01:01 np0005625204.novalocal CROND[19081]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 20 07:05:06 np0005625204.novalocal sshd[19036]: Received disconnect from 38.102.83.74 port 49632:11: disconnected by user
Feb 20 07:05:06 np0005625204.novalocal sshd[19036]: Disconnected from user zuul 38.102.83.74 port 49632
Feb 20 07:05:06 np0005625204.novalocal sshd[19033]: pam_unix(sshd:session): session closed for user zuul
Feb 20 07:05:06 np0005625204.novalocal systemd[1]: session-10.scope: Deactivated successfully.
Feb 20 07:05:06 np0005625204.novalocal systemd-logind[759]: Session 10 logged out. Waiting for processes to exit.
Feb 20 07:05:06 np0005625204.novalocal systemd-logind[759]: Removed session 10.
Feb 20 07:06:08 np0005625204.novalocal sshd[19098]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:06:09 np0005625204.novalocal sshd[19098]: Invalid user titu from 189.143.72.189 port 59726
Feb 20 07:06:09 np0005625204.novalocal sshd[19098]: Received disconnect from 189.143.72.189 port 59726:11: Bye Bye [preauth]
Feb 20 07:06:09 np0005625204.novalocal sshd[19098]: Disconnected from invalid user titu 189.143.72.189 port 59726 [preauth]
Feb 20 07:07:29 np0005625204.novalocal sshd[19100]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:07:30 np0005625204.novalocal sshd[19100]: Connection closed by 14.103.118.194 port 36538 [preauth]
Feb 20 07:08:06 np0005625204.novalocal sshd[19102]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:08:07 np0005625204.novalocal sshd[19102]: Invalid user sonar from 151.252.84.225 port 40058
Feb 20 07:08:07 np0005625204.novalocal sshd[19102]: Received disconnect from 151.252.84.225 port 40058:11: Bye Bye [preauth]
Feb 20 07:08:07 np0005625204.novalocal sshd[19102]: Disconnected from invalid user sonar 151.252.84.225 port 40058 [preauth]
Feb 20 07:08:50 np0005625204.novalocal sshd[19105]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:08:51 np0005625204.novalocal sshd[19105]: Received disconnect from 202.165.22.246 port 45976:11: Bye Bye [preauth]
Feb 20 07:08:51 np0005625204.novalocal sshd[19105]: Disconnected from authenticating user root 202.165.22.246 port 45976 [preauth]
Feb 20 07:11:23 np0005625204.novalocal sshd[19109]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:11:24 np0005625204.novalocal sshd[19109]: Accepted publickey for zuul from 38.102.83.114 port 37370 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:11:24 np0005625204.novalocal systemd-logind[759]: New session 11 of user zuul.
Feb 20 07:11:24 np0005625204.novalocal systemd[1]: Started Session 11 of User zuul.
Feb 20 07:11:24 np0005625204.novalocal sshd[19109]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 07:11:24 np0005625204.novalocal python3[19126]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-064b-165c-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:11:27 np0005625204.novalocal sudo[19144]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gotpwaqpomxdqkpwkjqefbqkdkxtagij ; /usr/bin/python3
Feb 20 07:11:27 np0005625204.novalocal sudo[19144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:11:27 np0005625204.novalocal python3[19146]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-064b-165c-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:11:29 np0005625204.novalocal sudo[19144]: pam_unix(sudo:session): session closed for user root
Feb 20 07:11:58 np0005625204.novalocal sudo[19163]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcooxqrtowldccrvodhitqfkphxxowic ; /usr/bin/python3
Feb 20 07:11:58 np0005625204.novalocal sudo[19163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:11:58 np0005625204.novalocal python3[19165]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Feb 20 07:12:01 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:12:01 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:12:25 np0005625204.novalocal sudo[19163]: pam_unix(sudo:session): session closed for user root
Feb 20 07:12:30 np0005625204.novalocal sudo[19320]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eridobzdztjzagnzswgyjbmbtpwwwjlb ; /usr/bin/python3
Feb 20 07:12:30 np0005625204.novalocal sudo[19320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:12:30 np0005625204.novalocal python3[19322]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Feb 20 07:12:33 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:12:35 np0005625204.novalocal sudo[19320]: pam_unix(sudo:session): session closed for user root
Feb 20 07:12:49 np0005625204.novalocal sudo[19460]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcwbpiymthvhnxqirxxlscxyssvckxfw ; /usr/bin/python3
Feb 20 07:12:49 np0005625204.novalocal sudo[19460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:12:50 np0005625204.novalocal python3[19462]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Feb 20 07:12:52 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:12:52 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:12:54 np0005625204.novalocal sshd[19589]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:12:55 np0005625204.novalocal sshd[19589]: Invalid user ali from 151.252.84.225 port 42354
Feb 20 07:12:55 np0005625204.novalocal sshd[19589]: Received disconnect from 151.252.84.225 port 42354:11: Bye Bye [preauth]
Feb 20 07:12:55 np0005625204.novalocal sshd[19589]: Disconnected from invalid user ali 151.252.84.225 port 42354 [preauth]
Feb 20 07:12:57 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:04 np0005625204.novalocal sudo[19460]: pam_unix(sudo:session): session closed for user root
Feb 20 07:13:04 np0005625204.novalocal sshd[19784]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:09 np0005625204.novalocal sshd[19784]: Invalid user 0 from 185.246.128.171 port 62874
Feb 20 07:13:10 np0005625204.novalocal sshd[19784]: Disconnecting invalid user 0 185.246.128.171 port 62874: Change of username or service not allowed: (0,ssh-connection) -> (accounting,ssh-connection) [preauth]
Feb 20 07:13:11 np0005625204.novalocal sshd[19786]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:14 np0005625204.novalocal sshd[19786]: Invalid user accounting from 185.246.128.171 port 24284
Feb 20 07:13:15 np0005625204.novalocal sshd[19786]: Disconnecting invalid user accounting 185.246.128.171 port 24284: Change of username or service not allowed: (accounting,ssh-connection) -> (sara,ssh-connection) [preauth]
Feb 20 07:13:18 np0005625204.novalocal sudo[19801]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdexllfpgncvrmgpfwzmvajcryhnhapk ; /usr/bin/python3
Feb 20 07:13:18 np0005625204.novalocal sudo[19801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:13:18 np0005625204.novalocal python3[19803]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 20 07:13:18 np0005625204.novalocal sshd[19805]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:21 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:21 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:22 np0005625204.novalocal sshd[19805]: Invalid user sara from 185.246.128.171 port 52439
Feb 20 07:13:23 np0005625204.novalocal sshd[19805]: Disconnecting invalid user sara 185.246.128.171 port 52439: Change of username or service not allowed: (sara,ssh-connection) -> (webuser,ssh-connection) [preauth]
Feb 20 07:13:25 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:26 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:26 np0005625204.novalocal sshd[20116]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:27 np0005625204.novalocal sshd[20120]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:28 np0005625204.novalocal sshd[20120]: Invalid user loguser from 202.165.22.246 port 32852
Feb 20 07:13:28 np0005625204.novalocal sshd[20120]: Received disconnect from 202.165.22.246 port 32852:11: Bye Bye [preauth]
Feb 20 07:13:28 np0005625204.novalocal sshd[20120]: Disconnected from invalid user loguser 202.165.22.246 port 32852 [preauth]
Feb 20 07:13:30 np0005625204.novalocal sshd[20116]: Invalid user webuser from 185.246.128.171 port 19035
Feb 20 07:13:30 np0005625204.novalocal sshd[20116]: Disconnecting invalid user webuser 185.246.128.171 port 19035: Change of username or service not allowed: (webuser,ssh-connection) -> (vscode,ssh-connection) [preauth]
Feb 20 07:13:31 np0005625204.novalocal sshd[20189]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:32 np0005625204.novalocal sudo[19801]: pam_unix(sudo:session): session closed for user root
Feb 20 07:13:34 np0005625204.novalocal sshd[20191]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:34 np0005625204.novalocal sshd[20189]: Invalid user vscode from 185.246.128.171 port 40121
Feb 20 07:13:34 np0005625204.novalocal sshd[20189]: Disconnecting invalid user vscode 185.246.128.171 port 40121: Change of username or service not allowed: (vscode,ssh-connection) -> (testing,ssh-connection) [preauth]
Feb 20 07:13:34 np0005625204.novalocal sshd[20191]: Received disconnect from 189.143.72.189 port 50680:11: Bye Bye [preauth]
Feb 20 07:13:34 np0005625204.novalocal sshd[20191]: Disconnected from authenticating user root 189.143.72.189 port 50680 [preauth]
Feb 20 07:13:37 np0005625204.novalocal sshd[20193]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:40 np0005625204.novalocal sshd[20193]: Invalid user testing from 185.246.128.171 port 64081
Feb 20 07:13:41 np0005625204.novalocal sshd[20193]: Disconnecting invalid user testing 185.246.128.171 port 64081: Change of username or service not allowed: (testing,ssh-connection) -> (publicuser,ssh-connection) [preauth]
Feb 20 07:13:45 np0005625204.novalocal sshd[20195]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:46 np0005625204.novalocal sudo[20209]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htfqkvfqiowtqkmgbnxnlhwlnorsodpy ; /usr/bin/python3
Feb 20 07:13:46 np0005625204.novalocal sudo[20209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:13:46 np0005625204.novalocal python3[20211]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 20 07:13:49 np0005625204.novalocal sshd[20195]: Invalid user publicuser from 185.246.128.171 port 30643
Feb 20 07:13:49 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:50 np0005625204.novalocal sshd[20195]: Disconnecting invalid user publicuser 185.246.128.171 port 30643: Change of username or service not allowed: (publicuser,ssh-connection) -> (jay,ssh-connection) [preauth]
Feb 20 07:13:51 np0005625204.novalocal sshd[20397]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:13:54 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:54 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:13:55 np0005625204.novalocal sshd[20397]: Invalid user jay from 185.246.128.171 port 58425
Feb 20 07:13:57 np0005625204.novalocal sshd[20397]: Disconnecting invalid user jay 185.246.128.171 port 58425: Change of username or service not allowed: (jay,ssh-connection) -> (chain,ssh-connection) [preauth]
Feb 20 07:14:00 np0005625204.novalocal sshd[20535]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:01 np0005625204.novalocal sudo[20209]: pam_unix(sudo:session): session closed for user root
Feb 20 07:14:04 np0005625204.novalocal sshd[20535]: Invalid user chain from 185.246.128.171 port 27577
Feb 20 07:14:04 np0005625204.novalocal sudo[20551]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deisnlfktimxgjzvahaiygmdzvejknhd ; /usr/bin/python3
Feb 20 07:14:04 np0005625204.novalocal sudo[20551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:14:04 np0005625204.novalocal python3[20553]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000013-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:14:05 np0005625204.novalocal sshd[20535]: Disconnecting invalid user chain 185.246.128.171 port 27577: Change of username or service not allowed: (chain,ssh-connection) -> (terraform,ssh-connection) [preauth]
Feb 20 07:14:06 np0005625204.novalocal sudo[20551]: pam_unix(sudo:session): session closed for user root
Feb 20 07:14:08 np0005625204.novalocal sshd[20557]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:09 np0005625204.novalocal sshd[20557]: Invalid user terraform from 185.246.128.171 port 58281
Feb 20 07:14:10 np0005625204.novalocal sshd[20557]: Disconnecting invalid user terraform 185.246.128.171 port 58281: Change of username or service not allowed: (terraform,ssh-connection) -> (install,ssh-connection) [preauth]
Feb 20 07:14:13 np0005625204.novalocal sshd[20559]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:19 np0005625204.novalocal sshd[20559]: Invalid user install from 185.246.128.171 port 15385
Feb 20 07:14:20 np0005625204.novalocal sshd[20559]: Disconnecting invalid user install 185.246.128.171 port 15385: Change of username or service not allowed: (install,ssh-connection) -> (siapbot,ssh-connection) [preauth]
Feb 20 07:14:22 np0005625204.novalocal sshd[20561]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:25 np0005625204.novalocal sshd[20561]: Invalid user siapbot from 185.246.128.171 port 52353
Feb 20 07:14:26 np0005625204.novalocal sshd[20561]: Disconnecting invalid user siapbot 185.246.128.171 port 52353: Change of username or service not allowed: (siapbot,ssh-connection) -> (worker,ssh-connection) [preauth]
Feb 20 07:14:28 np0005625204.novalocal sshd[20563]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:32 np0005625204.novalocal sudo[20578]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hptozheodbmpzfxywwxalthozrqidihn ; /usr/bin/python3
Feb 20 07:14:32 np0005625204.novalocal sudo[20578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:14:32 np0005625204.novalocal python3[20580]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:14:32 np0005625204.novalocal sshd[20563]: Invalid user worker from 185.246.128.171 port 12266
Feb 20 07:14:34 np0005625204.novalocal sshd[20563]: Disconnecting invalid user worker 185.246.128.171 port 12266: Change of username or service not allowed: (worker,ssh-connection) -> (ken,ssh-connection) [preauth]
Feb 20 07:14:36 np0005625204.novalocal sshd[20594]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:39 np0005625204.novalocal sshd[20594]: Invalid user ken from 185.246.128.171 port 41588
Feb 20 07:14:39 np0005625204.novalocal sshd[20594]: Disconnecting invalid user ken 185.246.128.171 port 41588: Change of username or service not allowed: (ken,ssh-connection) -> (user3,ssh-connection) [preauth]
Feb 20 07:14:41 np0005625204.novalocal sshd[20635]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:44 np0005625204.novalocal sshd[20635]: Invalid user user3 from 185.246.128.171 port 64187
Feb 20 07:14:46 np0005625204.novalocal sshd[20635]: Disconnecting invalid user user3 185.246.128.171 port 64187: Change of username or service not allowed: (user3,ssh-connection) -> (gestion,ssh-connection) [preauth]
Feb 20 07:14:47 np0005625204.novalocal sshd[20675]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:49 np0005625204.novalocal sshd[20675]: Invalid user gestion from 185.246.128.171 port 23857
Feb 20 07:14:50 np0005625204.novalocal sshd[20675]: Disconnecting invalid user gestion 185.246.128.171 port 23857: Change of username or service not allowed: (gestion,ssh-connection) -> (alberto,ssh-connection) [preauth]
Feb 20 07:14:51 np0005625204.novalocal sshd[20680]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  Converting 486 SID table entries...
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:14:51 np0005625204.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:14:52 np0005625204.novalocal groupadd[20687]: group added to /etc/group: name=unbound, GID=987
Feb 20 07:14:52 np0005625204.novalocal groupadd[20687]: group added to /etc/gshadow: name=unbound
Feb 20 07:14:52 np0005625204.novalocal groupadd[20687]: new group: name=unbound, GID=987
Feb 20 07:14:52 np0005625204.novalocal useradd[20694]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Feb 20 07:14:52 np0005625204.novalocal dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 20 07:14:52 np0005625204.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 20 07:14:52 np0005625204.novalocal groupadd[20707]: group added to /etc/group: name=openvswitch, GID=986
Feb 20 07:14:52 np0005625204.novalocal groupadd[20707]: group added to /etc/gshadow: name=openvswitch
Feb 20 07:14:52 np0005625204.novalocal groupadd[20707]: new group: name=openvswitch, GID=986
Feb 20 07:14:52 np0005625204.novalocal useradd[20714]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Feb 20 07:14:52 np0005625204.novalocal groupadd[20722]: group added to /etc/group: name=hugetlbfs, GID=985
Feb 20 07:14:52 np0005625204.novalocal groupadd[20722]: group added to /etc/gshadow: name=hugetlbfs
Feb 20 07:14:53 np0005625204.novalocal groupadd[20722]: new group: name=hugetlbfs, GID=985
Feb 20 07:14:53 np0005625204.novalocal usermod[20730]: add 'openvswitch' to group 'hugetlbfs'
Feb 20 07:14:53 np0005625204.novalocal usermod[20730]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 20 07:14:53 np0005625204.novalocal sshd[20680]: Invalid user alberto from 185.246.128.171 port 39362
Feb 20 07:14:53 np0005625204.novalocal sshd[20680]: Disconnecting invalid user alberto 185.246.128.171 port 39362: Change of username or service not allowed: (alberto,ssh-connection) -> (ali,ssh-connection) [preauth]
Feb 20 07:14:55 np0005625204.novalocal sshd[21011]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:14:55 np0005625204.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:14:55 np0005625204.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 20 07:14:55 np0005625204.novalocal systemd[1]: Reloading.
Feb 20 07:14:55 np0005625204.novalocal systemd-rc-local-generator[21240]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:14:55 np0005625204.novalocal systemd-sysv-generator[21245]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:14:56 np0005625204.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:14:56 np0005625204.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 07:14:56 np0005625204.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 07:14:56 np0005625204.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 20 07:14:56 np0005625204.novalocal systemd[1]: run-rb9ad198008414985965620ce5def172c.service: Deactivated successfully.
Feb 20 07:14:57 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:14:57 np0005625204.novalocal sudo[20578]: pam_unix(sudo:session): session closed for user root
Feb 20 07:14:57 np0005625204.novalocal rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:14:57 np0005625204.novalocal sshd[21011]: Invalid user ali from 185.246.128.171 port 55486
Feb 20 07:14:58 np0005625204.novalocal sshd[21011]: Disconnecting invalid user ali 185.246.128.171 port 55486: Change of username or service not allowed: (ali,ssh-connection) -> (linuxadmin,ssh-connection) [preauth]
Feb 20 07:15:01 np0005625204.novalocal anacron[19093]: Job `cron.daily' started
Feb 20 07:15:01 np0005625204.novalocal anacron[19093]: Job `cron.daily' terminated
Feb 20 07:15:02 np0005625204.novalocal sshd[21848]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:04 np0005625204.novalocal sshd[21848]: Invalid user linuxadmin from 185.246.128.171 port 17793
Feb 20 07:15:05 np0005625204.novalocal sshd[21848]: Disconnecting invalid user linuxadmin 185.246.128.171 port 17793: Change of username or service not allowed: (linuxadmin,ssh-connection) -> (andre,ssh-connection) [preauth]
Feb 20 07:15:06 np0005625204.novalocal sshd[21850]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:10 np0005625204.novalocal sshd[21850]: Invalid user andre from 185.246.128.171 port 35434
Feb 20 07:15:11 np0005625204.novalocal sshd[21850]: Disconnecting invalid user andre 185.246.128.171 port 35434: Change of username or service not allowed: (andre,ssh-connection) -> (wade,ssh-connection) [preauth]
Feb 20 07:15:11 np0005625204.novalocal sudo[21865]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdwhhpltgyvywumfasxepkmoqvvbooyi ; /usr/bin/python3
Feb 20 07:15:11 np0005625204.novalocal sudo[21865]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:12 np0005625204.novalocal python3[21867]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000015-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:15:12 np0005625204.novalocal sshd[21871]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:14 np0005625204.novalocal sshd[21871]: Invalid user wade from 185.246.128.171 port 60477
Feb 20 07:15:14 np0005625204.novalocal sshd[21871]: Disconnecting invalid user wade 185.246.128.171 port 60477: Change of username or service not allowed: (wade,ssh-connection) -> (tst,ssh-connection) [preauth]
Feb 20 07:15:16 np0005625204.novalocal sshd[21873]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:20 np0005625204.novalocal sshd[21873]: Invalid user tst from 185.246.128.171 port 11821
Feb 20 07:15:20 np0005625204.novalocal sshd[21873]: Disconnecting invalid user tst 185.246.128.171 port 11821: Change of username or service not allowed: (tst,ssh-connection) -> (zhongwen,ssh-connection) [preauth]
Feb 20 07:15:23 np0005625204.novalocal sshd[21875]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:28 np0005625204.novalocal sshd[21875]: Invalid user zhongwen from 185.246.128.171 port 38791
Feb 20 07:15:29 np0005625204.novalocal sshd[21875]: Disconnecting invalid user zhongwen 185.246.128.171 port 38791: Change of username or service not allowed: (zhongwen,ssh-connection) -> (123,ssh-connection) [preauth]
Feb 20 07:15:31 np0005625204.novalocal sudo[21865]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:32 np0005625204.novalocal sshd[21878]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:34 np0005625204.novalocal sudo[21893]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfzgrwdketxptyhvoymvgnxozdscggeu ; /usr/bin/python3
Feb 20 07:15:34 np0005625204.novalocal sudo[21893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:34 np0005625204.novalocal python3[21895]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:15:34 np0005625204.novalocal sudo[21893]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:35 np0005625204.novalocal sshd[21878]: Invalid user 123 from 185.246.128.171 port 8637
Feb 20 07:15:36 np0005625204.novalocal sudo[21941]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-earzuvepzpximgttcshvxmypzbmfcjuk ; /usr/bin/python3
Feb 20 07:15:36 np0005625204.novalocal sudo[21941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:36 np0005625204.novalocal python3[21943]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:15:36 np0005625204.novalocal sudo[21941]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:36 np0005625204.novalocal sudo[21984]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htbeghcwzumelemtzagpfccmfkwinnmj ; /usr/bin/python3
Feb 20 07:15:36 np0005625204.novalocal sudo[21984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:36 np0005625204.novalocal sshd[21878]: Disconnecting invalid user 123 185.246.128.171 port 8637: Change of username or service not allowed: (123,ssh-connection) -> (USERID,ssh-connection) [preauth]
Feb 20 07:15:36 np0005625204.novalocal python3[21986]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771571735.942986-332-69464575789040/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=91bc45728dd9738fc644e3ada9d8642294da29ff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:15:36 np0005625204.novalocal sudo[21984]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:37 np0005625204.novalocal sshd[22001]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:38 np0005625204.novalocal sudo[22016]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nangsedndcazvngfmoelunylmwknzwcu ; /usr/bin/python3
Feb 20 07:15:38 np0005625204.novalocal sudo[22016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:38 np0005625204.novalocal sshd[22001]: Invalid user nutanix from 151.252.84.225 port 51168
Feb 20 07:15:38 np0005625204.novalocal python3[22018]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 20 07:15:38 np0005625204.novalocal sudo[22016]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:38 np0005625204.novalocal systemd-journald[618]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Feb 20 07:15:38 np0005625204.novalocal systemd-journald[618]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 07:15:38 np0005625204.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 07:15:38 np0005625204.novalocal rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 07:15:38 np0005625204.novalocal sshd[22001]: Received disconnect from 151.252.84.225 port 51168:11: Bye Bye [preauth]
Feb 20 07:15:38 np0005625204.novalocal sshd[22001]: Disconnected from invalid user nutanix 151.252.84.225 port 51168 [preauth]
Feb 20 07:15:38 np0005625204.novalocal sudo[22037]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuawkymkeoevymohleuomtwpxprpcskm ; /usr/bin/python3
Feb 20 07:15:38 np0005625204.novalocal sudo[22037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:38 np0005625204.novalocal python3[22039]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 20 07:15:38 np0005625204.novalocal sudo[22037]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:38 np0005625204.novalocal sshd[22055]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:39 np0005625204.novalocal sudo[22058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxtmzxcqacmmfyqquvsgrumobrbkjbvi ; /usr/bin/python3
Feb 20 07:15:39 np0005625204.novalocal sudo[22058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:39 np0005625204.novalocal python3[22060]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 20 07:15:39 np0005625204.novalocal sudo[22058]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:39 np0005625204.novalocal sudo[22079]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbqnsdslyyxbsohnluetxyeoeltzuyxm ; /usr/bin/python3
Feb 20 07:15:39 np0005625204.novalocal sudo[22079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:39 np0005625204.novalocal python3[22081]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 20 07:15:39 np0005625204.novalocal sudo[22079]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:39 np0005625204.novalocal sudo[22099]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aomffzfxkqnwipucwpcdpodjrwpqwomr ; /usr/bin/python3
Feb 20 07:15:39 np0005625204.novalocal sudo[22099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:39 np0005625204.novalocal python3[22101]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 20 07:15:39 np0005625204.novalocal sudo[22099]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:39 np0005625204.novalocal sshd[22055]: Invalid user USERID from 185.246.128.171 port 35360
Feb 20 07:15:40 np0005625204.novalocal sshd[22055]: Disconnecting invalid user USERID 185.246.128.171 port 35360: Change of username or service not allowed: (USERID,ssh-connection) -> (ftpuser1,ssh-connection) [preauth]
Feb 20 07:15:41 np0005625204.novalocal sshd[22106]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:42 np0005625204.novalocal sudo[22121]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omcwlrqujeyvowsnuujbayrfrsemjpee ; /usr/bin/python3
Feb 20 07:15:42 np0005625204.novalocal sudo[22121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:42 np0005625204.novalocal python3[22123]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:15:42 np0005625204.novalocal sshd[22106]: Invalid user ftpuser1 from 185.246.128.171 port 44917
Feb 20 07:15:42 np0005625204.novalocal sshd[22106]: Disconnecting invalid user ftpuser1 185.246.128.171 port 44917: Change of username or service not allowed: (ftpuser1,ssh-connection) -> (ftp_inst,ssh-connection) [preauth]
Feb 20 07:15:43 np0005625204.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Feb 20 07:15:43 np0005625204.novalocal network[22126]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 07:15:43 np0005625204.novalocal network[22137]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 07:15:43 np0005625204.novalocal network[22126]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Feb 20 07:15:43 np0005625204.novalocal network[22138]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:15:43 np0005625204.novalocal network[22126]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 07:15:43 np0005625204.novalocal network[22139]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 07:15:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571743.5758] audit: op="connections-reload" pid=22167 uid=0 result="success"
Feb 20 07:15:43 np0005625204.novalocal network[22126]: Bringing up loopback interface:  [  OK  ]
Feb 20 07:15:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571743.7763] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22255 uid=0 result="success"
Feb 20 07:15:43 np0005625204.novalocal network[22126]: Bringing up interface eth0:  [  OK  ]
Feb 20 07:15:43 np0005625204.novalocal systemd[1]: Started LSB: Bring up/down networking.
Feb 20 07:15:43 np0005625204.novalocal sudo[22121]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:43 np0005625204.novalocal sudo[22294]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vixensdqadutqqxbnrikvypkepfoderf ; /usr/bin/python3
Feb 20 07:15:43 np0005625204.novalocal sudo[22294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:15:44 np0005625204.novalocal python3[22296]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:15:44 np0005625204.novalocal sshd[22298]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Feb 20 07:15:44 np0005625204.novalocal chown[22301]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22306]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22306]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22306]: Starting ovsdb-server [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal ovs-vsctl[22356]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 20 07:15:44 np0005625204.novalocal ovs-vsctl[22376]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"e6b84e4d-7dff-4c2c-96db-c41e3ef520c6\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22306]: Configuring Open vSwitch system IDs [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal ovs-vsctl[22382]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005625204.novalocal
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22306]: Enabling remote OVSDB managers [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Started Open vSwitch Database Unit.
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 20 07:15:44 np0005625204.novalocal kernel: openvswitch: Open vSwitch switching datapath
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22426]: Inserting openvswitch module [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22395]: Starting ovs-vswitchd [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal ovs-vsctl[22445]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005625204.novalocal
Feb 20 07:15:44 np0005625204.novalocal ovs-ctl[22395]: Enabling remote OVSDB managers [  OK  ]
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Starting Open vSwitch...
Feb 20 07:15:44 np0005625204.novalocal systemd[1]: Finished Open vSwitch.
Feb 20 07:15:44 np0005625204.novalocal sudo[22294]: pam_unix(sudo:session): session closed for user root
Feb 20 07:15:46 np0005625204.novalocal sshd[22298]: Invalid user ftp_inst from 185.246.128.171 port 56148
Feb 20 07:15:46 np0005625204.novalocal sshd[22298]: Disconnecting invalid user ftp_inst 185.246.128.171 port 56148: Change of username or service not allowed: (ftp_inst,ssh-connection) -> (test,ssh-connection) [preauth]
Feb 20 07:15:47 np0005625204.novalocal sshd[22449]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:50 np0005625204.novalocal sshd[22449]: Invalid user test from 185.246.128.171 port 6139
Feb 20 07:15:57 np0005625204.novalocal sshd[22449]: error: maximum authentication attempts exceeded for invalid user test from 185.246.128.171 port 6139 ssh2 [preauth]
Feb 20 07:15:57 np0005625204.novalocal sshd[22449]: Disconnecting invalid user test 185.246.128.171 port 6139: Too many authentication failures [preauth]
Feb 20 07:15:58 np0005625204.novalocal sshd[22451]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:15:59 np0005625204.novalocal sshd[22451]: Invalid user test from 185.246.128.171 port 48984
Feb 20 07:16:02 np0005625204.novalocal sshd[22451]: error: maximum authentication attempts exceeded for invalid user test from 185.246.128.171 port 48984 ssh2 [preauth]
Feb 20 07:16:02 np0005625204.novalocal sshd[22451]: Disconnecting invalid user test 185.246.128.171 port 48984: Too many authentication failures [preauth]
Feb 20 07:16:03 np0005625204.novalocal sshd[22453]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:16:05 np0005625204.novalocal sshd[22453]: Invalid user test from 185.246.128.171 port 4622
Feb 20 07:16:08 np0005625204.novalocal sshd[22453]: error: maximum authentication attempts exceeded for invalid user test from 185.246.128.171 port 4622 ssh2 [preauth]
Feb 20 07:16:08 np0005625204.novalocal sshd[22453]: Disconnecting invalid user test 185.246.128.171 port 4622: Too many authentication failures [preauth]
Feb 20 07:16:11 np0005625204.novalocal sshd[22455]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:16:14 np0005625204.novalocal sshd[22455]: Invalid user test from 185.246.128.171 port 33671
Feb 20 07:16:15 np0005625204.novalocal sudo[22470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-braylhsfqpowgovokdqabinwdfpeziaz ; /usr/bin/python3
Feb 20 07:16:15 np0005625204.novalocal sudo[22470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:16:15 np0005625204.novalocal python3[22472]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-00000000001a-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:16:16 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571776.3677] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22669 uid=0 result="success"
Feb 20 07:16:16 np0005625204.novalocal ifup[22670]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:16 np0005625204.novalocal ifup[22671]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:16 np0005625204.novalocal ifup[22672]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:16 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571776.4002] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22678 uid=0 result="success"
Feb 20 07:16:16 np0005625204.novalocal ovs-vsctl[22680]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:ba:18:b1 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Feb 20 07:16:16 np0005625204.novalocal kernel: device ovs-system entered promiscuous mode
Feb 20 07:16:16 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571776.4262] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Feb 20 07:16:16 np0005625204.novalocal systemd-udevd[22681]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:16 np0005625204.novalocal kernel: Timeout policy base is empty
Feb 20 07:16:16 np0005625204.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Feb 20 07:16:16 np0005625204.novalocal kernel: device br-ex entered promiscuous mode
Feb 20 07:16:16 np0005625204.novalocal systemd-udevd[22696]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:16 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571776.4715] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Feb 20 07:16:16 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571776.5021] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22706 uid=0 result="success"
Feb 20 07:16:16 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571776.5230] device (br-ex): carrier: link connected
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.5795] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22735 uid=0 result="success"
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.6278] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22750 uid=0 result="success"
Feb 20 07:16:19 np0005625204.novalocal NET[22775]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.7095] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.7148] dhcp4 (eth1): canceled DHCP transaction
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.7148] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.7149] dhcp4 (eth1): state changed no lease
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.7188] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22784 uid=0 result="success"
Feb 20 07:16:19 np0005625204.novalocal ifup[22785]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:19 np0005625204.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 07:16:19 np0005625204.novalocal ifup[22786]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:19 np0005625204.novalocal ifup[22788]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:19 np0005625204.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.7550] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22801 uid=0 result="success"
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.8046] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22812 uid=0 result="success"
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.8139] device (eth1): carrier: link connected
Feb 20 07:16:19 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571779.8346] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22821 uid=0 result="success"
Feb 20 07:16:19 np0005625204.novalocal ipv6_wait_tentative[22833]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 20 07:16:20 np0005625204.novalocal ipv6_wait_tentative[22838]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 20 07:16:21 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571781.9060] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22847 uid=0 result="success"
Feb 20 07:16:21 np0005625204.novalocal ovs-vsctl[22862]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Feb 20 07:16:21 np0005625204.novalocal kernel: device eth1 entered promiscuous mode
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.0041] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22870 uid=0 result="success"
Feb 20 07:16:22 np0005625204.novalocal ifup[22871]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:22 np0005625204.novalocal ifup[22872]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:22 np0005625204.novalocal ifup[22873]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.0353] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22879 uid=0 result="success"
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.0769] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22889 uid=0 result="success"
Feb 20 07:16:22 np0005625204.novalocal ifup[22890]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:22 np0005625204.novalocal ifup[22891]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:22 np0005625204.novalocal ifup[22892]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.1071] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22898 uid=0 result="success"
Feb 20 07:16:22 np0005625204.novalocal ovs-vsctl[22901]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 20 07:16:22 np0005625204.novalocal kernel: device vlan21 entered promiscuous mode
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.1473] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Feb 20 07:16:22 np0005625204.novalocal systemd-udevd[22903]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.1670] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22912 uid=0 result="success"
Feb 20 07:16:22 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571782.1863] device (vlan21): carrier: link connected
Feb 20 07:16:22 np0005625204.novalocal sshd[22455]: error: maximum authentication attempts exceeded for invalid user test from 185.246.128.171 port 33671 ssh2 [preauth]
Feb 20 07:16:22 np0005625204.novalocal sshd[22455]: Disconnecting invalid user test 185.246.128.171 port 33671: Too many authentication failures [preauth]
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.2467] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22941 uid=0 result="success"
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.2918] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=22956 uid=0 result="success"
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.3499] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22977 uid=0 result="success"
Feb 20 07:16:25 np0005625204.novalocal ifup[22978]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:25 np0005625204.novalocal ifup[22979]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:25 np0005625204.novalocal ifup[22980]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.3826] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22986 uid=0 result="success"
Feb 20 07:16:25 np0005625204.novalocal ovs-vsctl[22989]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 20 07:16:25 np0005625204.novalocal kernel: device vlan44 entered promiscuous mode
Feb 20 07:16:25 np0005625204.novalocal systemd-udevd[22991]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.4231] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.4491] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23001 uid=0 result="success"
Feb 20 07:16:25 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571785.4731] device (vlan44): carrier: link connected
Feb 20 07:16:27 np0005625204.novalocal sshd[23022]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.5345] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23033 uid=0 result="success"
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.5802] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23048 uid=0 result="success"
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.6401] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23069 uid=0 result="success"
Feb 20 07:16:28 np0005625204.novalocal ifup[23070]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:28 np0005625204.novalocal ifup[23071]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:28 np0005625204.novalocal ifup[23072]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.6719] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23078 uid=0 result="success"
Feb 20 07:16:28 np0005625204.novalocal ovs-vsctl[23081]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 20 07:16:28 np0005625204.novalocal kernel: device vlan20 entered promiscuous mode
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.7132] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Feb 20 07:16:28 np0005625204.novalocal systemd-udevd[23083]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.7409] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23093 uid=0 result="success"
Feb 20 07:16:28 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571788.7619] device (vlan20): carrier: link connected
Feb 20 07:16:29 np0005625204.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 07:16:31 np0005625204.novalocal sshd[23022]: Invalid user test from 185.246.128.171 port 36357
Feb 20 07:16:31 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571791.8210] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23123 uid=0 result="success"
Feb 20 07:16:31 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571791.8693] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23138 uid=0 result="success"
Feb 20 07:16:31 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571791.9266] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23159 uid=0 result="success"
Feb 20 07:16:31 np0005625204.novalocal ifup[23160]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:31 np0005625204.novalocal ifup[23161]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:31 np0005625204.novalocal ifup[23162]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:31 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571791.9584] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23168 uid=0 result="success"
Feb 20 07:16:32 np0005625204.novalocal ovs-vsctl[23171]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 20 07:16:32 np0005625204.novalocal kernel: device vlan22 entered promiscuous mode
Feb 20 07:16:32 np0005625204.novalocal systemd-udevd[23173]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:32 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571792.0446] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Feb 20 07:16:32 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571792.0705] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23183 uid=0 result="success"
Feb 20 07:16:32 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571792.0902] device (vlan22): carrier: link connected
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.1387] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23213 uid=0 result="success"
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.1840] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23228 uid=0 result="success"
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.2380] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23249 uid=0 result="success"
Feb 20 07:16:35 np0005625204.novalocal ifup[23250]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:35 np0005625204.novalocal ifup[23251]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:35 np0005625204.novalocal ifup[23252]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.2632] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23258 uid=0 result="success"
Feb 20 07:16:35 np0005625204.novalocal ovs-vsctl[23261]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 20 07:16:35 np0005625204.novalocal systemd-udevd[23263]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 07:16:35 np0005625204.novalocal kernel: device vlan23 entered promiscuous mode
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.3027] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.3273] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23273 uid=0 result="success"
Feb 20 07:16:35 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571795.3477] device (vlan23): carrier: link connected
Feb 20 07:16:38 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571798.4005] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23303 uid=0 result="success"
Feb 20 07:16:38 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571798.4453] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23318 uid=0 result="success"
Feb 20 07:16:38 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571798.5049] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23339 uid=0 result="success"
Feb 20 07:16:38 np0005625204.novalocal ifup[23340]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:38 np0005625204.novalocal ifup[23341]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:38 np0005625204.novalocal ifup[23342]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:38 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571798.5355] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23348 uid=0 result="success"
Feb 20 07:16:38 np0005625204.novalocal ovs-vsctl[23351]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 20 07:16:38 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571798.5915] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23358 uid=0 result="success"
Feb 20 07:16:38 np0005625204.novalocal sshd[23022]: error: maximum authentication attempts exceeded for invalid user test from 185.246.128.171 port 36357 ssh2 [preauth]
Feb 20 07:16:38 np0005625204.novalocal sshd[23022]: Disconnecting invalid user test 185.246.128.171 port 36357: Too many authentication failures [preauth]
Feb 20 07:16:39 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571799.6741] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23385 uid=0 result="success"
Feb 20 07:16:39 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571799.7196] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23400 uid=0 result="success"
Feb 20 07:16:39 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571799.7682] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23421 uid=0 result="success"
Feb 20 07:16:39 np0005625204.novalocal ifup[23422]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:39 np0005625204.novalocal ifup[23423]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:39 np0005625204.novalocal ifup[23424]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:39 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571799.7965] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23430 uid=0 result="success"
Feb 20 07:16:39 np0005625204.novalocal ovs-vsctl[23433]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 20 07:16:39 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571799.8493] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23440 uid=0 result="success"
Feb 20 07:16:40 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571800.9080] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23468 uid=0 result="success"
Feb 20 07:16:40 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571800.9504] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23483 uid=0 result="success"
Feb 20 07:16:41 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571801.0095] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23504 uid=0 result="success"
Feb 20 07:16:41 np0005625204.novalocal ifup[23505]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:41 np0005625204.novalocal ifup[23506]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:41 np0005625204.novalocal ifup[23507]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:41 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571801.0408] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23513 uid=0 result="success"
Feb 20 07:16:41 np0005625204.novalocal ovs-vsctl[23516]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 20 07:16:41 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571801.0989] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23523 uid=0 result="success"
Feb 20 07:16:41 np0005625204.novalocal sshd[23542]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:16:42 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571802.1596] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23552 uid=0 result="success"
Feb 20 07:16:42 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571802.2088] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23568 uid=0 result="success"
Feb 20 07:16:42 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571802.2703] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23589 uid=0 result="success"
Feb 20 07:16:42 np0005625204.novalocal ifup[23590]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:42 np0005625204.novalocal ifup[23591]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:42 np0005625204.novalocal ifup[23592]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:42 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571802.3019] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23598 uid=0 result="success"
Feb 20 07:16:42 np0005625204.novalocal ovs-vsctl[23601]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 20 07:16:42 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571802.3586] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23608 uid=0 result="success"
Feb 20 07:16:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571803.4158] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23636 uid=0 result="success"
Feb 20 07:16:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571803.4631] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23651 uid=0 result="success"
Feb 20 07:16:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571803.5242] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23672 uid=0 result="success"
Feb 20 07:16:43 np0005625204.novalocal ifup[23673]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 20 07:16:43 np0005625204.novalocal ifup[23674]: 'network-scripts' will be removed from distribution in near future.
Feb 20 07:16:43 np0005625204.novalocal ifup[23675]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 20 07:16:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571803.5554] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23681 uid=0 result="success"
Feb 20 07:16:43 np0005625204.novalocal ovs-vsctl[23684]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 20 07:16:43 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571803.6123] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23691 uid=0 result="success"
Feb 20 07:16:44 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571804.7144] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23719 uid=0 result="success"
Feb 20 07:16:44 np0005625204.novalocal NetworkManager[5988]: <info>  [1771571804.7610] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23734 uid=0 result="success"
Feb 20 07:16:44 np0005625204.novalocal sudo[22470]: pam_unix(sudo:session): session closed for user root
Feb 20 07:16:45 np0005625204.novalocal sshd[23542]: Invalid user test from 185.246.128.171 port 29188
Feb 20 07:16:48 np0005625204.novalocal sshd[23542]: Disconnecting invalid user test 185.246.128.171 port 29188: Change of username or service not allowed: (test,ssh-connection) -> (odoo17,ssh-connection) [preauth]
Feb 20 07:16:50 np0005625204.novalocal sshd[23752]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:16:54 np0005625204.novalocal sshd[23752]: Invalid user odoo17 from 185.246.128.171 port 62506
Feb 20 07:16:54 np0005625204.novalocal sshd[23752]: Disconnecting invalid user odoo17 185.246.128.171 port 62506: Change of username or service not allowed: (odoo17,ssh-connection) -> (user1,ssh-connection) [preauth]
Feb 20 07:16:58 np0005625204.novalocal sshd[23754]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:03 np0005625204.novalocal sshd[23754]: Invalid user user1 from 185.246.128.171 port 34623
Feb 20 07:17:08 np0005625204.novalocal sshd[23754]: error: maximum authentication attempts exceeded for invalid user user1 from 185.246.128.171 port 34623 ssh2 [preauth]
Feb 20 07:17:08 np0005625204.novalocal sshd[23754]: Disconnecting invalid user user1 185.246.128.171 port 34623: Too many authentication failures [preauth]
Feb 20 07:17:09 np0005625204.novalocal sshd[23757]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:09 np0005625204.novalocal sshd[23759]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:10 np0005625204.novalocal python3[23774]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-00000000001b-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:17:10 np0005625204.novalocal sshd[23759]: Invalid user httpd from 202.165.22.246 port 39514
Feb 20 07:17:11 np0005625204.novalocal sshd[23759]: Received disconnect from 202.165.22.246 port 39514:11: Bye Bye [preauth]
Feb 20 07:17:11 np0005625204.novalocal sshd[23759]: Disconnected from invalid user httpd 202.165.22.246 port 39514 [preauth]
Feb 20 07:17:11 np0005625204.novalocal sshd[23757]: Invalid user user1 from 185.246.128.171 port 14297
Feb 20 07:17:15 np0005625204.novalocal python3[23793]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 07:17:15 np0005625204.novalocal sudo[23807]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdqytaqkffcujtotjmbjbqztbtmsykkz ; /usr/bin/python3
Feb 20 07:17:15 np0005625204.novalocal sudo[23807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:17:15 np0005625204.novalocal python3[23809]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 07:17:15 np0005625204.novalocal sudo[23807]: pam_unix(sudo:session): session closed for user root
Feb 20 07:17:16 np0005625204.novalocal python3[23823]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 07:17:17 np0005625204.novalocal sudo[23837]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybahskxqyycqeruyjwyuwmmxfnzlzcfk ; /usr/bin/python3
Feb 20 07:17:17 np0005625204.novalocal sudo[23837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:17:17 np0005625204.novalocal python3[23839]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 20 07:17:17 np0005625204.novalocal sudo[23837]: pam_unix(sudo:session): session closed for user root
Feb 20 07:17:17 np0005625204.novalocal python3[23853]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Feb 20 07:17:18 np0005625204.novalocal python3[23868]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005625204.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000022-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:17:19 np0005625204.novalocal sudo[23886]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlssmxgubfkgwexuumzzvenvtobljpta ; /usr/bin/python3
Feb 20 07:17:19 np0005625204.novalocal sudo[23886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:17:19 np0005625204.novalocal python3[23888]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-064b-165c-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:17:19 np0005625204.novalocal systemd[1]: Starting Hostname Service...
Feb 20 07:17:19 np0005625204.novalocal systemd[1]: Started Hostname Service.
Feb 20 07:17:19 np0005625204.localdomain systemd-hostnamed[23892]: Hostname set to <np0005625204.localdomain> (static)
Feb 20 07:17:19 np0005625204.localdomain NetworkManager[5988]: <info>  [1771571839.7481] hostname: static hostname changed from "np0005625204.novalocal" to "np0005625204.localdomain"
Feb 20 07:17:19 np0005625204.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 20 07:17:19 np0005625204.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 20 07:17:19 np0005625204.localdomain sudo[23886]: pam_unix(sudo:session): session closed for user root
Feb 20 07:17:20 np0005625204.localdomain sshd[23757]: error: maximum authentication attempts exceeded for invalid user user1 from 185.246.128.171 port 14297 ssh2 [preauth]
Feb 20 07:17:20 np0005625204.localdomain sshd[23757]: Disconnecting invalid user user1 185.246.128.171 port 14297: Too many authentication failures [preauth]
Feb 20 07:17:20 np0005625204.localdomain sshd[19109]: pam_unix(sshd:session): session closed for user zuul
Feb 20 07:17:20 np0005625204.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Feb 20 07:17:20 np0005625204.localdomain systemd[1]: session-11.scope: Consumed 1min 44.180s CPU time.
Feb 20 07:17:20 np0005625204.localdomain systemd-logind[759]: Session 11 logged out. Waiting for processes to exit.
Feb 20 07:17:20 np0005625204.localdomain systemd-logind[759]: Removed session 11.
Feb 20 07:17:21 np0005625204.localdomain sshd[23903]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:22 np0005625204.localdomain sshd[23903]: Invalid user user1 from 185.246.128.171 port 62843
Feb 20 07:17:23 np0005625204.localdomain sshd[23905]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:24 np0005625204.localdomain sshd[23905]: Accepted publickey for zuul from 38.102.83.114 port 51816 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:17:24 np0005625204.localdomain systemd-logind[759]: New session 12 of user zuul.
Feb 20 07:17:24 np0005625204.localdomain systemd[1]: Started Session 12 of User zuul.
Feb 20 07:17:24 np0005625204.localdomain sshd[23905]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 07:17:24 np0005625204.localdomain python3[23922]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 20 07:17:25 np0005625204.localdomain sshd[23905]: pam_unix(sshd:session): session closed for user zuul
Feb 20 07:17:25 np0005625204.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Feb 20 07:17:25 np0005625204.localdomain systemd-logind[759]: Session 12 logged out. Waiting for processes to exit.
Feb 20 07:17:25 np0005625204.localdomain systemd-logind[759]: Removed session 12.
Feb 20 07:17:26 np0005625204.localdomain sshd[23903]: error: maximum authentication attempts exceeded for invalid user user1 from 185.246.128.171 port 62843 ssh2 [preauth]
Feb 20 07:17:26 np0005625204.localdomain sshd[23903]: Disconnecting invalid user user1 185.246.128.171 port 62843: Too many authentication failures [preauth]
Feb 20 07:17:29 np0005625204.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 20 07:17:31 np0005625204.localdomain sshd[23923]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:35 np0005625204.localdomain sshd[23923]: Invalid user user1 from 185.246.128.171 port 43717
Feb 20 07:17:36 np0005625204.localdomain sshd[23923]: Disconnecting invalid user user1 185.246.128.171 port 43717: Change of username or service not allowed: (user1,ssh-connection) -> (prod,ssh-connection) [preauth]
Feb 20 07:17:37 np0005625204.localdomain sshd[23925]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:37 np0005625204.localdomain sshd[23925]: Received disconnect from 189.143.72.189 port 33914:11: Bye Bye [preauth]
Feb 20 07:17:37 np0005625204.localdomain sshd[23925]: Disconnected from authenticating user root 189.143.72.189 port 33914 [preauth]
Feb 20 07:17:38 np0005625204.localdomain sshd[23927]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:41 np0005625204.localdomain sshd[23927]: Invalid user prod from 185.246.128.171 port 8314
Feb 20 07:17:42 np0005625204.localdomain sshd[23927]: Disconnecting invalid user prod 185.246.128.171 port 8314: Change of username or service not allowed: (prod,ssh-connection) -> (liqing,ssh-connection) [preauth]
Feb 20 07:17:46 np0005625204.localdomain sshd[23929]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:49 np0005625204.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 07:17:50 np0005625204.localdomain sshd[23929]: Invalid user liqing from 185.246.128.171 port 43490
Feb 20 07:17:50 np0005625204.localdomain sshd[23929]: Disconnecting invalid user liqing 185.246.128.171 port 43490: Change of username or service not allowed: (liqing,ssh-connection) -> (wang,ssh-connection) [preauth]
Feb 20 07:17:53 np0005625204.localdomain sshd[23934]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:17:57 np0005625204.localdomain sshd[23934]: Invalid user wang from 185.246.128.171 port 9725
Feb 20 07:17:58 np0005625204.localdomain sshd[23934]: Disconnecting invalid user wang 185.246.128.171 port 9725: Change of username or service not allowed: (wang,ssh-connection) -> (craft,ssh-connection) [preauth]
Feb 20 07:18:03 np0005625204.localdomain sshd[23936]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:07 np0005625204.localdomain sshd[23936]: Invalid user craft from 185.246.128.171 port 52715
Feb 20 07:18:07 np0005625204.localdomain sshd[23936]: Disconnecting invalid user craft 185.246.128.171 port 52715: Change of username or service not allowed: (craft,ssh-connection) -> (plex,ssh-connection) [preauth]
Feb 20 07:18:10 np0005625204.localdomain sshd[23938]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:15 np0005625204.localdomain sshd[23940]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:16 np0005625204.localdomain sshd[23940]: Accepted publickey for zuul from 38.102.83.114 port 37732 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:18:16 np0005625204.localdomain systemd-logind[759]: New session 13 of user zuul.
Feb 20 07:18:16 np0005625204.localdomain systemd[1]: Started Session 13 of User zuul.
Feb 20 07:18:16 np0005625204.localdomain sshd[23940]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 07:18:16 np0005625204.localdomain sudo[23957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exvtjmndesxupkgdbvvefwlqppqgxtpq ; /usr/bin/python3
Feb 20 07:18:16 np0005625204.localdomain sudo[23957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:18:16 np0005625204.localdomain python3[23959]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:18:17 np0005625204.localdomain sshd[23938]: Invalid user plex from 185.246.128.171 port 18679
Feb 20 07:18:18 np0005625204.localdomain sshd[23938]: Disconnecting invalid user plex 185.246.128.171 port 18679: Change of username or service not allowed: (plex,ssh-connection) -> (mika,ssh-connection) [preauth]
Feb 20 07:18:19 np0005625204.localdomain sshd[23974]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:19 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:18:20 np0005625204.localdomain systemd-rc-local-generator[23999]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:18:20 np0005625204.localdomain systemd-sysv-generator[24002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:18:20 np0005625204.localdomain systemd-rc-local-generator[24040]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:18:20 np0005625204.localdomain systemd-sysv-generator[24043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:18:20 np0005625204.localdomain systemd-rc-local-generator[24084]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:18:20 np0005625204.localdomain systemd-sysv-generator[24087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Starting dnf makecache...
Feb 20 07:18:20 np0005625204.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:18:21 np0005625204.localdomain dnf[24095]: Updating Subscription Management repositories.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:18:21 np0005625204.localdomain systemd-rc-local-generator[24134]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:18:21 np0005625204.localdomain systemd-sysv-generator[24141]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: run-r187169f6d8f44215b922a68d2dedb85d.service: Deactivated successfully.
Feb 20 07:18:21 np0005625204.localdomain systemd[1]: run-r1c752a7949614ed7b4fc52d4bf634731.service: Deactivated successfully.
Feb 20 07:18:22 np0005625204.localdomain sudo[23957]: pam_unix(sudo:session): session closed for user root
Feb 20 07:18:22 np0005625204.localdomain dnf[24095]: Failed determining last makecache time.
Feb 20 07:18:23 np0005625204.localdomain dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   26 kB/s | 4.1 kB     00:00
Feb 20 07:18:23 np0005625204.localdomain sshd[23974]: Invalid user mika from 185.246.128.171 port 58628
Feb 20 07:18:23 np0005625204.localdomain dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - High Av  51 kB/s | 4.0 kB     00:00
Feb 20 07:18:23 np0005625204.localdomain dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  51 kB/s | 4.5 kB     00:00
Feb 20 07:18:23 np0005625204.localdomain dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   48 kB/s | 4.1 kB     00:00
Feb 20 07:18:23 np0005625204.localdomain dnf[24095]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  51 kB/s | 4.5 kB     00:00
Feb 20 07:18:23 np0005625204.localdomain dnf[24095]: Red Hat OpenStack Platform 17.1 for RHEL 9 x86_  48 kB/s | 4.0 kB     00:00
Feb 20 07:18:24 np0005625204.localdomain dnf[24095]: Fast Datapath for RHEL 9 x86_64 (RPMs)           47 kB/s | 4.0 kB     00:00
Feb 20 07:18:24 np0005625204.localdomain sshd[23974]: Disconnecting invalid user mika 185.246.128.171 port 58628: Change of username or service not allowed: (mika,ssh-connection) -> (report,ssh-connection) [preauth]
Feb 20 07:18:24 np0005625204.localdomain dnf[24095]: Metadata cache created.
Feb 20 07:18:24 np0005625204.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 20 07:18:24 np0005625204.localdomain systemd[1]: Finished dnf makecache.
Feb 20 07:18:24 np0005625204.localdomain systemd[1]: dnf-makecache.service: Consumed 2.889s CPU time.
Feb 20 07:18:25 np0005625204.localdomain sshd[24743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:25 np0005625204.localdomain sshd[24743]: Invalid user ito from 151.252.84.225 port 42258
Feb 20 07:18:26 np0005625204.localdomain sshd[24743]: Received disconnect from 151.252.84.225 port 42258:11: Bye Bye [preauth]
Feb 20 07:18:26 np0005625204.localdomain sshd[24743]: Disconnected from invalid user ito 151.252.84.225 port 42258 [preauth]
Feb 20 07:18:26 np0005625204.localdomain sshd[24745]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:28 np0005625204.localdomain sshd[24745]: Invalid user report from 185.246.128.171 port 22127
Feb 20 07:18:29 np0005625204.localdomain sshd[24745]: Disconnecting invalid user report 185.246.128.171 port 22127: Change of username or service not allowed: (report,ssh-connection) -> (mama,ssh-connection) [preauth]
Feb 20 07:18:32 np0005625204.localdomain sshd[24747]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:34 np0005625204.localdomain sshd[24747]: Invalid user mama from 185.246.128.171 port 48813
Feb 20 07:18:35 np0005625204.localdomain sshd[24747]: Disconnecting invalid user mama 185.246.128.171 port 48813: Change of username or service not allowed: (mama,ssh-connection) -> (secret,ssh-connection) [preauth]
Feb 20 07:18:38 np0005625204.localdomain sshd[24749]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:40 np0005625204.localdomain sshd[24749]: Invalid user secret from 185.246.128.171 port 13124
Feb 20 07:18:41 np0005625204.localdomain sshd[24749]: Disconnecting invalid user secret 185.246.128.171 port 13124: Change of username or service not allowed: (secret,ssh-connection) -> (minh,ssh-connection) [preauth]
Feb 20 07:18:43 np0005625204.localdomain sshd[24751]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:49 np0005625204.localdomain sshd[24751]: Invalid user minh from 185.246.128.171 port 34159
Feb 20 07:18:50 np0005625204.localdomain sshd[24751]: Disconnecting invalid user minh 185.246.128.171 port 34159: Change of username or service not allowed: (minh,ssh-connection) -> (tmax,ssh-connection) [preauth]
Feb 20 07:18:50 np0005625204.localdomain sshd[24753]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:18:53 np0005625204.localdomain sshd[24753]: Invalid user tmax from 185.246.128.171 port 1482
Feb 20 07:18:54 np0005625204.localdomain sshd[24753]: Disconnecting invalid user tmax 185.246.128.171 port 1482: Change of username or service not allowed: (tmax,ssh-connection) -> (vyatta,ssh-connection) [preauth]
Feb 20 07:18:57 np0005625204.localdomain sshd[24755]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:01 np0005625204.localdomain sshd[24755]: Invalid user vyatta from 185.246.128.171 port 32103
Feb 20 07:19:02 np0005625204.localdomain sshd[24755]: Disconnecting invalid user vyatta 185.246.128.171 port 32103: Change of username or service not allowed: (vyatta,ssh-connection) -> (mongod,ssh-connection) [preauth]
Feb 20 07:19:06 np0005625204.localdomain sshd[24757]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:09 np0005625204.localdomain sshd[24757]: Invalid user mongod from 185.246.128.171 port 7376
Feb 20 07:19:10 np0005625204.localdomain sshd[24757]: Disconnecting invalid user mongod 185.246.128.171 port 7376: Change of username or service not allowed: (mongod,ssh-connection) -> (toto,ssh-connection) [preauth]
Feb 20 07:19:11 np0005625204.localdomain sshd[24759]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:14 np0005625204.localdomain sshd[24759]: Invalid user toto from 185.246.128.171 port 29653
Feb 20 07:19:15 np0005625204.localdomain sshd[24759]: Disconnecting invalid user toto 185.246.128.171 port 29653: Change of username or service not allowed: (toto,ssh-connection) -> (ubnt,ssh-connection) [preauth]
Feb 20 07:19:17 np0005625204.localdomain sshd[24761]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:20 np0005625204.localdomain sshd[24761]: Invalid user ubnt from 185.246.128.171 port 54053
Feb 20 07:19:22 np0005625204.localdomain sshd[23943]: Received disconnect from 38.102.83.114 port 37732:11: disconnected by user
Feb 20 07:19:22 np0005625204.localdomain sshd[23943]: Disconnected from user zuul 38.102.83.114 port 37732
Feb 20 07:19:22 np0005625204.localdomain sshd[23940]: pam_unix(sshd:session): session closed for user zuul
Feb 20 07:19:22 np0005625204.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Feb 20 07:19:22 np0005625204.localdomain systemd[1]: session-13.scope: Consumed 4.735s CPU time.
Feb 20 07:19:22 np0005625204.localdomain systemd-logind[759]: Session 13 logged out. Waiting for processes to exit.
Feb 20 07:19:22 np0005625204.localdomain systemd-logind[759]: Removed session 13.
Feb 20 07:19:26 np0005625204.localdomain sshd[24761]: error: maximum authentication attempts exceeded for invalid user ubnt from 185.246.128.171 port 54053 ssh2 [preauth]
Feb 20 07:19:26 np0005625204.localdomain sshd[24761]: Disconnecting invalid user ubnt 185.246.128.171 port 54053: Too many authentication failures [preauth]
Feb 20 07:19:29 np0005625204.localdomain sshd[24763]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:32 np0005625204.localdomain sshd[24763]: Invalid user ubnt from 185.246.128.171 port 44088
Feb 20 07:19:36 np0005625204.localdomain sshd[24763]: Disconnecting invalid user ubnt 185.246.128.171 port 44088: Change of username or service not allowed: (ubnt,ssh-connection) -> (clouduser,ssh-connection) [preauth]
Feb 20 07:19:39 np0005625204.localdomain sshd[24765]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:41 np0005625204.localdomain sshd[24765]: Invalid user clouduser from 185.246.128.171 port 21812
Feb 20 07:19:42 np0005625204.localdomain sshd[24765]: Disconnecting invalid user clouduser 185.246.128.171 port 21812: Change of username or service not allowed: (clouduser,ssh-connection) -> (TEST,ssh-connection) [preauth]
Feb 20 07:19:43 np0005625204.localdomain sshd[24767]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:46 np0005625204.localdomain sshd[24767]: Invalid user TEST from 185.246.128.171 port 42255
Feb 20 07:19:47 np0005625204.localdomain sshd[24767]: Disconnecting invalid user TEST 185.246.128.171 port 42255: Change of username or service not allowed: (TEST,ssh-connection) -> (vtiger,ssh-connection) [preauth]
Feb 20 07:19:49 np0005625204.localdomain sshd[24769]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:19:54 np0005625204.localdomain sshd[24769]: Invalid user vtiger from 185.246.128.171 port 3796
Feb 20 07:19:54 np0005625204.localdomain sshd[24769]: Disconnecting invalid user vtiger 185.246.128.171 port 3796: Change of username or service not allowed: (vtiger,ssh-connection) -> (Test,ssh-connection) [preauth]
Feb 20 07:19:57 np0005625204.localdomain sshd[24771]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:00 np0005625204.localdomain sshd[24771]: Invalid user Test from 185.246.128.171 port 34735
Feb 20 07:20:05 np0005625204.localdomain sshd[24771]: Disconnecting invalid user Test 185.246.128.171 port 34735: Change of username or service not allowed: (Test,ssh-connection) -> (tempadmin,ssh-connection) [preauth]
Feb 20 07:20:08 np0005625204.localdomain sshd[24774]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:11 np0005625204.localdomain sshd[24774]: Invalid user tempadmin from 185.246.128.171 port 18825
Feb 20 07:20:11 np0005625204.localdomain sshd[24774]: Disconnecting invalid user tempadmin 185.246.128.171 port 18825: Change of username or service not allowed: (tempadmin,ssh-connection) -> (adm,ssh-connection) [preauth]
Feb 20 07:20:15 np0005625204.localdomain sshd[24776]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:24 np0005625204.localdomain sshd[24776]: Disconnecting authenticating user adm 185.246.128.171 port 47492: Change of username or service not allowed: (adm,ssh-connection) -> (nobody,ssh-connection) [preauth]
Feb 20 07:20:27 np0005625204.localdomain sshd[24778]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:34 np0005625204.localdomain sshd[24778]: Disconnecting authenticating user nobody 185.246.128.171 port 32940: Change of username or service not allowed: (nobody,ssh-connection) -> (frank,ssh-connection) [preauth]
Feb 20 07:20:36 np0005625204.localdomain sshd[24780]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:40 np0005625204.localdomain sshd[24782]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:42 np0005625204.localdomain sshd[24782]: Invalid user user from 202.165.22.246 port 46164
Feb 20 07:20:42 np0005625204.localdomain sshd[24782]: Received disconnect from 202.165.22.246 port 46164:11: Bye Bye [preauth]
Feb 20 07:20:42 np0005625204.localdomain sshd[24782]: Disconnected from invalid user user 202.165.22.246 port 46164 [preauth]
Feb 20 07:20:42 np0005625204.localdomain sshd[24780]: Invalid user frank from 185.246.128.171 port 6988
Feb 20 07:20:42 np0005625204.localdomain sshd[24780]: Disconnecting invalid user frank 185.246.128.171 port 6988: Change of username or service not allowed: (frank,ssh-connection) -> (es_user,ssh-connection) [preauth]
Feb 20 07:20:45 np0005625204.localdomain sshd[24785]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:49 np0005625204.localdomain sshd[24785]: Invalid user es_user from 185.246.128.171 port 43077
Feb 20 07:20:49 np0005625204.localdomain sshd[24785]: Disconnecting invalid user es_user 185.246.128.171 port 43077: Change of username or service not allowed: (es_user,ssh-connection) -> (test5,ssh-connection) [preauth]
Feb 20 07:20:54 np0005625204.localdomain sshd[24787]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:20:56 np0005625204.localdomain sshd[24787]: Invalid user test5 from 185.246.128.171 port 14614
Feb 20 07:20:58 np0005625204.localdomain sshd[24787]: Disconnecting invalid user test5 185.246.128.171 port 14614: Change of username or service not allowed: (test5,ssh-connection) -> (nagios,ssh-connection) [preauth]
Feb 20 07:20:58 np0005625204.localdomain sshd[24789]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:00 np0005625204.localdomain sshd[24791]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:01 np0005625204.localdomain sshd[24789]: Invalid user nagios from 185.246.128.171 port 32545
Feb 20 07:21:01 np0005625204.localdomain sshd[24791]: Invalid user claude from 151.252.84.225 port 36664
Feb 20 07:21:01 np0005625204.localdomain sshd[24791]: Received disconnect from 151.252.84.225 port 36664:11: Bye Bye [preauth]
Feb 20 07:21:01 np0005625204.localdomain sshd[24791]: Disconnected from invalid user claude 151.252.84.225 port 36664 [preauth]
Feb 20 07:21:02 np0005625204.localdomain sshd[24789]: Disconnecting invalid user nagios 185.246.128.171 port 32545: Change of username or service not allowed: (nagios,ssh-connection) -> (desliga,ssh-connection) [preauth]
Feb 20 07:21:05 np0005625204.localdomain sshd[24793]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:10 np0005625204.localdomain sshd[24793]: Invalid user desliga from 185.246.128.171 port 60331
Feb 20 07:21:11 np0005625204.localdomain sshd[24793]: Disconnecting invalid user desliga 185.246.128.171 port 60331: Change of username or service not allowed: (desliga,ssh-connection) -> (vncuser,ssh-connection) [preauth]
Feb 20 07:21:13 np0005625204.localdomain sshd[24795]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:17 np0005625204.localdomain sshd[24795]: Invalid user vncuser from 185.246.128.171 port 29625
Feb 20 07:21:20 np0005625204.localdomain sshd[24795]: Disconnecting invalid user vncuser 185.246.128.171 port 29625: Change of username or service not allowed: (vncuser,ssh-connection) -> (alfresco,ssh-connection) [preauth]
Feb 20 07:21:23 np0005625204.localdomain sshd[24797]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:26 np0005625204.localdomain sshd[24797]: Invalid user alfresco from 185.246.128.171 port 6825
Feb 20 07:21:27 np0005625204.localdomain sshd[24797]: Disconnecting invalid user alfresco 185.246.128.171 port 6825: Change of username or service not allowed: (alfresco,ssh-connection) -> (mahmoud,ssh-connection) [preauth]
Feb 20 07:21:28 np0005625204.localdomain sshd[24799]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:32 np0005625204.localdomain sshd[24799]: Invalid user mahmoud from 185.246.128.171 port 26212
Feb 20 07:21:33 np0005625204.localdomain sshd[24799]: Disconnecting invalid user mahmoud 185.246.128.171 port 26212: Change of username or service not allowed: (mahmoud,ssh-connection) -> (github,ssh-connection) [preauth]
Feb 20 07:21:33 np0005625204.localdomain sshd[24801]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:34 np0005625204.localdomain sshd[24801]: Invalid user iksi from 189.143.72.189 port 45356
Feb 20 07:21:34 np0005625204.localdomain sshd[24801]: Received disconnect from 189.143.72.189 port 45356:11: Bye Bye [preauth]
Feb 20 07:21:34 np0005625204.localdomain sshd[24801]: Disconnected from invalid user iksi 189.143.72.189 port 45356 [preauth]
Feb 20 07:21:36 np0005625204.localdomain sshd[24803]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:38 np0005625204.localdomain sshd[24803]: Invalid user github from 185.246.128.171 port 56763
Feb 20 07:21:41 np0005625204.localdomain sshd[24803]: Disconnecting invalid user github 185.246.128.171 port 56763: Change of username or service not allowed: (github,ssh-connection) -> (ubuntu,ssh-connection) [preauth]
Feb 20 07:21:44 np0005625204.localdomain sshd[24805]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:21:48 np0005625204.localdomain sshd[24805]: Invalid user ubuntu from 185.246.128.171 port 24456
Feb 20 07:21:58 np0005625204.localdomain sshd[24805]: error: maximum authentication attempts exceeded for invalid user ubuntu from 185.246.128.171 port 24456 ssh2 [preauth]
Feb 20 07:21:58 np0005625204.localdomain sshd[24805]: Disconnecting invalid user ubuntu 185.246.128.171 port 24456: Too many authentication failures [preauth]
Feb 20 07:22:01 np0005625204.localdomain sshd[24807]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:04 np0005625204.localdomain sshd[24807]: Invalid user ubuntu from 185.246.128.171 port 28746
Feb 20 07:22:07 np0005625204.localdomain sshd[24807]: Disconnecting invalid user ubuntu 185.246.128.171 port 28746: Change of username or service not allowed: (ubuntu,ssh-connection) -> (ravi,ssh-connection) [preauth]
Feb 20 07:22:09 np0005625204.localdomain sshd[24809]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:14 np0005625204.localdomain sshd[24809]: Invalid user ravi from 185.246.128.171 port 62004
Feb 20 07:22:16 np0005625204.localdomain sshd[24809]: Disconnecting invalid user ravi 185.246.128.171 port 62004: Change of username or service not allowed: (ravi,ssh-connection) -> (userftp,ssh-connection) [preauth]
Feb 20 07:22:20 np0005625204.localdomain sshd[24811]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:23 np0005625204.localdomain sshd[24811]: Invalid user userftp from 185.246.128.171 port 39883
Feb 20 07:22:24 np0005625204.localdomain sshd[24811]: Disconnecting invalid user userftp 185.246.128.171 port 39883: Change of username or service not allowed: (userftp,ssh-connection) -> (server,ssh-connection) [preauth]
Feb 20 07:22:24 np0005625204.localdomain sshd[24813]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:25 np0005625204.localdomain sshd[24813]: Invalid user server from 185.246.128.171 port 56923
Feb 20 07:22:25 np0005625204.localdomain sshd[24813]: Disconnecting invalid user server 185.246.128.171 port 56923: Change of username or service not allowed: (server,ssh-connection) -> (splunk,ssh-connection) [preauth]
Feb 20 07:22:26 np0005625204.localdomain sshd[24815]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:30 np0005625204.localdomain sshd[24815]: Invalid user splunk from 185.246.128.171 port 63608
Feb 20 07:22:32 np0005625204.localdomain sshd[24815]: Disconnecting invalid user splunk 185.246.128.171 port 63608: Change of username or service not allowed: (splunk,ssh-connection) -> (security,ssh-connection) [preauth]
Feb 20 07:22:34 np0005625204.localdomain sshd[24817]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:36 np0005625204.localdomain sshd[24817]: Invalid user security from 185.246.128.171 port 31746
Feb 20 07:22:37 np0005625204.localdomain sshd[24817]: Disconnecting invalid user security 185.246.128.171 port 31746: Change of username or service not allowed: (security,ssh-connection) -> (1111,ssh-connection) [preauth]
Feb 20 07:22:38 np0005625204.localdomain sshd[24819]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:39 np0005625204.localdomain sshd[24819]: Invalid user 1111 from 185.246.128.171 port 47396
Feb 20 07:22:40 np0005625204.localdomain sshd[24819]: Disconnecting invalid user 1111 185.246.128.171 port 47396: Change of username or service not allowed: (1111,ssh-connection) -> (wuhan,ssh-connection) [preauth]
Feb 20 07:22:43 np0005625204.localdomain sshd[24821]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:47 np0005625204.localdomain sshd[24821]: Invalid user wuhan from 185.246.128.171 port 3825
Feb 20 07:22:48 np0005625204.localdomain sshd[24821]: Disconnecting invalid user wuhan 185.246.128.171 port 3825: Change of username or service not allowed: (wuhan,ssh-connection) -> (omsagent,ssh-connection) [preauth]
Feb 20 07:22:51 np0005625204.localdomain sshd[24823]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:54 np0005625204.localdomain sshd[24823]: Invalid user omsagent from 185.246.128.171 port 37649
Feb 20 07:22:54 np0005625204.localdomain sshd[24823]: Disconnecting invalid user omsagent 185.246.128.171 port 37649: Change of username or service not allowed: (omsagent,ssh-connection) -> (byte,ssh-connection) [preauth]
Feb 20 07:22:55 np0005625204.localdomain sshd[24825]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:22:57 np0005625204.localdomain sshd[24825]: Invalid user byte from 185.246.128.171 port 52913
Feb 20 07:22:57 np0005625204.localdomain sshd[24825]: Disconnecting invalid user byte 185.246.128.171 port 52913: Change of username or service not allowed: (byte,ssh-connection) -> (fastuser,ssh-connection) [preauth]
Feb 20 07:23:01 np0005625204.localdomain sshd[24827]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:05 np0005625204.localdomain sshd[24827]: Invalid user fastuser from 185.246.128.171 port 14841
Feb 20 07:23:07 np0005625204.localdomain sshd[24827]: Disconnecting invalid user fastuser 185.246.128.171 port 14841: Change of username or service not allowed: (fastuser,ssh-connection) -> (amits,ssh-connection) [preauth]
Feb 20 07:23:09 np0005625204.localdomain sshd[24829]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:13 np0005625204.localdomain sshd[24829]: Invalid user amits from 185.246.128.171 port 46347
Feb 20 07:23:14 np0005625204.localdomain sshd[24829]: Disconnecting invalid user amits 185.246.128.171 port 46347: Change of username or service not allowed: (amits,ssh-connection) -> (carlos,ssh-connection) [preauth]
Feb 20 07:23:16 np0005625204.localdomain sshd[24831]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:19 np0005625204.localdomain sshd[24831]: Invalid user carlos from 185.246.128.171 port 11621
Feb 20 07:23:20 np0005625204.localdomain sshd[24831]: Disconnecting invalid user carlos 185.246.128.171 port 11621: Change of username or service not allowed: (carlos,ssh-connection) -> (user,ssh-connection) [preauth]
Feb 20 07:23:24 np0005625204.localdomain sshd[24833]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:27 np0005625204.localdomain sshd[24833]: Invalid user user from 185.246.128.171 port 45443
Feb 20 07:23:30 np0005625204.localdomain sshd[24833]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 45443 ssh2 [preauth]
Feb 20 07:23:30 np0005625204.localdomain sshd[24833]: Disconnecting invalid user user 185.246.128.171 port 45443: Too many authentication failures [preauth]
Feb 20 07:23:32 np0005625204.localdomain sshd[24835]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:36 np0005625204.localdomain sshd[24835]: Invalid user user from 185.246.128.171 port 15673
Feb 20 07:23:42 np0005625204.localdomain sshd[24835]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 15673 ssh2 [preauth]
Feb 20 07:23:42 np0005625204.localdomain sshd[24835]: Disconnecting invalid user user 185.246.128.171 port 15673: Too many authentication failures [preauth]
Feb 20 07:23:43 np0005625204.localdomain sshd[24837]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:44 np0005625204.localdomain sshd[24837]: Received disconnect from 151.252.84.225 port 51528:11: Bye Bye [preauth]
Feb 20 07:23:44 np0005625204.localdomain sshd[24837]: Disconnected from authenticating user root 151.252.84.225 port 51528 [preauth]
Feb 20 07:23:44 np0005625204.localdomain sshd[24839]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:49 np0005625204.localdomain sshd[24839]: Invalid user user from 185.246.128.171 port 2231
Feb 20 07:23:55 np0005625204.localdomain sshd[24839]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 2231 ssh2 [preauth]
Feb 20 07:23:55 np0005625204.localdomain sshd[24839]: Disconnecting invalid user user 185.246.128.171 port 2231: Too many authentication failures [preauth]
Feb 20 07:23:56 np0005625204.localdomain sshd[24841]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:23:58 np0005625204.localdomain sshd[24841]: Invalid user user from 185.246.128.171 port 51794
Feb 20 07:24:03 np0005625204.localdomain sshd[24841]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 51794 ssh2 [preauth]
Feb 20 07:24:03 np0005625204.localdomain sshd[24841]: Disconnecting invalid user user 185.246.128.171 port 51794: Too many authentication failures [preauth]
Feb 20 07:24:07 np0005625204.localdomain sshd[24843]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:08 np0005625204.localdomain sshd[24843]: Invalid user user from 185.246.128.171 port 31407
Feb 20 07:24:10 np0005625204.localdomain sshd[24845]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:11 np0005625204.localdomain sshd[24845]: Invalid user user04 from 202.165.22.246 port 52812
Feb 20 07:24:11 np0005625204.localdomain sshd[24845]: Received disconnect from 202.165.22.246 port 52812:11: Bye Bye [preauth]
Feb 20 07:24:11 np0005625204.localdomain sshd[24845]: Disconnected from invalid user user04 202.165.22.246 port 52812 [preauth]
Feb 20 07:24:12 np0005625204.localdomain sshd[24843]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 31407 ssh2 [preauth]
Feb 20 07:24:12 np0005625204.localdomain sshd[24843]: Disconnecting invalid user user 185.246.128.171 port 31407: Too many authentication failures [preauth]
Feb 20 07:24:12 np0005625204.localdomain sshd[24847]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:16 np0005625204.localdomain sshd[24847]: Invalid user user from 185.246.128.171 port 55693
Feb 20 07:24:23 np0005625204.localdomain sshd[24847]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 55693 ssh2 [preauth]
Feb 20 07:24:23 np0005625204.localdomain sshd[24847]: Disconnecting invalid user user 185.246.128.171 port 55693: Too many authentication failures [preauth]
Feb 20 07:24:25 np0005625204.localdomain sshd[24849]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:30 np0005625204.localdomain sshd[24849]: Invalid user user from 185.246.128.171 port 47699
Feb 20 07:24:33 np0005625204.localdomain sshd[24849]: error: maximum authentication attempts exceeded for invalid user user from 185.246.128.171 port 47699 ssh2 [preauth]
Feb 20 07:24:33 np0005625204.localdomain sshd[24849]: Disconnecting invalid user user 185.246.128.171 port 47699: Too many authentication failures [preauth]
Feb 20 07:24:36 np0005625204.localdomain sshd[24851]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:39 np0005625204.localdomain sshd[24851]: Invalid user user from 185.246.128.171 port 28010
Feb 20 07:24:39 np0005625204.localdomain sshd[24851]: Disconnecting invalid user user 185.246.128.171 port 28010: Change of username or service not allowed: (user,ssh-connection) -> (storage,ssh-connection) [preauth]
Feb 20 07:24:41 np0005625204.localdomain sshd[24853]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:43 np0005625204.localdomain sshd[24853]: Invalid user storage from 185.246.128.171 port 48914
Feb 20 07:24:44 np0005625204.localdomain sshd[24853]: Disconnecting invalid user storage 185.246.128.171 port 48914: Change of username or service not allowed: (storage,ssh-connection) -> (vgilli,ssh-connection) [preauth]
Feb 20 07:24:47 np0005625204.localdomain sshd[24855]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:51 np0005625204.localdomain sshd[24855]: Invalid user vgilli from 185.246.128.171 port 12773
Feb 20 07:24:53 np0005625204.localdomain sshd[24855]: Disconnecting invalid user vgilli 185.246.128.171 port 12773: Change of username or service not allowed: (vgilli,ssh-connection) -> (a,ssh-connection) [preauth]
Feb 20 07:24:56 np0005625204.localdomain sshd[24857]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:24:59 np0005625204.localdomain sshd[24857]: Invalid user a from 185.246.128.171 port 50012
Feb 20 07:25:00 np0005625204.localdomain sshd[24857]: Disconnecting invalid user a 185.246.128.171 port 50012: Change of username or service not allowed: (a,ssh-connection) -> (hacluster,ssh-connection) [preauth]
Feb 20 07:25:03 np0005625204.localdomain sshd[24859]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:07 np0005625204.localdomain sshd[24859]: Invalid user hacluster from 185.246.128.171 port 14446
Feb 20 07:25:08 np0005625204.localdomain sshd[24859]: Disconnecting invalid user hacluster 185.246.128.171 port 14446: Change of username or service not allowed: (hacluster,ssh-connection) -> (eagle,ssh-connection) [preauth]
Feb 20 07:25:11 np0005625204.localdomain sshd[24861]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:14 np0005625204.localdomain sshd[24861]: Invalid user eagle from 185.246.128.171 port 54687
Feb 20 07:25:15 np0005625204.localdomain sshd[24861]: Disconnecting invalid user eagle 185.246.128.171 port 54687: Change of username or service not allowed: (eagle,ssh-connection) -> (bbs,ssh-connection) [preauth]
Feb 20 07:25:18 np0005625204.localdomain sshd[24863]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:22 np0005625204.localdomain sshd[24863]: Invalid user bbs from 185.246.128.171 port 16694
Feb 20 07:25:24 np0005625204.localdomain sshd[24863]: Disconnecting invalid user bbs 185.246.128.171 port 16694: Change of username or service not allowed: (bbs,ssh-connection) -> (guest,ssh-connection) [preauth]
Feb 20 07:25:27 np0005625204.localdomain sshd[24865]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:29 np0005625204.localdomain sshd[24866]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:29 np0005625204.localdomain sshd[24866]: Invalid user x from 189.143.72.189 port 56802
Feb 20 07:25:29 np0005625204.localdomain sshd[24866]: Received disconnect from 189.143.72.189 port 56802:11: Bye Bye [preauth]
Feb 20 07:25:29 np0005625204.localdomain sshd[24866]: Disconnected from invalid user x 189.143.72.189 port 56802 [preauth]
Feb 20 07:25:30 np0005625204.localdomain sshd[24865]: Invalid user guest from 185.246.128.171 port 59002
Feb 20 07:25:38 np0005625204.localdomain sshd[24865]: error: maximum authentication attempts exceeded for invalid user guest from 185.246.128.171 port 59002 ssh2 [preauth]
Feb 20 07:25:38 np0005625204.localdomain sshd[24865]: Disconnecting invalid user guest 185.246.128.171 port 59002: Too many authentication failures [preauth]
Feb 20 07:25:44 np0005625204.localdomain sshd[24870]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:47 np0005625204.localdomain sshd[24870]: Invalid user guest from 185.246.128.171 port 3550
Feb 20 07:25:52 np0005625204.localdomain sshd[24870]: Disconnecting invalid user guest 185.246.128.171 port 3550: Change of username or service not allowed: (guest,ssh-connection) -> (sys,ssh-connection) [preauth]
Feb 20 07:25:53 np0005625204.localdomain sshd[24872]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:25:56 np0005625204.localdomain sshd[24872]: Invalid user sys from 185.246.128.171 port 44020
Feb 20 07:25:57 np0005625204.localdomain sshd[24872]: Disconnecting invalid user sys 185.246.128.171 port 44020: Change of username or service not allowed: (sys,ssh-connection) -> (max,ssh-connection) [preauth]
Feb 20 07:25:58 np0005625204.localdomain sshd[24874]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:02 np0005625204.localdomain sshd[24874]: Invalid user max from 185.246.128.171 port 64238
Feb 20 07:26:04 np0005625204.localdomain sshd[24874]: Disconnecting invalid user max 185.246.128.171 port 64238: Change of username or service not allowed: (max,ssh-connection) -> (xiaoxiao,ssh-connection) [preauth]
Feb 20 07:26:10 np0005625204.localdomain sshd[24876]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:13 np0005625204.localdomain sshd[24876]: Invalid user xiaoxiao from 185.246.128.171 port 54112
Feb 20 07:26:14 np0005625204.localdomain sshd[24876]: Disconnecting invalid user xiaoxiao 185.246.128.171 port 54112: Change of username or service not allowed: (xiaoxiao,ssh-connection) -> (dock,ssh-connection) [preauth]
Feb 20 07:26:16 np0005625204.localdomain sshd[24878]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:21 np0005625204.localdomain sshd[24880]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:21 np0005625204.localdomain sshd[24878]: Invalid user dock from 185.246.128.171 port 15358
Feb 20 07:26:22 np0005625204.localdomain sshd[24878]: Disconnecting invalid user dock 185.246.128.171 port 15358: Change of username or service not allowed: (dock,ssh-connection) -> (fff,ssh-connection) [preauth]
Feb 20 07:26:22 np0005625204.localdomain sshd[24880]: Received disconnect from 151.252.84.225 port 38920:11: Bye Bye [preauth]
Feb 20 07:26:22 np0005625204.localdomain sshd[24880]: Disconnected from authenticating user root 151.252.84.225 port 38920 [preauth]
Feb 20 07:26:24 np0005625204.localdomain sshd[24882]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:27 np0005625204.localdomain sshd[24882]: Invalid user fff from 185.246.128.171 port 51638
Feb 20 07:26:28 np0005625204.localdomain sshd[24882]: Disconnecting invalid user fff 185.246.128.171 port 51638: Change of username or service not allowed: (fff,ssh-connection) -> (mail,ssh-connection) [preauth]
Feb 20 07:26:31 np0005625204.localdomain sshd[24884]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:37 np0005625204.localdomain sshd[24884]: Disconnecting authenticating user mail 185.246.128.171 port 17028: Change of username or service not allowed: (mail,ssh-connection) -> (fedora,ssh-connection) [preauth]
Feb 20 07:26:43 np0005625204.localdomain sshd[24886]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:46 np0005625204.localdomain sshd[24886]: Invalid user fedora from 185.246.128.171 port 4320
Feb 20 07:26:47 np0005625204.localdomain sshd[24886]: Disconnecting invalid user fedora 185.246.128.171 port 4320: Change of username or service not allowed: (fedora,ssh-connection) -> (newusername,ssh-connection) [preauth]
Feb 20 07:26:48 np0005625204.localdomain sshd[24888]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:52 np0005625204.localdomain sshd[24888]: Invalid user newusername from 185.246.128.171 port 30201
Feb 20 07:26:53 np0005625204.localdomain sshd[24888]: Disconnecting invalid user newusername 185.246.128.171 port 30201: Change of username or service not allowed: (newusername,ssh-connection) -> (pwrchute,ssh-connection) [preauth]
Feb 20 07:26:55 np0005625204.localdomain sshd[24890]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:26:59 np0005625204.localdomain sshd[24890]: Invalid user pwrchute from 185.246.128.171 port 59467
Feb 20 07:27:00 np0005625204.localdomain sshd[24890]: Disconnecting invalid user pwrchute 185.246.128.171 port 59467: Change of username or service not allowed: (pwrchute,ssh-connection) -> (john,ssh-connection) [preauth]
Feb 20 07:27:02 np0005625204.localdomain sshd[24892]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:05 np0005625204.localdomain sshd[24892]: Invalid user john from 185.246.128.171 port 25183
Feb 20 07:27:05 np0005625204.localdomain sshd[24892]: Disconnecting invalid user john 185.246.128.171 port 25183: Change of username or service not allowed: (john,ssh-connection) -> (ts,ssh-connection) [preauth]
Feb 20 07:27:06 np0005625204.localdomain sshd[24894]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:09 np0005625204.localdomain sshd[24894]: Invalid user ts from 185.246.128.171 port 42212
Feb 20 07:27:10 np0005625204.localdomain sshd[24894]: Disconnecting invalid user ts 185.246.128.171 port 42212: Change of username or service not allowed: (ts,ssh-connection) -> (kubelet,ssh-connection) [preauth]
Feb 20 07:27:13 np0005625204.localdomain sshd[24896]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:15 np0005625204.localdomain sshd[24896]: Invalid user kubelet from 185.246.128.171 port 11354
Feb 20 07:27:15 np0005625204.localdomain sshd[24896]: Disconnecting invalid user kubelet 185.246.128.171 port 11354: Change of username or service not allowed: (kubelet,ssh-connection) -> (12345,ssh-connection) [preauth]
Feb 20 07:27:17 np0005625204.localdomain sshd[24898]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:20 np0005625204.localdomain sshd[24898]: Invalid user 12345 from 185.246.128.171 port 29428
Feb 20 07:27:20 np0005625204.localdomain sshd[24898]: Disconnecting invalid user 12345 185.246.128.171 port 29428: Change of username or service not allowed: (12345,ssh-connection) -> (steam,ssh-connection) [preauth]
Feb 20 07:27:22 np0005625204.localdomain sshd[24900]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:25 np0005625204.localdomain sshd[24900]: Invalid user steam from 185.246.128.171 port 52296
Feb 20 07:27:29 np0005625204.localdomain sshd[24900]: Disconnecting invalid user steam 185.246.128.171 port 52296: Change of username or service not allowed: (steam,ssh-connection) -> (doge,ssh-connection) [preauth]
Feb 20 07:27:31 np0005625204.localdomain sshd[24902]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:34 np0005625204.localdomain sshd[24902]: Invalid user doge from 185.246.128.171 port 30850
Feb 20 07:27:34 np0005625204.localdomain sshd[24902]: Disconnecting invalid user doge 185.246.128.171 port 30850: Change of username or service not allowed: (doge,ssh-connection) -> (core,ssh-connection) [preauth]
Feb 20 07:27:35 np0005625204.localdomain sshd[24904]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:37 np0005625204.localdomain sshd[24906]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:38 np0005625204.localdomain sshd[24904]: Invalid user core from 185.246.128.171 port 47791
Feb 20 07:27:38 np0005625204.localdomain sshd[24906]: Received disconnect from 202.165.22.246 port 59460:11: Bye Bye [preauth]
Feb 20 07:27:38 np0005625204.localdomain sshd[24906]: Disconnected from authenticating user root 202.165.22.246 port 59460 [preauth]
Feb 20 07:27:38 np0005625204.localdomain sshd[24904]: Disconnecting invalid user core 185.246.128.171 port 47791: Change of username or service not allowed: (core,ssh-connection) -> (root2,ssh-connection) [preauth]
Feb 20 07:27:42 np0005625204.localdomain sshd[24908]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:45 np0005625204.localdomain sshd[24908]: Invalid user root2 from 185.246.128.171 port 11810
Feb 20 07:27:46 np0005625204.localdomain sshd[24908]: Disconnecting invalid user root2 185.246.128.171 port 11810: Change of username or service not allowed: (root2,ssh-connection) -> (vpn,ssh-connection) [preauth]
Feb 20 07:27:49 np0005625204.localdomain sshd[24910]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:51 np0005625204.localdomain sshd[24910]: Invalid user vpn from 185.246.128.171 port 43436
Feb 20 07:27:53 np0005625204.localdomain sshd[24910]: Disconnecting invalid user vpn 185.246.128.171 port 43436: Change of username or service not allowed: (vpn,ssh-connection) -> (temp,ssh-connection) [preauth]
Feb 20 07:27:55 np0005625204.localdomain sshd[24912]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:27:58 np0005625204.localdomain sshd[24912]: Invalid user temp from 185.246.128.171 port 5611
Feb 20 07:28:01 np0005625204.localdomain sshd[24912]: Disconnecting invalid user temp 185.246.128.171 port 5611: Change of username or service not allowed: (temp,ssh-connection) -> (user15,ssh-connection) [preauth]
Feb 20 07:28:03 np0005625204.localdomain sshd[24914]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:05 np0005625204.localdomain sshd[24914]: Invalid user user15 from 185.246.128.171 port 41814
Feb 20 07:28:06 np0005625204.localdomain sshd[24914]: Disconnecting invalid user user15 185.246.128.171 port 41814: Change of username or service not allowed: (user15,ssh-connection) -> (daniel,ssh-connection) [preauth]
Feb 20 07:28:09 np0005625204.localdomain sshd[24916]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:13 np0005625204.localdomain sshd[24916]: Invalid user daniel from 185.246.128.171 port 6313
Feb 20 07:28:14 np0005625204.localdomain sshd[24916]: Disconnecting invalid user daniel 185.246.128.171 port 6313: Change of username or service not allowed: (daniel,ssh-connection) -> (1234,ssh-connection) [preauth]
Feb 20 07:28:18 np0005625204.localdomain sshd[24918]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:22 np0005625204.localdomain sshd[24918]: Invalid user 1234 from 185.246.128.171 port 49538
Feb 20 07:28:22 np0005625204.localdomain sshd[24918]: Disconnecting invalid user 1234 185.246.128.171 port 49538: Change of username or service not allowed: (1234,ssh-connection) -> (lotus,ssh-connection) [preauth]
Feb 20 07:28:23 np0005625204.localdomain sshd[24920]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:25 np0005625204.localdomain sshd[24920]: Invalid user lotus from 185.246.128.171 port 5894
Feb 20 07:28:26 np0005625204.localdomain sshd[24920]: Disconnecting invalid user lotus 185.246.128.171 port 5894: Change of username or service not allowed: (lotus,ssh-connection) -> (kiosk,ssh-connection) [preauth]
Feb 20 07:28:28 np0005625204.localdomain sshd[24922]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:31 np0005625204.localdomain sshd[24922]: Invalid user kiosk from 185.246.128.171 port 29770
Feb 20 07:28:32 np0005625204.localdomain sshd[24922]: Disconnecting invalid user kiosk 185.246.128.171 port 29770: Change of username or service not allowed: (kiosk,ssh-connection) -> (mohamed,ssh-connection) [preauth]
Feb 20 07:28:35 np0005625204.localdomain sshd[24924]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:42 np0005625204.localdomain sshd[24924]: Invalid user mohamed from 185.246.128.171 port 61208
Feb 20 07:28:43 np0005625204.localdomain sshd[24924]: Disconnecting invalid user mohamed 185.246.128.171 port 61208: Change of username or service not allowed: (mohamed,ssh-connection) -> (lenovo,ssh-connection) [preauth]
Feb 20 07:28:45 np0005625204.localdomain sshd[24926]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:28:50 np0005625204.localdomain sshd[24926]: Invalid user lenovo from 185.246.128.171 port 41181
Feb 20 07:28:52 np0005625204.localdomain sshd[24926]: Disconnecting invalid user lenovo 185.246.128.171 port 41181: Change of username or service not allowed: (lenovo,ssh-connection) -> (engineer,ssh-connection) [preauth]
Feb 20 07:28:57 np0005625204.localdomain sshd[24928]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:00 np0005625204.localdomain sshd[24928]: Invalid user engineer from 185.246.128.171 port 26501
Feb 20 07:29:01 np0005625204.localdomain sshd[24928]: Disconnecting invalid user engineer 185.246.128.171 port 26501: Change of username or service not allowed: (engineer,ssh-connection) -> (hugo,ssh-connection) [preauth]
Feb 20 07:29:06 np0005625204.localdomain sshd[24930]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:08 np0005625204.localdomain sshd[24932]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:09 np0005625204.localdomain sshd[24932]: Invalid user systemd from 151.252.84.225 port 53930
Feb 20 07:29:09 np0005625204.localdomain sshd[24932]: Received disconnect from 151.252.84.225 port 53930:11: Bye Bye [preauth]
Feb 20 07:29:09 np0005625204.localdomain sshd[24932]: Disconnected from invalid user systemd 151.252.84.225 port 53930 [preauth]
Feb 20 07:29:11 np0005625204.localdomain sshd[24930]: Invalid user hugo from 185.246.128.171 port 4222
Feb 20 07:29:12 np0005625204.localdomain sshd[24930]: Disconnecting invalid user hugo 185.246.128.171 port 4222: Change of username or service not allowed: (hugo,ssh-connection) -> (debianuser,ssh-connection) [preauth]
Feb 20 07:29:13 np0005625204.localdomain sshd[24934]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:13 np0005625204.localdomain sshd[24936]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:13 np0005625204.localdomain sshd[24934]: Invalid user socksuser from 189.143.72.189 port 39996
Feb 20 07:29:13 np0005625204.localdomain sshd[24934]: Received disconnect from 189.143.72.189 port 39996:11: Bye Bye [preauth]
Feb 20 07:29:13 np0005625204.localdomain sshd[24934]: Disconnected from invalid user socksuser 189.143.72.189 port 39996 [preauth]
Feb 20 07:29:16 np0005625204.localdomain sshd[24936]: Invalid user debianuser from 185.246.128.171 port 34364
Feb 20 07:29:16 np0005625204.localdomain sshd[24936]: Disconnecting invalid user debianuser 185.246.128.171 port 34364: Change of username or service not allowed: (debianuser,ssh-connection) -> (super,ssh-connection) [preauth]
Feb 20 07:29:19 np0005625204.localdomain sshd[24938]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:22 np0005625204.localdomain sshd[24938]: Invalid user super from 185.246.128.171 port 60588
Feb 20 07:29:24 np0005625204.localdomain sshd[24938]: Disconnecting invalid user super 185.246.128.171 port 60588: Change of username or service not allowed: (super,ssh-connection) -> (webapp,ssh-connection) [preauth]
Feb 20 07:29:28 np0005625204.localdomain sshd[24940]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:31 np0005625204.localdomain sshd[24940]: Invalid user webapp from 185.246.128.171 port 37831
Feb 20 07:29:32 np0005625204.localdomain sshd[24940]: Disconnecting invalid user webapp 185.246.128.171 port 37831: Change of username or service not allowed: (webapp,ssh-connection) -> (cristi,ssh-connection) [preauth]
Feb 20 07:29:36 np0005625204.localdomain sshd[24942]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:40 np0005625204.localdomain sshd[24942]: Invalid user cristi from 185.246.128.171 port 14216
Feb 20 07:29:41 np0005625204.localdomain sshd[24942]: Disconnecting invalid user cristi 185.246.128.171 port 14216: Change of username or service not allowed: (cristi,ssh-connection) -> (pa,ssh-connection) [preauth]
Feb 20 07:29:44 np0005625204.localdomain sshd[24944]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:47 np0005625204.localdomain sshd[24944]: Invalid user pa from 185.246.128.171 port 50778
Feb 20 07:29:48 np0005625204.localdomain sshd[24944]: Disconnecting invalid user pa 185.246.128.171 port 50778: Change of username or service not allowed: (pa,ssh-connection) -> (victor,ssh-connection) [preauth]
Feb 20 07:29:53 np0005625204.localdomain sshd[24946]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:29:58 np0005625204.localdomain sshd[24946]: Invalid user victor from 185.246.128.171 port 27794
Feb 20 07:29:59 np0005625204.localdomain sshd[24946]: Disconnecting invalid user victor 185.246.128.171 port 27794: Change of username or service not allowed: (victor,ssh-connection) -> (samba,ssh-connection) [preauth]
Feb 20 07:30:00 np0005625204.localdomain sshd[24948]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:04 np0005625204.localdomain sshd[24948]: Invalid user samba from 185.246.128.171 port 59081
Feb 20 07:30:05 np0005625204.localdomain sshd[24948]: Disconnecting invalid user samba 185.246.128.171 port 59081: Change of username or service not allowed: (samba,ssh-connection) -> (yealink,ssh-connection) [preauth]
Feb 20 07:30:06 np0005625204.localdomain sshd[24950]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:07 np0005625204.localdomain sshd[24951]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:07 np0005625204.localdomain sshd[24951]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 07:30:07 np0005625204.localdomain sshd[24951]: Connection closed by 142.93.130.138 port 46534
Feb 20 07:30:09 np0005625204.localdomain sshd[24950]: Invalid user yealink from 185.246.128.171 port 26129
Feb 20 07:30:10 np0005625204.localdomain sshd[24950]: Disconnecting invalid user yealink 185.246.128.171 port 26129: Change of username or service not allowed: (yealink,ssh-connection) -> (rafael,ssh-connection) [preauth]
Feb 20 07:30:11 np0005625204.localdomain sshd[24953]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:15 np0005625204.localdomain sshd[24953]: Invalid user rafael from 185.246.128.171 port 49135
Feb 20 07:30:15 np0005625204.localdomain sshd[24953]: Disconnecting invalid user rafael 185.246.128.171 port 49135: Change of username or service not allowed: (rafael,ssh-connection) -> (bao,ssh-connection) [preauth]
Feb 20 07:30:17 np0005625204.localdomain sshd[24955]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:21 np0005625204.localdomain sshd[24955]: Invalid user bao from 185.246.128.171 port 10840
Feb 20 07:30:21 np0005625204.localdomain sshd[24955]: Disconnecting invalid user bao 185.246.128.171 port 10840: Change of username or service not allowed: (bao,ssh-connection) -> (satisfactory,ssh-connection) [preauth]
Feb 20 07:30:23 np0005625204.localdomain sshd[24957]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:26 np0005625204.localdomain sshd[24957]: Invalid user satisfactory from 185.246.128.171 port 39548
Feb 20 07:30:28 np0005625204.localdomain sshd[24957]: Disconnecting invalid user satisfactory 185.246.128.171 port 39548: Change of username or service not allowed: (satisfactory,ssh-connection) -> (ionguest,ssh-connection) [preauth]
Feb 20 07:30:29 np0005625204.localdomain sshd[24959]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:32 np0005625204.localdomain sshd[24959]: Invalid user ionguest from 185.246.128.171 port 4299
Feb 20 07:30:34 np0005625204.localdomain sshd[24959]: Disconnecting invalid user ionguest 185.246.128.171 port 4299: Change of username or service not allowed: (ionguest,ssh-connection) -> (management,ssh-connection) [preauth]
Feb 20 07:30:34 np0005625204.localdomain sshd[24961]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:37 np0005625204.localdomain sshd[24961]: Invalid user management from 185.246.128.171 port 27320
Feb 20 07:30:38 np0005625204.localdomain sshd[24961]: Disconnecting invalid user management 185.246.128.171 port 27320: Change of username or service not allowed: (management,ssh-connection) -> (huawei,ssh-connection) [preauth]
Feb 20 07:30:41 np0005625204.localdomain sshd[24963]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:44 np0005625204.localdomain sshd[24963]: Invalid user huawei from 185.246.128.171 port 57582
Feb 20 07:30:49 np0005625204.localdomain sshd[24963]: Disconnecting invalid user huawei 185.246.128.171 port 57582: Change of username or service not allowed: (huawei,ssh-connection) -> (charles,ssh-connection) [preauth]
Feb 20 07:30:52 np0005625204.localdomain sshd[24965]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:30:56 np0005625204.localdomain sshd[24965]: Invalid user charles from 185.246.128.171 port 44252
Feb 20 07:31:00 np0005625204.localdomain sshd[24965]: Disconnecting invalid user charles 185.246.128.171 port 44252: Change of username or service not allowed: (charles,ssh-connection) -> (autcom,ssh-connection) [preauth]
Feb 20 07:31:02 np0005625204.localdomain sshd[24968]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:03 np0005625204.localdomain sshd[24968]: Invalid user autcom from 185.246.128.171 port 23184
Feb 20 07:31:04 np0005625204.localdomain sshd[24968]: Disconnecting invalid user autcom 185.246.128.171 port 23184: Change of username or service not allowed: (autcom,ssh-connection) -> (usuario,ssh-connection) [preauth]
Feb 20 07:31:07 np0005625204.localdomain sshd[24970]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:12 np0005625204.localdomain sshd[24970]: Invalid user usuario from 185.246.128.171 port 44717
Feb 20 07:31:16 np0005625204.localdomain sshd[24970]: Disconnecting invalid user usuario 185.246.128.171 port 44717: Change of username or service not allowed: (usuario,ssh-connection) -> (monitor,ssh-connection) [preauth]
Feb 20 07:31:17 np0005625204.localdomain sshd[24972]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:19 np0005625204.localdomain sshd[24972]: Invalid user n8n from 202.165.22.246 port 37892
Feb 20 07:31:19 np0005625204.localdomain sshd[24972]: Received disconnect from 202.165.22.246 port 37892:11: Bye Bye [preauth]
Feb 20 07:31:19 np0005625204.localdomain sshd[24972]: Disconnected from invalid user n8n 202.165.22.246 port 37892 [preauth]
Feb 20 07:31:20 np0005625204.localdomain sshd[24974]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:24 np0005625204.localdomain sshd[24974]: Invalid user monitor from 185.246.128.171 port 38120
Feb 20 07:31:26 np0005625204.localdomain sshd[24974]: Disconnecting invalid user monitor 185.246.128.171 port 38120: Change of username or service not allowed: (monitor,ssh-connection) -> (ftp_test,ssh-connection) [preauth]
Feb 20 07:31:30 np0005625204.localdomain sshd[24976]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:32 np0005625204.localdomain sshd[24976]: Invalid user ftp_test from 185.246.128.171 port 17889
Feb 20 07:31:33 np0005625204.localdomain sshd[24976]: Disconnecting invalid user ftp_test 185.246.128.171 port 17889: Change of username or service not allowed: (ftp_test,ssh-connection) -> (alex,ssh-connection) [preauth]
Feb 20 07:31:36 np0005625204.localdomain sshd[24978]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:40 np0005625204.localdomain sshd[24978]: Invalid user alex from 185.246.128.171 port 45091
Feb 20 07:31:46 np0005625204.localdomain sshd[24978]: Disconnecting invalid user alex 185.246.128.171 port 45091: Change of username or service not allowed: (alex,ssh-connection) -> (user123,ssh-connection) [preauth]
Feb 20 07:31:48 np0005625204.localdomain sshd[24980]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:51 np0005625204.localdomain sshd[24980]: Invalid user user123 from 185.246.128.171 port 34663
Feb 20 07:31:52 np0005625204.localdomain sshd[24980]: Disconnecting invalid user user123 185.246.128.171 port 34663: Change of username or service not allowed: (user123,ssh-connection) -> (User,ssh-connection) [preauth]
Feb 20 07:31:54 np0005625204.localdomain sshd[24982]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:31:57 np0005625204.localdomain sshd[24982]: Invalid user User from 185.246.128.171 port 60336
Feb 20 07:32:01 np0005625204.localdomain sshd[24984]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:02 np0005625204.localdomain sshd[24982]: Disconnecting invalid user User 185.246.128.171 port 60336: Change of username or service not allowed: (User,ssh-connection) -> (share,ssh-connection) [preauth]
Feb 20 07:32:02 np0005625204.localdomain sshd[24984]: Invalid user n8n from 151.252.84.225 port 35236
Feb 20 07:32:02 np0005625204.localdomain sshd[24984]: Received disconnect from 151.252.84.225 port 35236:11: Bye Bye [preauth]
Feb 20 07:32:02 np0005625204.localdomain sshd[24984]: Disconnected from invalid user n8n 151.252.84.225 port 35236 [preauth]
Feb 20 07:32:03 np0005625204.localdomain sshd[24986]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:05 np0005625204.localdomain sshd[24986]: Invalid user share from 185.246.128.171 port 35965
Feb 20 07:32:06 np0005625204.localdomain sshd[24986]: Disconnecting invalid user share 185.246.128.171 port 35965: Change of username or service not allowed: (share,ssh-connection) -> (deployuser,ssh-connection) [preauth]
Feb 20 07:32:08 np0005625204.localdomain sshd[24988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:12 np0005625204.localdomain sshd[24988]: Invalid user deployuser from 185.246.128.171 port 57654
Feb 20 07:32:14 np0005625204.localdomain sshd[24988]: Disconnecting invalid user deployuser 185.246.128.171 port 57654: Change of username or service not allowed: (deployuser,ssh-connection) -> (test1,ssh-connection) [preauth]
Feb 20 07:32:17 np0005625204.localdomain sshd[24990]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:20 np0005625204.localdomain sshd[24990]: Invalid user test1 from 185.246.128.171 port 30454
Feb 20 07:32:23 np0005625204.localdomain sshd[24990]: Disconnecting invalid user test1 185.246.128.171 port 30454: Change of username or service not allowed: (test1,ssh-connection) -> (lucas,ssh-connection) [preauth]
Feb 20 07:32:24 np0005625204.localdomain sshd[24992]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:26 np0005625204.localdomain sshd[24992]: Invalid user lucas from 185.246.128.171 port 60658
Feb 20 07:32:26 np0005625204.localdomain sshd[24992]: Disconnecting invalid user lucas 185.246.128.171 port 60658: Change of username or service not allowed: (lucas,ssh-connection) -> (adriana,ssh-connection) [preauth]
Feb 20 07:32:30 np0005625204.localdomain sshd[24994]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:35 np0005625204.localdomain sshd[24994]: Invalid user adriana from 185.246.128.171 port 18540
Feb 20 07:32:36 np0005625204.localdomain sshd[24994]: Disconnecting invalid user adriana 185.246.128.171 port 18540: Change of username or service not allowed: (adriana,ssh-connection) -> (ubuntuserver,ssh-connection) [preauth]
Feb 20 07:32:40 np0005625204.localdomain sshd[24996]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:44 np0005625204.localdomain sshd[24996]: Invalid user ubuntuserver from 185.246.128.171 port 2103
Feb 20 07:32:45 np0005625204.localdomain sshd[24996]: Disconnecting invalid user ubuntuserver 185.246.128.171 port 2103: Change of username or service not allowed: (ubuntuserver,ssh-connection) -> (qemu,ssh-connection) [preauth]
Feb 20 07:32:47 np0005625204.localdomain sshd[24998]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:53 np0005625204.localdomain sshd[24998]: Invalid user qemu from 185.246.128.171 port 32831
Feb 20 07:32:53 np0005625204.localdomain sshd[24998]: Disconnecting invalid user qemu 185.246.128.171 port 32831: Change of username or service not allowed: (qemu,ssh-connection) -> (vps,ssh-connection) [preauth]
Feb 20 07:32:56 np0005625204.localdomain sshd[25000]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:57 np0005625204.localdomain sshd[25001]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:32:57 np0005625204.localdomain sshd[25001]: Invalid user user from 189.143.72.189 port 51438
Feb 20 07:32:58 np0005625204.localdomain sshd[25001]: Received disconnect from 189.143.72.189 port 51438:11: Bye Bye [preauth]
Feb 20 07:32:58 np0005625204.localdomain sshd[25001]: Disconnected from invalid user user 189.143.72.189 port 51438 [preauth]
Feb 20 07:32:59 np0005625204.localdomain sshd[25000]: Invalid user vps from 185.246.128.171 port 8322
Feb 20 07:33:00 np0005625204.localdomain sshd[25000]: Disconnecting invalid user vps 185.246.128.171 port 8322: Change of username or service not allowed: (vps,ssh-connection) -> (deploy,ssh-connection) [preauth]
Feb 20 07:33:03 np0005625204.localdomain sshd[25004]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:08 np0005625204.localdomain sshd[25004]: Invalid user deploy from 185.246.128.171 port 40506
Feb 20 07:33:09 np0005625204.localdomain sshd[25004]: Disconnecting invalid user deploy 185.246.128.171 port 40506: Change of username or service not allowed: (deploy,ssh-connection) -> (db2admin,ssh-connection) [preauth]
Feb 20 07:33:12 np0005625204.localdomain sshd[25006]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:15 np0005625204.localdomain sshd[25006]: Invalid user db2admin from 185.246.128.171 port 12557
Feb 20 07:33:16 np0005625204.localdomain sshd[25006]: Disconnecting invalid user db2admin 185.246.128.171 port 12557: Change of username or service not allowed: (db2admin,ssh-connection) -> (mary,ssh-connection) [preauth]
Feb 20 07:33:20 np0005625204.localdomain sshd[25008]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:25 np0005625204.localdomain sshd[25008]: Invalid user mary from 185.246.128.171 port 47527
Feb 20 07:33:26 np0005625204.localdomain sshd[25008]: Disconnecting invalid user mary 185.246.128.171 port 47527: Change of username or service not allowed: (mary,ssh-connection) -> (sapadm,ssh-connection) [preauth]
Feb 20 07:33:29 np0005625204.localdomain sshd[25010]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:31 np0005625204.localdomain sshd[25010]: Invalid user sapadm from 185.246.128.171 port 26520
Feb 20 07:33:33 np0005625204.localdomain sshd[25010]: Disconnecting invalid user sapadm 185.246.128.171 port 26520: Change of username or service not allowed: (sapadm,ssh-connection) -> (Y,ssh-connection) [preauth]
Feb 20 07:33:34 np0005625204.localdomain sshd[25012]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:37 np0005625204.localdomain sshd[25012]: Invalid user Y from 185.246.128.171 port 45251
Feb 20 07:33:39 np0005625204.localdomain sshd[25012]: Disconnecting invalid user Y 185.246.128.171 port 45251: Change of username or service not allowed: (Y,ssh-connection) -> (asterisk,ssh-connection) [preauth]
Feb 20 07:33:42 np0005625204.localdomain sshd[25014]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:45 np0005625204.localdomain sshd[25014]: Invalid user asterisk from 185.246.128.171 port 16246
Feb 20 07:33:45 np0005625204.localdomain sshd[25014]: Disconnecting invalid user asterisk 185.246.128.171 port 16246: Change of username or service not allowed: (asterisk,ssh-connection) -> (vpnuser,ssh-connection) [preauth]
Feb 20 07:33:49 np0005625204.localdomain sshd[25016]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:33:53 np0005625204.localdomain sshd[25016]: Invalid user vpnuser from 185.246.128.171 port 46754
Feb 20 07:33:53 np0005625204.localdomain sshd[25016]: Disconnecting invalid user vpnuser 185.246.128.171 port 46754: Change of username or service not allowed: (vpnuser,ssh-connection) -> (esadmin,ssh-connection) [preauth]
Feb 20 07:33:57 np0005625204.localdomain sshd[25018]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:01 np0005625204.localdomain sshd[25018]: Invalid user esadmin from 185.246.128.171 port 16987
Feb 20 07:34:02 np0005625204.localdomain sshd[25018]: Disconnecting invalid user esadmin 185.246.128.171 port 16987: Change of username or service not allowed: (esadmin,ssh-connection) -> (xd,ssh-connection) [preauth]
Feb 20 07:34:06 np0005625204.localdomain sshd[25020]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:11 np0005625204.localdomain sshd[25020]: Invalid user xd from 185.246.128.171 port 58694
Feb 20 07:34:12 np0005625204.localdomain sshd[25020]: Disconnecting invalid user xd 185.246.128.171 port 58694: Change of username or service not allowed: (xd,ssh-connection) -> (docker,ssh-connection) [preauth]
Feb 20 07:34:14 np0005625204.localdomain sshd[25022]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:17 np0005625204.localdomain sshd[25022]: Invalid user docker from 185.246.128.171 port 31966
Feb 20 07:34:20 np0005625204.localdomain sshd[25022]: Disconnecting invalid user docker 185.246.128.171 port 31966: Change of username or service not allowed: (docker,ssh-connection) -> (dasusr1,ssh-connection) [preauth]
Feb 20 07:34:23 np0005625204.localdomain sshd[25024]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:27 np0005625204.localdomain sshd[25024]: Invalid user dasusr1 from 185.246.128.171 port 7325
Feb 20 07:34:27 np0005625204.localdomain sshd[25024]: Disconnecting invalid user dasusr1 185.246.128.171 port 7325: Change of username or service not allowed: (dasusr1,ssh-connection) -> (ftp1,ssh-connection) [preauth]
Feb 20 07:34:30 np0005625204.localdomain sshd[25026]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:32 np0005625204.localdomain sshd[25026]: Invalid user ftp1 from 185.246.128.171 port 35314
Feb 20 07:34:35 np0005625204.localdomain sshd[25026]: Disconnecting invalid user ftp1 185.246.128.171 port 35314: Change of username or service not allowed: (ftp1,ssh-connection) -> (peertube,ssh-connection) [preauth]
Feb 20 07:34:38 np0005625204.localdomain sshd[25028]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:44 np0005625204.localdomain sshd[25028]: Invalid user peertube from 185.246.128.171 port 8050
Feb 20 07:34:45 np0005625204.localdomain sshd[25028]: Disconnecting invalid user peertube 185.246.128.171 port 8050: Change of username or service not allowed: (peertube,ssh-connection) -> (user4,ssh-connection) [preauth]
Feb 20 07:34:47 np0005625204.localdomain sshd[25030]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:51 np0005625204.localdomain sshd[25030]: Invalid user user4 from 185.246.128.171 port 49329
Feb 20 07:34:52 np0005625204.localdomain sshd[25034]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:52 np0005625204.localdomain sshd[25030]: Disconnecting invalid user user4 185.246.128.171 port 49329: Change of username or service not allowed: (user4,ssh-connection) -> (frappe,ssh-connection) [preauth]
Feb 20 07:34:53 np0005625204.localdomain sshd[25034]: Received disconnect from 151.252.84.225 port 33752:11: Bye Bye [preauth]
Feb 20 07:34:53 np0005625204.localdomain sshd[25034]: Disconnected from authenticating user root 151.252.84.225 port 33752 [preauth]
Feb 20 07:34:54 np0005625204.localdomain sshd[25036]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:55 np0005625204.localdomain sshd[25036]: Invalid user systemd from 202.165.22.246 port 44548
Feb 20 07:34:55 np0005625204.localdomain sshd[25038]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:34:55 np0005625204.localdomain sshd[25036]: Received disconnect from 202.165.22.246 port 44548:11: Bye Bye [preauth]
Feb 20 07:34:55 np0005625204.localdomain sshd[25036]: Disconnected from invalid user systemd 202.165.22.246 port 44548 [preauth]
Feb 20 07:35:01 np0005625204.localdomain anacron[19093]: Job `cron.weekly' started
Feb 20 07:35:01 np0005625204.localdomain anacron[19093]: Job `cron.weekly' terminated
Feb 20 07:35:02 np0005625204.localdomain sshd[25038]: Invalid user frappe from 185.246.128.171 port 22219
Feb 20 07:35:02 np0005625204.localdomain sshd[25042]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:03 np0005625204.localdomain sshd[25042]: Accepted publickey for zuul from 192.168.122.100 port 34020 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:35:03 np0005625204.localdomain systemd-logind[759]: New session 14 of user zuul.
Feb 20 07:35:03 np0005625204.localdomain systemd[1]: Started Session 14 of User zuul.
Feb 20 07:35:03 np0005625204.localdomain sshd[25042]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 07:35:03 np0005625204.localdomain sudo[25088]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmujdwrzatiombdxozchkalglcxawkfm ; /usr/bin/python3
Feb 20 07:35:03 np0005625204.localdomain sudo[25088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:03 np0005625204.localdomain python3[25090]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 07:35:04 np0005625204.localdomain sudo[25088]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:05 np0005625204.localdomain sudo[25175]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmyqrkxumsmwdeazifgzqvpnnqxzzfzs ; /usr/bin/python3
Feb 20 07:35:05 np0005625204.localdomain sudo[25175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:05 np0005625204.localdomain python3[25177]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:35:06 np0005625204.localdomain sshd[25038]: Disconnecting invalid user frappe 185.246.128.171 port 22219: Change of username or service not allowed: (frappe,ssh-connection) -> (adib,ssh-connection) [preauth]
Feb 20 07:35:08 np0005625204.localdomain sudo[25175]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:08 np0005625204.localdomain sudo[25192]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sulfjrwwsprazibfqyqkledlyzrxgthe ; /usr/bin/python3
Feb 20 07:35:08 np0005625204.localdomain sudo[25192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:08 np0005625204.localdomain python3[25194]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:35:08 np0005625204.localdomain sudo[25192]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:09 np0005625204.localdomain sudo[25208]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nozjtptxziuwrqdwqnmsksvyccienxuz ; /usr/bin/python3
Feb 20 07:35:09 np0005625204.localdomain sudo[25208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:09 np0005625204.localdomain python3[25210]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:09 np0005625204.localdomain kernel: loop: module loaded
Feb 20 07:35:09 np0005625204.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Feb 20 07:35:09 np0005625204.localdomain sudo[25208]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:09 np0005625204.localdomain sshd[25220]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:09 np0005625204.localdomain sudo[25234]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chujmntxdqoryaudlqrzboriitnkajyo ; /usr/bin/python3
Feb 20 07:35:09 np0005625204.localdomain sudo[25234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:09 np0005625204.localdomain python3[25236]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:09 np0005625204.localdomain lvm[25239]: PV /dev/loop3 not used.
Feb 20 07:35:10 np0005625204.localdomain lvm[25248]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 07:35:10 np0005625204.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb 20 07:35:10 np0005625204.localdomain lvm[25250]:   1 logical volume(s) in volume group "ceph_vg0" now active
Feb 20 07:35:10 np0005625204.localdomain sudo[25234]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:10 np0005625204.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb 20 07:35:10 np0005625204.localdomain sudo[25296]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otcpndtgqlyogqjurifnjgpbdunsfozn ; /usr/bin/python3
Feb 20 07:35:10 np0005625204.localdomain sudo[25296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:10 np0005625204.localdomain python3[25298]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:35:10 np0005625204.localdomain sudo[25296]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:11 np0005625204.localdomain sudo[25340]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wstdwmimmugfzfwdwfkiyffhcpvdxblk ; /usr/bin/python3
Feb 20 07:35:11 np0005625204.localdomain sudo[25340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:11 np0005625204.localdomain python3[25342]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572910.4004626-54744-197095392528280/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:11 np0005625204.localdomain sudo[25340]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:11 np0005625204.localdomain sudo[25370]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhjprnusgoffjettgenphnesqslerlvd ; /usr/bin/python3
Feb 20 07:35:11 np0005625204.localdomain sudo[25370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:11 np0005625204.localdomain python3[25372]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:35:11 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:35:12 np0005625204.localdomain systemd-rc-local-generator[25396]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:35:12 np0005625204.localdomain systemd-sysv-generator[25402]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:35:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:35:12 np0005625204.localdomain systemd[1]: Starting Ceph OSD losetup...
Feb 20 07:35:12 np0005625204.localdomain bash[25412]: /dev/loop3: [64516]:9169619 (/var/lib/ceph-osd-0.img)
Feb 20 07:35:12 np0005625204.localdomain systemd[1]: Finished Ceph OSD losetup.
Feb 20 07:35:12 np0005625204.localdomain lvm[25413]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 07:35:12 np0005625204.localdomain lvm[25413]: VG ceph_vg0 finished
Feb 20 07:35:12 np0005625204.localdomain sudo[25370]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:12 np0005625204.localdomain sudo[25428]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-soarctzmorutjdtpvulvwayqnrcxwdmt ; /usr/bin/python3
Feb 20 07:35:12 np0005625204.localdomain sudo[25428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:12 np0005625204.localdomain python3[25430]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:35:13 np0005625204.localdomain sshd[25220]: Invalid user adib from 185.246.128.171 port 21723
Feb 20 07:35:14 np0005625204.localdomain sshd[25220]: Disconnecting invalid user adib 185.246.128.171 port 21723: Change of username or service not allowed: (adib,ssh-connection) -> (backup,ssh-connection) [preauth]
Feb 20 07:35:15 np0005625204.localdomain sudo[25428]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:15 np0005625204.localdomain sudo[25445]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgmnmlnafajrwslsrxmupxfugtreakzk ; /usr/bin/python3
Feb 20 07:35:15 np0005625204.localdomain sudo[25445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:15 np0005625204.localdomain python3[25447]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:35:15 np0005625204.localdomain sudo[25445]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:16 np0005625204.localdomain sudo[25461]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owixjoitmxssnslapcthtaguoiyjqjsd ; /usr/bin/python3
Feb 20 07:35:16 np0005625204.localdomain sudo[25461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:16 np0005625204.localdomain python3[25463]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:16 np0005625204.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Feb 20 07:35:16 np0005625204.localdomain sudo[25461]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:16 np0005625204.localdomain sudo[25483]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpkzahhcyroahelyyjxeqzuyoynkoytm ; /usr/bin/python3
Feb 20 07:35:16 np0005625204.localdomain sudo[25483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:17 np0005625204.localdomain python3[25485]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:17 np0005625204.localdomain lvm[25488]: PV /dev/loop4 not used.
Feb 20 07:35:17 np0005625204.localdomain lvm[25498]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 20 07:35:17 np0005625204.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Feb 20 07:35:17 np0005625204.localdomain lvm[25500]:   1 logical volume(s) in volume group "ceph_vg1" now active
Feb 20 07:35:17 np0005625204.localdomain sudo[25483]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:17 np0005625204.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Feb 20 07:35:17 np0005625204.localdomain sudo[25546]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpuanzxdvthopjfcwmikxtedshzloukh ; /usr/bin/python3
Feb 20 07:35:17 np0005625204.localdomain sudo[25546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:17 np0005625204.localdomain python3[25548]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:35:17 np0005625204.localdomain sudo[25546]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:18 np0005625204.localdomain sudo[25589]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkcekrubchdkhfwnhuvxtlmvjfmssvwp ; /usr/bin/python3
Feb 20 07:35:18 np0005625204.localdomain sudo[25589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:18 np0005625204.localdomain python3[25591]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572917.5701373-54975-86165242145165/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:18 np0005625204.localdomain sudo[25589]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:18 np0005625204.localdomain sudo[25619]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iccvjrkmqnzwsvozkuglfhgrqwcsjgql ; /usr/bin/python3
Feb 20 07:35:18 np0005625204.localdomain sudo[25619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:18 np0005625204.localdomain sshd[25621]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:18 np0005625204.localdomain python3[25622]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:35:18 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:35:19 np0005625204.localdomain systemd-rc-local-generator[25646]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:35:19 np0005625204.localdomain systemd-sysv-generator[25649]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:35:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:35:19 np0005625204.localdomain systemd[1]: Starting Ceph OSD losetup...
Feb 20 07:35:19 np0005625204.localdomain bash[25664]: /dev/loop4: [64516]:9171554 (/var/lib/ceph-osd-1.img)
Feb 20 07:35:19 np0005625204.localdomain systemd[1]: Finished Ceph OSD losetup.
Feb 20 07:35:19 np0005625204.localdomain lvm[25665]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 20 07:35:19 np0005625204.localdomain lvm[25665]: VG ceph_vg1 finished
Feb 20 07:35:19 np0005625204.localdomain sudo[25619]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:21 np0005625204.localdomain sshd[25621]: Invalid user backup from 185.246.128.171 port 63735
Feb 20 07:35:27 np0005625204.localdomain sudo[25709]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaacejhrhtwanveqelqxginecybuytce ; /usr/bin/python3
Feb 20 07:35:27 np0005625204.localdomain sudo[25709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:28 np0005625204.localdomain python3[25711]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 07:35:28 np0005625204.localdomain sudo[25709]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:29 np0005625204.localdomain sudo[25729]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqlqeyavergdrilkxwsgplwcszmcswqk ; /usr/bin/python3
Feb 20 07:35:29 np0005625204.localdomain sudo[25729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:29 np0005625204.localdomain python3[25731]: ansible-hostname Invoked with name=np0005625204.localdomain use=None
Feb 20 07:35:29 np0005625204.localdomain systemd[1]: Starting Hostname Service...
Feb 20 07:35:29 np0005625204.localdomain systemd[1]: Started Hostname Service.
Feb 20 07:35:29 np0005625204.localdomain sudo[25729]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:29 np0005625204.localdomain sshd[25621]: error: maximum authentication attempts exceeded for invalid user backup from 185.246.128.171 port 63735 ssh2 [preauth]
Feb 20 07:35:29 np0005625204.localdomain sshd[25621]: Disconnecting invalid user backup 185.246.128.171 port 63735: Too many authentication failures [preauth]
Feb 20 07:35:31 np0005625204.localdomain sudo[25752]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulwskooppvfflkndsusvczcsapmcklcw ; /usr/bin/python3
Feb 20 07:35:31 np0005625204.localdomain sudo[25752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:31 np0005625204.localdomain python3[25754]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 20 07:35:31 np0005625204.localdomain sudo[25752]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:31 np0005625204.localdomain sudo[25800]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huwejxfixopjbmdxekcqfnahnxpurarj ; /usr/bin/python3
Feb 20 07:35:31 np0005625204.localdomain sudo[25800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:32 np0005625204.localdomain python3[25802]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.2_tj08awtmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:32 np0005625204.localdomain sudo[25800]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:32 np0005625204.localdomain sudo[25830]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqznpvurkvockafjlfdlupiwmmzgyrmn ; /usr/bin/python3
Feb 20 07:35:32 np0005625204.localdomain sudo[25830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:32 np0005625204.localdomain python3[25832]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.2_tj08awtmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:32 np0005625204.localdomain sudo[25830]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:33 np0005625204.localdomain sudo[25846]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-injyvztaygadkkxmpdjxwotqqtmkukjb ; /usr/bin/python3
Feb 20 07:35:33 np0005625204.localdomain sudo[25846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:33 np0005625204.localdomain python3[25848]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.2_tj08awtmphosts insertbefore=BOF block=192.168.122.106 np0005625202.localdomain np0005625202
                                                         192.168.122.106 np0005625202.ctlplane.localdomain np0005625202.ctlplane
                                                         192.168.122.107 np0005625203.localdomain np0005625203
                                                         192.168.122.107 np0005625203.ctlplane.localdomain np0005625203.ctlplane
                                                         192.168.122.108 np0005625204.localdomain np0005625204
                                                         192.168.122.108 np0005625204.ctlplane.localdomain np0005625204.ctlplane
                                                         192.168.122.103 np0005625199.localdomain np0005625199
                                                         192.168.122.103 np0005625199.ctlplane.localdomain np0005625199.ctlplane
                                                         192.168.122.104 np0005625200.localdomain np0005625200
                                                         192.168.122.104 np0005625200.ctlplane.localdomain np0005625200.ctlplane
                                                         192.168.122.105 np0005625201.localdomain np0005625201
                                                         192.168.122.105 np0005625201.ctlplane.localdomain np0005625201.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:33 np0005625204.localdomain sudo[25846]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:33 np0005625204.localdomain sshd[25857]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:33 np0005625204.localdomain sudo[25863]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtsvsgyyvulzubumlhklsrlkqwfsdlnp ; /usr/bin/python3
Feb 20 07:35:33 np0005625204.localdomain sudo[25863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:33 np0005625204.localdomain python3[25865]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.2_tj08awtmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:33 np0005625204.localdomain sudo[25863]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:33 np0005625204.localdomain sudo[25880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckgnmikeenvtorkufpeoyqhcoqgseiuu ; /usr/bin/python3
Feb 20 07:35:33 np0005625204.localdomain sudo[25880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:34 np0005625204.localdomain python3[25882]: ansible-file Invoked with path=/tmp/ansible.2_tj08awtmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:34 np0005625204.localdomain sudo[25880]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:36 np0005625204.localdomain sudo[25897]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrarjrrnezjblwtjlzxmhkydvtqhurfk ; /usr/bin/python3
Feb 20 07:35:36 np0005625204.localdomain sudo[25897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:36 np0005625204.localdomain python3[25899]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:36 np0005625204.localdomain sudo[25897]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:36 np0005625204.localdomain sudo[25915]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nckpbdensprzdmypneesnonzmagspixl ; /usr/bin/python3
Feb 20 07:35:36 np0005625204.localdomain sudo[25915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:37 np0005625204.localdomain python3[25917]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:35:37 np0005625204.localdomain sshd[25857]: Invalid user backup from 185.246.128.171 port 4779
Feb 20 07:35:38 np0005625204.localdomain sshd[25857]: Disconnecting invalid user backup 185.246.128.171 port 4779: Change of username or service not allowed: (backup,ssh-connection) -> (peter,ssh-connection) [preauth]
Feb 20 07:35:39 np0005625204.localdomain sshd[25919]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:39 np0005625204.localdomain sudo[25915]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:40 np0005625204.localdomain sshd[25919]: Invalid user peter from 185.246.128.171 port 30907
Feb 20 07:35:40 np0005625204.localdomain sshd[25919]: Disconnecting invalid user peter 185.246.128.171 port 30907: Change of username or service not allowed: (peter,ssh-connection) -> (abc,ssh-connection) [preauth]
Feb 20 07:35:40 np0005625204.localdomain sudo[25966]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehrsjhmbakxwvkxuztgyaqrjynvukwqo ; /usr/bin/python3
Feb 20 07:35:40 np0005625204.localdomain sudo[25966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:41 np0005625204.localdomain python3[25968]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:35:41 np0005625204.localdomain sudo[25966]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:41 np0005625204.localdomain sshd[25998]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:41 np0005625204.localdomain sudo[26012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhdvlvxxtdauflktkvqemunzjapgwykm ; /usr/bin/python3
Feb 20 07:35:41 np0005625204.localdomain sudo[26012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:41 np0005625204.localdomain python3[26014]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572940.6635659-55771-249721785478797/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:41 np0005625204.localdomain sudo[26012]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:42 np0005625204.localdomain sshd[25998]: Invalid user abc from 185.246.128.171 port 41146
Feb 20 07:35:42 np0005625204.localdomain sudo[26043]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqwimnmwxdbfqijhnesuretygofajiym ; /usr/bin/python3
Feb 20 07:35:42 np0005625204.localdomain sudo[26043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:42 np0005625204.localdomain python3[26045]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:35:42 np0005625204.localdomain sudo[26043]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:43 np0005625204.localdomain sudo[26061]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnhlowgmwmgoryqrbxbxctclcjtwsner ; /usr/bin/python3
Feb 20 07:35:43 np0005625204.localdomain sudo[26061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:43 np0005625204.localdomain python3[26063]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:35:43 np0005625204.localdomain chronyd[765]: chronyd exiting
Feb 20 07:35:43 np0005625204.localdomain systemd[1]: Stopping NTP client/server...
Feb 20 07:35:43 np0005625204.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 20 07:35:43 np0005625204.localdomain systemd[1]: Stopped NTP client/server.
Feb 20 07:35:43 np0005625204.localdomain systemd[1]: chronyd.service: Consumed 113ms CPU time, read 1.9M from disk, written 0B to disk.
Feb 20 07:35:43 np0005625204.localdomain systemd[1]: Starting NTP client/server...
Feb 20 07:35:43 np0005625204.localdomain chronyd[26071]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 20 07:35:43 np0005625204.localdomain chronyd[26071]: Frequency -29.979 +/- 0.262 ppm read from /var/lib/chrony/drift
Feb 20 07:35:43 np0005625204.localdomain chronyd[26071]: Loaded seccomp filter (level 2)
Feb 20 07:35:43 np0005625204.localdomain systemd[1]: Started NTP client/server.
Feb 20 07:35:43 np0005625204.localdomain sudo[26061]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:44 np0005625204.localdomain sudo[26118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpuglhrvohdyjjssyxksuumtcyxdnhmc ; /usr/bin/python3
Feb 20 07:35:44 np0005625204.localdomain sudo[26118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:44 np0005625204.localdomain python3[26120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:35:44 np0005625204.localdomain sudo[26118]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:44 np0005625204.localdomain sudo[26161]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csuotbnkgiaqwakwbppzmkbcqggehiky ; /usr/bin/python3
Feb 20 07:35:44 np0005625204.localdomain sudo[26161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:44 np0005625204.localdomain python3[26163]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771572943.9772875-56003-222424667383480/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:35:44 np0005625204.localdomain sudo[26161]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:44 np0005625204.localdomain sudo[26191]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrzmjtinaxubkgfosvcummswfsqclrfz ; /usr/bin/python3
Feb 20 07:35:44 np0005625204.localdomain sudo[26191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:45 np0005625204.localdomain python3[26193]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:35:45 np0005625204.localdomain systemd-rc-local-generator[26214]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:35:45 np0005625204.localdomain systemd-sysv-generator[26219]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:35:45 np0005625204.localdomain systemd-rc-local-generator[26256]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:35:45 np0005625204.localdomain systemd-sysv-generator[26261]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:35:45 np0005625204.localdomain sshd[25998]: Disconnecting invalid user abc 185.246.128.171 port 41146: Change of username or service not allowed: (abc,ssh-connection) -> (tim,ssh-connection) [preauth]
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: Starting chronyd online sources service...
Feb 20 07:35:45 np0005625204.localdomain chronyc[26269]: 200 OK
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Feb 20 07:35:45 np0005625204.localdomain systemd[1]: Finished chronyd online sources service.
Feb 20 07:35:45 np0005625204.localdomain sudo[26191]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:46 np0005625204.localdomain sudo[26283]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgemxohojfrbtkihbsfzpgjgxvayriwc ; /usr/bin/python3
Feb 20 07:35:46 np0005625204.localdomain sudo[26283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:46 np0005625204.localdomain python3[26285]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:46 np0005625204.localdomain chronyd[26071]: System clock was stepped by 0.000000 seconds
Feb 20 07:35:46 np0005625204.localdomain sudo[26283]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:46 np0005625204.localdomain sudo[26300]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frgyfikdmpohwhfwmlvdpvnqqapkwubl ; /usr/bin/python3
Feb 20 07:35:46 np0005625204.localdomain sudo[26300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:46 np0005625204.localdomain python3[26302]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:35:48 np0005625204.localdomain chronyd[26071]: Selected source 23.133.168.245 (pool.ntp.org)
Feb 20 07:35:49 np0005625204.localdomain sshd[26304]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:51 np0005625204.localdomain sshd[26304]: Invalid user tim from 185.246.128.171 port 16374
Feb 20 07:35:52 np0005625204.localdomain sshd[26304]: Disconnecting invalid user tim 185.246.128.171 port 16374: Change of username or service not allowed: (tim,ssh-connection) -> (openvpn,ssh-connection) [preauth]
Feb 20 07:35:54 np0005625204.localdomain sshd[26306]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:35:56 np0005625204.localdomain sudo[26300]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:57 np0005625204.localdomain sudo[26321]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubzokdpvrysazgcaqrhlrofdwyukqhuh ; /usr/bin/python3
Feb 20 07:35:57 np0005625204.localdomain sudo[26321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:57 np0005625204.localdomain sshd[26306]: Invalid user openvpn from 185.246.128.171 port 37683
Feb 20 07:35:57 np0005625204.localdomain python3[26323]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 20 07:35:57 np0005625204.localdomain systemd[1]: Starting Time & Date Service...
Feb 20 07:35:57 np0005625204.localdomain systemd[1]: Started Time & Date Service.
Feb 20 07:35:57 np0005625204.localdomain sudo[26321]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:58 np0005625204.localdomain sudo[26341]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwuanhwljdxrjzqjbkpuuulccbfpdsil ; /usr/bin/python3
Feb 20 07:35:58 np0005625204.localdomain sudo[26341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:35:58 np0005625204.localdomain sshd[26306]: Disconnecting invalid user openvpn 185.246.128.171 port 37683: Change of username or service not allowed: (openvpn,ssh-connection) -> (yesenia,ssh-connection) [preauth]
Feb 20 07:35:58 np0005625204.localdomain python3[26343]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:35:58 np0005625204.localdomain chronyd[26071]: chronyd exiting
Feb 20 07:35:58 np0005625204.localdomain systemd[1]: Stopping NTP client/server...
Feb 20 07:35:58 np0005625204.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 20 07:35:58 np0005625204.localdomain systemd[1]: Stopped NTP client/server.
Feb 20 07:35:58 np0005625204.localdomain systemd[1]: Starting NTP client/server...
Feb 20 07:35:58 np0005625204.localdomain chronyd[26351]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 20 07:35:58 np0005625204.localdomain chronyd[26351]: Frequency -29.979 +/- 0.262 ppm read from /var/lib/chrony/drift
Feb 20 07:35:58 np0005625204.localdomain chronyd[26351]: Loaded seccomp filter (level 2)
Feb 20 07:35:58 np0005625204.localdomain systemd[1]: Started NTP client/server.
Feb 20 07:35:58 np0005625204.localdomain sudo[26341]: pam_unix(sudo:session): session closed for user root
Feb 20 07:35:59 np0005625204.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 20 07:36:01 np0005625204.localdomain sshd[26356]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:02 np0005625204.localdomain chronyd[26351]: Selected source 216.232.132.102 (pool.ntp.org)
Feb 20 07:36:06 np0005625204.localdomain sshd[26356]: Invalid user yesenia from 185.246.128.171 port 9247
Feb 20 07:36:08 np0005625204.localdomain sshd[26356]: Disconnecting invalid user yesenia 185.246.128.171 port 9247: Change of username or service not allowed: (yesenia,ssh-connection) -> (website,ssh-connection) [preauth]
Feb 20 07:36:13 np0005625204.localdomain sshd[26358]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:15 np0005625204.localdomain sudo[26373]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qecbxlfqxjtysczyvtevlkdvpkafrmab ; /usr/bin/python3
Feb 20 07:36:15 np0005625204.localdomain sudo[26373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:15 np0005625204.localdomain useradd[26377]: new group: name=ceph-admin, GID=1002
Feb 20 07:36:15 np0005625204.localdomain useradd[26377]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Feb 20 07:36:15 np0005625204.localdomain sudo[26373]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:15 np0005625204.localdomain sudo[26429]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmpwgwxbopocvtsrtacvgsmzfqimdxdk ; /usr/bin/python3
Feb 20 07:36:15 np0005625204.localdomain sudo[26429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:15 np0005625204.localdomain sudo[26429]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:16 np0005625204.localdomain sudo[26472]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvpehxonagxlujtwftlfjuliccujqxvn ; /usr/bin/python3
Feb 20 07:36:16 np0005625204.localdomain sudo[26472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:16 np0005625204.localdomain sudo[26472]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:16 np0005625204.localdomain sudo[26502]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkoqbpbayzxyhonzqpoojwakpaaesrqe ; /usr/bin/python3
Feb 20 07:36:16 np0005625204.localdomain sudo[26502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:16 np0005625204.localdomain sshd[26358]: Invalid user website from 185.246.128.171 port 61930
Feb 20 07:36:16 np0005625204.localdomain sudo[26502]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:16 np0005625204.localdomain sudo[26518]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unjmawivyvyfpcteybjjlhvbrnvwecyu ; /usr/bin/python3
Feb 20 07:36:16 np0005625204.localdomain sudo[26518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:17 np0005625204.localdomain sudo[26518]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:17 np0005625204.localdomain sudo[26534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fodonjcvueeajrxbyicrzmdbqbopnvnz ; /usr/bin/python3
Feb 20 07:36:17 np0005625204.localdomain sudo[26534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:17 np0005625204.localdomain sudo[26534]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:17 np0005625204.localdomain sudo[26550]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsyfiuajjafqyacngdllczfucxqbmdmp ; /usr/bin/python3
Feb 20 07:36:17 np0005625204.localdomain sudo[26550]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:36:17 np0005625204.localdomain sshd[26358]: Disconnecting invalid user website 185.246.128.171 port 61930: Change of username or service not allowed: (website,ssh-connection) -> (tunnel,ssh-connection) [preauth]
Feb 20 07:36:18 np0005625204.localdomain sudo[26550]: pam_unix(sudo:session): session closed for user root
Feb 20 07:36:19 np0005625204.localdomain sshd[26553]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:22 np0005625204.localdomain sshd[26553]: Invalid user tunnel from 185.246.128.171 port 28237
Feb 20 07:36:22 np0005625204.localdomain sshd[26553]: Disconnecting invalid user tunnel 185.246.128.171 port 28237: Change of username or service not allowed: (tunnel,ssh-connection) -> (william,ssh-connection) [preauth]
Feb 20 07:36:25 np0005625204.localdomain sshd[26555]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:27 np0005625204.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 20 07:36:30 np0005625204.localdomain sshd[26555]: Invalid user william from 185.246.128.171 port 57385
Feb 20 07:36:33 np0005625204.localdomain sshd[26555]: Disconnecting invalid user william 185.246.128.171 port 57385: Change of username or service not allowed: (william,ssh-connection) -> (mark,ssh-connection) [preauth]
Feb 20 07:36:35 np0005625204.localdomain sshd[26559]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:38 np0005625204.localdomain sshd[26559]: Invalid user mark from 185.246.128.171 port 40208
Feb 20 07:36:40 np0005625204.localdomain sshd[26559]: Disconnecting invalid user mark 185.246.128.171 port 40208: Change of username or service not allowed: (mark,ssh-connection) -> (czr,ssh-connection) [preauth]
Feb 20 07:36:42 np0005625204.localdomain sshd[26561]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:48 np0005625204.localdomain sshd[26561]: Invalid user czr from 185.246.128.171 port 11175
Feb 20 07:36:48 np0005625204.localdomain sshd[26561]: Disconnecting invalid user czr 185.246.128.171 port 11175: Change of username or service not allowed: (czr,ssh-connection) -> (alma,ssh-connection) [preauth]
Feb 20 07:36:51 np0005625204.localdomain sshd[26563]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:52 np0005625204.localdomain sshd[26565]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:36:52 np0005625204.localdomain sshd[26563]: Invalid user dixi from 189.143.72.189 port 34678
Feb 20 07:36:52 np0005625204.localdomain sshd[26563]: Received disconnect from 189.143.72.189 port 34678:11: Bye Bye [preauth]
Feb 20 07:36:52 np0005625204.localdomain sshd[26563]: Disconnected from invalid user dixi 189.143.72.189 port 34678 [preauth]
Feb 20 07:36:57 np0005625204.localdomain sshd[26565]: Invalid user alma from 185.246.128.171 port 59396
Feb 20 07:37:02 np0005625204.localdomain sshd[26565]: Disconnecting invalid user alma 185.246.128.171 port 59396: Change of username or service not allowed: (alma,ssh-connection) -> (redis,ssh-connection) [preauth]
Feb 20 07:37:04 np0005625204.localdomain sshd[26567]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:10 np0005625204.localdomain sshd[26567]: Invalid user redis from 185.246.128.171 port 56294
Feb 20 07:37:11 np0005625204.localdomain sshd[26567]: Disconnecting invalid user redis 185.246.128.171 port 56294: Change of username or service not allowed: (redis,ssh-connection) -> (esroot,ssh-connection) [preauth]
Feb 20 07:37:15 np0005625204.localdomain sshd[26569]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:18 np0005625204.localdomain sshd[26569]: Invalid user esroot from 185.246.128.171 port 43779
Feb 20 07:37:19 np0005625204.localdomain sshd[26569]: Disconnecting invalid user esroot 185.246.128.171 port 43779: Change of username or service not allowed: (esroot,ssh-connection) -> (ahmed,ssh-connection) [preauth]
Feb 20 07:37:22 np0005625204.localdomain sshd[26571]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:27 np0005625204.localdomain sshd[26571]: Invalid user ahmed from 185.246.128.171 port 13546
Feb 20 07:37:29 np0005625204.localdomain sshd[26571]: Disconnecting invalid user ahmed 185.246.128.171 port 13546: Change of username or service not allowed: (ahmed,ssh-connection) -> (gits,ssh-connection) [preauth]
Feb 20 07:37:31 np0005625204.localdomain sshd[26573]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:34 np0005625204.localdomain sshd[26575]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:34 np0005625204.localdomain sshd[26573]: Invalid user gits from 185.246.128.171 port 55157
Feb 20 07:37:35 np0005625204.localdomain sshd[26575]: Invalid user n8n from 151.252.84.225 port 42286
Feb 20 07:37:35 np0005625204.localdomain sshd[26575]: Received disconnect from 151.252.84.225 port 42286:11: Bye Bye [preauth]
Feb 20 07:37:35 np0005625204.localdomain sshd[26575]: Disconnected from invalid user n8n 151.252.84.225 port 42286 [preauth]
Feb 20 07:37:36 np0005625204.localdomain sshd[26573]: Disconnecting invalid user gits 185.246.128.171 port 55157: Change of username or service not allowed: (gits,ssh-connection) -> (sftp,ssh-connection) [preauth]
Feb 20 07:37:38 np0005625204.localdomain sshd[26577]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:42 np0005625204.localdomain sshd[26577]: Invalid user sftp from 185.246.128.171 port 26837
Feb 20 07:37:44 np0005625204.localdomain sshd[26579]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:44 np0005625204.localdomain sshd[26579]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:37:45 np0005625204.localdomain sshd[26577]: Disconnecting invalid user sftp 185.246.128.171 port 26837: Change of username or service not allowed: (sftp,ssh-connection) -> (test4,ssh-connection) [preauth]
Feb 20 07:37:50 np0005625204.localdomain sshd[26581]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:37:53 np0005625204.localdomain sshd[26581]: Invalid user test4 from 185.246.128.171 port 22015
Feb 20 07:37:53 np0005625204.localdomain sshd[26581]: Disconnecting invalid user test4 185.246.128.171 port 22015: Change of username or service not allowed: (test4,ssh-connection) -> (raj,ssh-connection) [preauth]
Feb 20 07:37:56 np0005625204.localdomain sshd[26583]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:01 np0005625204.localdomain sshd[26583]: Invalid user raj from 185.246.128.171 port 53470
Feb 20 07:38:01 np0005625204.localdomain sshd[26583]: Disconnecting invalid user raj 185.246.128.171 port 53470: Change of username or service not allowed: (raj,ssh-connection) -> (auditadm,ssh-connection) [preauth]
Feb 20 07:38:02 np0005625204.localdomain sshd[26585]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:03 np0005625204.localdomain sshd[26585]: Invalid user auditadm from 185.246.128.171 port 19433
Feb 20 07:38:04 np0005625204.localdomain sshd[26585]: Disconnecting invalid user auditadm 185.246.128.171 port 19433: Change of username or service not allowed: (auditadm,ssh-connection) -> (t128,ssh-connection) [preauth]
Feb 20 07:38:06 np0005625204.localdomain sshd[26587]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:06 np0005625204.localdomain sshd[26587]: Accepted publickey for ceph-admin from 192.168.122.103 port 44128 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:06 np0005625204.localdomain systemd-logind[759]: New session 15 of user ceph-admin.
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 1002.
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Starting User Manager for UID 1002...
Feb 20 07:38:06 np0005625204.localdomain sshd[26591]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:06 np0005625204.localdomain sshd[26606]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Queued start job for default target Main User Target.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Created slice User Application Slice.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Reached target Paths.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Reached target Timers.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Starting D-Bus User Message Bus Socket...
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Starting Create User's Volatile Files and Directories...
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Listening on D-Bus User Message Bus Socket.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Reached target Sockets.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Finished Create User's Volatile Files and Directories.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Reached target Basic System.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Reached target Main User Target.
Feb 20 07:38:06 np0005625204.localdomain systemd[26592]: Startup finished in 115ms.
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Started User Manager for UID 1002.
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Started Session 15 of User ceph-admin.
Feb 20 07:38:06 np0005625204.localdomain sshd[26587]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:06 np0005625204.localdomain sshd[26606]: Accepted publickey for ceph-admin from 192.168.122.103 port 44140 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:06 np0005625204.localdomain systemd-logind[759]: New session 17 of user ceph-admin.
Feb 20 07:38:06 np0005625204.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Feb 20 07:38:06 np0005625204.localdomain sshd[26606]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:06 np0005625204.localdomain sudo[26613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:06 np0005625204.localdomain sudo[26613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:06 np0005625204.localdomain sudo[26613]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:07 np0005625204.localdomain sshd[26628]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:07 np0005625204.localdomain sshd[26628]: Accepted publickey for ceph-admin from 192.168.122.103 port 44154 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:07 np0005625204.localdomain systemd-logind[759]: New session 18 of user ceph-admin.
Feb 20 07:38:07 np0005625204.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Feb 20 07:38:07 np0005625204.localdomain sshd[26628]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:07 np0005625204.localdomain sudo[26632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host --expect-hostname np0005625204.localdomain
Feb 20 07:38:07 np0005625204.localdomain sudo[26632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:07 np0005625204.localdomain sudo[26632]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:07 np0005625204.localdomain sshd[26647]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:07 np0005625204.localdomain sshd[26647]: Accepted publickey for ceph-admin from 192.168.122.103 port 44160 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:07 np0005625204.localdomain systemd-logind[759]: New session 19 of user ceph-admin.
Feb 20 07:38:07 np0005625204.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Feb 20 07:38:07 np0005625204.localdomain sshd[26647]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:07 np0005625204.localdomain sudo[26652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f
Feb 20 07:38:07 np0005625204.localdomain sudo[26652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:07 np0005625204.localdomain sudo[26652]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:07 np0005625204.localdomain sshd[26667]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:07 np0005625204.localdomain sshd[26667]: Accepted publickey for ceph-admin from 192.168.122.103 port 44164 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:07 np0005625204.localdomain systemd-logind[759]: New session 20 of user ceph-admin.
Feb 20 07:38:07 np0005625204.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Feb 20 07:38:07 np0005625204.localdomain sshd[26667]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:08 np0005625204.localdomain sudo[26671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:38:08 np0005625204.localdomain sudo[26671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:08 np0005625204.localdomain sudo[26671]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:08 np0005625204.localdomain sshd[26686]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:08 np0005625204.localdomain sshd[26686]: Accepted publickey for ceph-admin from 192.168.122.103 port 44166 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:08 np0005625204.localdomain systemd-logind[759]: New session 21 of user ceph-admin.
Feb 20 07:38:08 np0005625204.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Feb 20 07:38:08 np0005625204.localdomain sshd[26686]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:08 np0005625204.localdomain sudo[26690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:38:08 np0005625204.localdomain sudo[26690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:08 np0005625204.localdomain sudo[26690]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:08 np0005625204.localdomain sshd[26705]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:08 np0005625204.localdomain sshd[26705]: Accepted publickey for ceph-admin from 192.168.122.103 port 44170 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:08 np0005625204.localdomain systemd-logind[759]: New session 22 of user ceph-admin.
Feb 20 07:38:08 np0005625204.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Feb 20 07:38:08 np0005625204.localdomain sshd[26705]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:08 np0005625204.localdomain sudo[26709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f.new
Feb 20 07:38:08 np0005625204.localdomain sudo[26709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:08 np0005625204.localdomain sudo[26709]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:09 np0005625204.localdomain sshd[26724]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:09 np0005625204.localdomain sshd[26724]: Accepted publickey for ceph-admin from 192.168.122.103 port 44182 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:09 np0005625204.localdomain systemd-logind[759]: New session 23 of user ceph-admin.
Feb 20 07:38:09 np0005625204.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Feb 20 07:38:09 np0005625204.localdomain sshd[26724]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:09 np0005625204.localdomain sudo[26728]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:38:09 np0005625204.localdomain sudo[26728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:09 np0005625204.localdomain sudo[26728]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:09 np0005625204.localdomain sshd[26743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:09 np0005625204.localdomain sshd[26743]: Accepted publickey for ceph-admin from 192.168.122.103 port 44190 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:09 np0005625204.localdomain systemd-logind[759]: New session 24 of user ceph-admin.
Feb 20 07:38:09 np0005625204.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Feb 20 07:38:09 np0005625204.localdomain sshd[26743]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:09 np0005625204.localdomain sudo[26747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f.new
Feb 20 07:38:09 np0005625204.localdomain sudo[26747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:09 np0005625204.localdomain sudo[26747]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:09 np0005625204.localdomain sshd[26762]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:09 np0005625204.localdomain sshd[26762]: Accepted publickey for ceph-admin from 192.168.122.103 port 55794 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:09 np0005625204.localdomain sshd[26591]: Invalid user t128 from 185.246.128.171 port 37588
Feb 20 07:38:09 np0005625204.localdomain systemd-logind[759]: New session 25 of user ceph-admin.
Feb 20 07:38:09 np0005625204.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Feb 20 07:38:09 np0005625204.localdomain sshd[26762]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:10 np0005625204.localdomain sshd[26779]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:10 np0005625204.localdomain sshd[26779]: Accepted publickey for ceph-admin from 192.168.122.103 port 55800 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:10 np0005625204.localdomain systemd-logind[759]: New session 26 of user ceph-admin.
Feb 20 07:38:10 np0005625204.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Feb 20 07:38:10 np0005625204.localdomain sshd[26779]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:10 np0005625204.localdomain sudo[26783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f
Feb 20 07:38:10 np0005625204.localdomain sudo[26783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:10 np0005625204.localdomain sudo[26783]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:10 np0005625204.localdomain sshd[26798]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:10 np0005625204.localdomain sshd[26798]: Accepted publickey for ceph-admin from 192.168.122.103 port 55806 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 07:38:10 np0005625204.localdomain systemd-logind[759]: New session 27 of user ceph-admin.
Feb 20 07:38:10 np0005625204.localdomain systemd[1]: Started Session 27 of User ceph-admin.
Feb 20 07:38:10 np0005625204.localdomain sshd[26798]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 07:38:10 np0005625204.localdomain sudo[26802]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host --expect-hostname np0005625204.localdomain
Feb 20 07:38:10 np0005625204.localdomain sudo[26802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:11 np0005625204.localdomain sudo[26802]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:11 np0005625204.localdomain sshd[26591]: Disconnecting invalid user t128 185.246.128.171 port 37588: Change of username or service not allowed: (t128,ssh-connection) -> (luis,ssh-connection) [preauth]
Feb 20 07:38:14 np0005625204.localdomain sshd[26837]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:18 np0005625204.localdomain sshd[26837]: Invalid user luis from 185.246.128.171 port 13770
Feb 20 07:38:20 np0005625204.localdomain sshd[26837]: Disconnecting invalid user luis 185.246.128.171 port 13770: Change of username or service not allowed: (luis,ssh-connection) -> (default,ssh-connection) [preauth]
Feb 20 07:38:23 np0005625204.localdomain sshd[26839]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:24 np0005625204.localdomain sudo[26841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:38:24 np0005625204.localdomain sudo[26841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:24 np0005625204.localdomain sudo[26841]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:24 np0005625204.localdomain sudo[26856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:24 np0005625204.localdomain sudo[26856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:24 np0005625204.localdomain sudo[26856]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:24 np0005625204.localdomain sudo[26871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 07:38:24 np0005625204.localdomain sudo[26871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:25 np0005625204.localdomain sudo[26871]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:25 np0005625204.localdomain sudo[26907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:25 np0005625204.localdomain sudo[26907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:25 np0005625204.localdomain sudo[26907]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:25 np0005625204.localdomain sudo[26922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 07:38:25 np0005625204.localdomain sudo[26922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:25 np0005625204.localdomain sudo[26922]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:25 np0005625204.localdomain sudo[26974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:25 np0005625204.localdomain sudo[26974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:25 np0005625204.localdomain sudo[26974]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:25 np0005625204.localdomain sudo[26989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:38:25 np0005625204.localdomain sudo[26989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:26 np0005625204.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 27016 (sysctl)
Feb 20 07:38:26 np0005625204.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 20 07:38:26 np0005625204.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 20 07:38:26 np0005625204.localdomain sudo[26989]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:26 np0005625204.localdomain sudo[27038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:26 np0005625204.localdomain sudo[27038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:26 np0005625204.localdomain sudo[27038]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:26 np0005625204.localdomain sudo[27053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 07:38:26 np0005625204.localdomain sudo[27053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:27 np0005625204.localdomain sudo[27053]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:27 np0005625204.localdomain sshd[27087]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:27 np0005625204.localdomain sudo[27089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:27 np0005625204.localdomain sudo[27089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:27 np0005625204.localdomain sudo[27089]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:27 np0005625204.localdomain sudo[27104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- inventory --format=json-pretty --filter-for-batch
Feb 20 07:38:27 np0005625204.localdomain sudo[27104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:27 np0005625204.localdomain sshd[26839]: Invalid user default from 185.246.128.171 port 57352
Feb 20 07:38:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:28 np0005625204.localdomain sshd[27087]: Invalid user nutanix from 202.165.22.246 port 51198
Feb 20 07:38:28 np0005625204.localdomain sshd[27087]: Received disconnect from 202.165.22.246 port 51198:11: Bye Bye [preauth]
Feb 20 07:38:28 np0005625204.localdomain sshd[27087]: Disconnected from invalid user nutanix 202.165.22.246 port 51198 [preauth]
Feb 20 07:38:31 np0005625204.localdomain kernel: VFS: idmapped mount is not enabled.
Feb 20 07:38:33 np0005625204.localdomain sshd[26839]: error: maximum authentication attempts exceeded for invalid user default from 185.246.128.171 port 57352 ssh2 [preauth]
Feb 20 07:38:33 np0005625204.localdomain sshd[26839]: Disconnecting invalid user default 185.246.128.171 port 57352: Too many authentication failures [preauth]
Feb 20 07:38:36 np0005625204.localdomain sshd[27268]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:36 np0005625204.localdomain sshd[27268]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:38:36 np0005625204.localdomain sshd[27270]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:42 np0005625204.localdomain sshd[27270]: Invalid user default from 185.246.128.171 port 60027
Feb 20 07:38:42 np0005625204.localdomain sshd[27270]: Disconnecting invalid user default 185.246.128.171 port 60027: Change of username or service not allowed: (default,ssh-connection) -> (minima,ssh-connection) [preauth]
Feb 20 07:38:42 np0005625204.localdomain sshd[27285]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:46 np0005625204.localdomain sshd[27285]: Invalid user minima from 185.246.128.171 port 27222
Feb 20 07:38:51 np0005625204.localdomain sshd[27285]: Disconnecting invalid user minima 185.246.128.171 port 27222: Change of username or service not allowed: (minima,ssh-connection) -> (zxcloudsetup,ssh-connection) [preauth]
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 2026-02-20 07:38:27.873420556 +0000 UTC m=+0.038013065 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 2026-02-20 07:38:51.450891786 +0000 UTC m=+23.615484275 container create 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, GIT_BRANCH=main, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.42.2, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Feb 20 07:38:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3672036252-merged.mount: Deactivated successfully.
Feb 20 07:38:51 np0005625204.localdomain systemd[1]: Created slice Slice /machine.
Feb 20 07:38:51 np0005625204.localdomain systemd[1]: Started libpod-conmon-57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab.scope.
Feb 20 07:38:51 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 2026-02-20 07:38:51.560986311 +0000 UTC m=+23.725578830 container init 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git)
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 2026-02-20 07:38:51.575422045 +0000 UTC m=+23.740014554 container start 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, ceph=True, build-date=2026-02-09T10:25:24Z, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.42.2)
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 2026-02-20 07:38:51.575821116 +0000 UTC m=+23.740413665 container attach 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_CLEAN=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, io.buildah.version=1.42.2, distribution-scope=public, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7)
Feb 20 07:38:51 np0005625204.localdomain blissful_curran[27301]: 167 167
Feb 20 07:38:51 np0005625204.localdomain systemd[1]: libpod-57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab.scope: Deactivated successfully.
Feb 20 07:38:51 np0005625204.localdomain podman[27156]: 2026-02-20 07:38:51.579424697 +0000 UTC m=+23.744017216 container died 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Feb 20 07:38:51 np0005625204.localdomain podman[27307]: 2026-02-20 07:38:51.681165198 +0000 UTC m=+0.086372631 container remove 57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_curran, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, ceph=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, version=7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:38:51 np0005625204.localdomain systemd[1]: libpod-conmon-57b244f5e80595dad6eafde058ac4210579ba3f215c0e41a99d3378ae6c484ab.scope: Deactivated successfully.
Feb 20 07:38:51 np0005625204.localdomain podman[27328]: 
Feb 20 07:38:52 np0005625204.localdomain podman[27328]: 2026-02-20 07:38:51.90323746 +0000 UTC m=+0.055926438 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:38:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-625fe6f7ab104b417b2859b195c9cbaa95ba9966850788fccd54bf3f215481ed-merged.mount: Deactivated successfully.
Feb 20 07:38:54 np0005625204.localdomain sshd[27583]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:38:55 np0005625204.localdomain podman[27328]: 2026-02-20 07:38:55.251660996 +0000 UTC m=+3.404349994 container create d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, ceph=True, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main)
Feb 20 07:38:55 np0005625204.localdomain systemd[1]: Started libpod-conmon-d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead.scope.
Feb 20 07:38:55 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:38:55 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d86875042f7cf3862c6ab37bebedd3f25b67e045a4a5c79e56937af643f90b0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:38:55 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4d86875042f7cf3862c6ab37bebedd3f25b67e045a4a5c79e56937af643f90b0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:38:55 np0005625204.localdomain podman[27328]: 2026-02-20 07:38:55.344562098 +0000 UTC m=+3.497251096 container init d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, description=Red Hat Ceph Storage 7)
Feb 20 07:38:55 np0005625204.localdomain podman[27328]: 2026-02-20 07:38:55.360114205 +0000 UTC m=+3.512803203 container start d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1770267347, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 07:38:55 np0005625204.localdomain podman[27328]: 2026-02-20 07:38:55.360428303 +0000 UTC m=+3.513117351 container attach d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, name=rhceph, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]: [
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:     {
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "available": false,
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "ceph_device": false,
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "lsm_data": {},
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "lvs": [],
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "path": "/dev/sr0",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "rejected_reasons": [
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "Insufficient space (<5GB)",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "Has a FileSystem"
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         ],
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         "sys_api": {
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "actuators": null,
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "device_nodes": "sr0",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "human_readable_size": "482.00 KB",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "id_bus": "ata",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "model": "QEMU DVD-ROM",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "nr_requests": "2",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "partitions": {},
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "path": "/dev/sr0",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "removable": "1",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "rev": "2.5+",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "ro": "0",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "rotational": "1",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "sas_address": "",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "sas_device_handle": "",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "scheduler_mode": "mq-deadline",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "sectors": 0,
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "sectorsize": "2048",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "size": 493568.0,
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "support_discard": "0",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "type": "disk",
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:             "vendor": "QEMU"
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:         }
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]:     }
Feb 20 07:38:56 np0005625204.localdomain gracious_ramanujan[27586]: ]
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: libpod-d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead.scope: Deactivated successfully.
Feb 20 07:38:56 np0005625204.localdomain podman[27328]: 2026-02-20 07:38:56.116309462 +0000 UTC m=+4.268998490 container died d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7)
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4d86875042f7cf3862c6ab37bebedd3f25b67e045a4a5c79e56937af643f90b0-merged.mount: Deactivated successfully.
Feb 20 07:38:56 np0005625204.localdomain podman[28972]: 2026-02-20 07:38:56.178336449 +0000 UTC m=+0.056506784 container remove d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gracious_ramanujan, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.expose-services=, name=rhceph, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: libpod-conmon-d65910bc66951be51b6e861b7cb62545364ddcca7db3f179877e4132278e7ead.scope: Deactivated successfully.
Feb 20 07:38:56 np0005625204.localdomain sudo[27104]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:56 np0005625204.localdomain sudo[28987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:38:56 np0005625204.localdomain sudo[28987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:56 np0005625204.localdomain sudo[28987]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:56 np0005625204.localdomain sudo[29002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 _orch set-coredump-overrides --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 --coredump-max-size=32G
Feb 20 07:38:56 np0005625204.localdomain sudo[29002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: Closed Process Core Dump Socket.
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: Stopping Process Core Dump Socket...
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: Listening on Process Core Dump Socket.
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:38:56 np0005625204.localdomain systemd-sysv-generator[29057]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:38:56 np0005625204.localdomain systemd-rc-local-generator[29052]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:38:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:38:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:38:57 np0005625204.localdomain systemd-rc-local-generator[29092]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:38:57 np0005625204.localdomain systemd-sysv-generator[29098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:38:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:38:57 np0005625204.localdomain sudo[29002]: pam_unix(sudo:session): session closed for user root
Feb 20 07:38:58 np0005625204.localdomain sshd[27583]: Invalid user zxcloudsetup from 185.246.128.171 port 20462
Feb 20 07:38:58 np0005625204.localdomain sshd[27583]: Disconnecting invalid user zxcloudsetup 185.246.128.171 port 20462: Change of username or service not allowed: (zxcloudsetup,ssh-connection) -> (rahul,ssh-connection) [preauth]
Feb 20 07:39:01 np0005625204.localdomain sshd[29104]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:03 np0005625204.localdomain sshd[29104]: Invalid user rahul from 185.246.128.171 port 52456
Feb 20 07:39:03 np0005625204.localdomain sshd[29104]: Disconnecting invalid user rahul 185.246.128.171 port 52456: Change of username or service not allowed: (rahul,ssh-connection) -> (supervisor,ssh-connection) [preauth]
Feb 20 07:39:06 np0005625204.localdomain sshd[29106]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:09 np0005625204.localdomain sshd[29106]: Invalid user supervisor from 185.246.128.171 port 15034
Feb 20 07:39:09 np0005625204.localdomain sshd[29106]: Disconnecting invalid user supervisor 185.246.128.171 port 15034: Change of username or service not allowed: (supervisor,ssh-connection) -> (su,ssh-connection) [preauth]
Feb 20 07:39:13 np0005625204.localdomain sshd[29108]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:15 np0005625204.localdomain sshd[29108]: Invalid user su from 185.246.128.171 port 49133
Feb 20 07:39:16 np0005625204.localdomain sshd[29108]: Disconnecting invalid user su 185.246.128.171 port 49133: Change of username or service not allowed: (su,ssh-connection) -> (jack,ssh-connection) [preauth]
Feb 20 07:39:19 np0005625204.localdomain sshd[29110]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:22 np0005625204.localdomain sshd[29110]: Invalid user jack from 185.246.128.171 port 17480
Feb 20 07:39:25 np0005625204.localdomain sudo[29112]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:25 np0005625204.localdomain sudo[29112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:25 np0005625204.localdomain sudo[29112]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:25 np0005625204.localdomain sudo[29127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:39:25 np0005625204.localdomain sudo[29127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:25 np0005625204.localdomain sshd[29155]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:39:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:39:25 np0005625204.localdomain sshd[29155]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:39:25 np0005625204.localdomain podman[29185]: 
Feb 20 07:39:25 np0005625204.localdomain podman[29185]: 2026-02-20 07:39:25.945478043 +0000 UTC m=+0.070302589 container create 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, vcs-type=git, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 07:39:25 np0005625204.localdomain systemd[1]: Started libpod-conmon-787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b.scope.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:26 np0005625204.localdomain podman[29185]: 2026-02-20 07:39:26.015694078 +0000 UTC m=+0.140518644 container init 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main)
Feb 20 07:39:26 np0005625204.localdomain podman[29185]: 2026-02-20 07:39:25.918476923 +0000 UTC m=+0.043301499 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:26 np0005625204.localdomain podman[29185]: 2026-02-20 07:39:26.024995695 +0000 UTC m=+0.149820271 container start 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, distribution-scope=public)
Feb 20 07:39:26 np0005625204.localdomain podman[29185]: 2026-02-20 07:39:26.025253523 +0000 UTC m=+0.150078089 container attach 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, release=1770267347, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public)
Feb 20 07:39:26 np0005625204.localdomain eager_kilby[29200]: 167 167
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: libpod-787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b.scope: Deactivated successfully.
Feb 20 07:39:26 np0005625204.localdomain podman[29185]: 2026-02-20 07:39:26.029664649 +0000 UTC m=+0.154489265 container died 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, release=1770267347, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-02-09T10:25:24Z)
Feb 20 07:39:26 np0005625204.localdomain podman[29205]: 2026-02-20 07:39:26.115189388 +0000 UTC m=+0.076313657 container remove 787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eager_kilby, vendor=Red Hat, Inc., GIT_CLEAN=True, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347)
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: libpod-conmon-787470a7fd0ad21ff86273de5972b7d5a6e40c30d983f48a2fdf3a4bf65dc50b.scope: Deactivated successfully.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:26 np0005625204.localdomain systemd-rc-local-generator[29242]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:26 np0005625204.localdomain systemd-sysv-generator[29247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:26 np0005625204.localdomain systemd-rc-local-generator[29283]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:26 np0005625204.localdomain systemd-sysv-generator[29287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Reached target All Ceph clusters and services.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:26 np0005625204.localdomain systemd-sysv-generator[29326]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:26 np0005625204.localdomain systemd-rc-local-generator[29322]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:26 np0005625204.localdomain sshd[29110]: Disconnecting invalid user jack 185.246.128.171 port 17480: Change of username or service not allowed: (jack,ssh-connection) -> (tazos,ssh-connection) [preauth]
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Reached target Ceph cluster a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 07:39:26 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:26 np0005625204.localdomain systemd-rc-local-generator[29360]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:26 np0005625204.localdomain systemd-sysv-generator[29366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:27 np0005625204.localdomain systemd-rc-local-generator[29404]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:27 np0005625204.localdomain systemd-sysv-generator[29408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: Created slice Slice /system/ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: Reached target System Time Set.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: Reached target System Time Synchronized.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: Starting Ceph crash.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 20 07:39:27 np0005625204.localdomain podman[29464]: 
Feb 20 07:39:27 np0005625204.localdomain podman[29464]: 2026-02-20 07:39:27.726997503 +0000 UTC m=+0.076411991 container create 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 20 07:39:27 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19ace6a7f463760f85db583445ebc2a43a1a1df86c33f2f74f5f1cf9aa4feaca/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:27 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19ace6a7f463760f85db583445ebc2a43a1a1df86c33f2f74f5f1cf9aa4feaca/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:27 np0005625204.localdomain podman[29464]: 2026-02-20 07:39:27.695733542 +0000 UTC m=+0.045148030 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:27 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19ace6a7f463760f85db583445ebc2a43a1a1df86c33f2f74f5f1cf9aa4feaca/merged/etc/ceph/ceph.client.crash.np0005625204.keyring supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:27 np0005625204.localdomain podman[29464]: 2026-02-20 07:39:27.815736358 +0000 UTC m=+0.165150856 container init 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.42.2, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, ceph=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:27 np0005625204.localdomain podman[29464]: 2026-02-20 07:39:27.82759134 +0000 UTC m=+0.177005838 container start 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, build-date=2026-02-09T10:25:24Z, version=7, vcs-type=git, io.buildah.version=1.42.2, name=rhceph, description=Red Hat Ceph Storage 7, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:39:27 np0005625204.localdomain bash[29464]: 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92
Feb 20 07:39:27 np0005625204.localdomain systemd[1]: Started Ceph crash.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 07:39:27 np0005625204.localdomain sudo[29127]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:27 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: INFO:ceph-crash:pinging cluster to exercise our key, trying key client.crash.np0005625204.
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:   cluster:
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     id:     a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     health: HEALTH_WARN
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:             OSD count 0 < osd_pool_default_size 3
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:  
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:   services:
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     mon: 3 daemons, quorum np0005625199,np0005625201,np0005625200 (age 13s)
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     mgr: np0005625199.ileebh(active, since 2m), standbys: np0005625201.mtnyvu, np0005625200.ypbkax
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     osd: 0 osds: 0 up, 0 in
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:  
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:   data:
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     pools:   0 pools, 0 pgs
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     objects: 0 objects, 0 B
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     usage:   0 B used, 0 B / 0 B avail
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     pgs:     
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:  
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:   progress:
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:     Updating crash deployment (+4 -> 6) (8s)
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:       [==============..............] (remaining: 8s)
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]:  
Feb 20 07:39:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204[29478]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Feb 20 07:39:30 np0005625204.localdomain sshd[29506]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:30 np0005625204.localdomain sudo[29507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:30 np0005625204.localdomain sudo[29507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:30 np0005625204.localdomain sudo[29507]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:30 np0005625204.localdomain sudo[29522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Feb 20 07:39:30 np0005625204.localdomain sudo[29522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 2026-02-20 07:39:31.482410246 +0000 UTC m=+0.057528078 container create 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, ceph=True, build-date=2026-02-09T10:25:24Z, distribution-scope=public, release=1770267347, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: Started libpod-conmon-57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64.scope.
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: tmp-crun.60qbEo.mount: Deactivated successfully.
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 2026-02-20 07:39:31.454919499 +0000 UTC m=+0.030037361 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 2026-02-20 07:39:31.559757426 +0000 UTC m=+0.134875258 container init 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, io.buildah.version=1.42.2, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_BRANCH=main, architecture=x86_64, build-date=2026-02-09T10:25:24Z, version=7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 2026-02-20 07:39:31.567278085 +0000 UTC m=+0.142395917 container start 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, release=1770267347, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 2026-02-20 07:39:31.567561114 +0000 UTC m=+0.142678986 container attach 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, GIT_CLEAN=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 20 07:39:31 np0005625204.localdomain xenodochial_khorana[29591]: 167 167
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: libpod-57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64.scope: Deactivated successfully.
Feb 20 07:39:31 np0005625204.localdomain podman[29576]: 2026-02-20 07:39:31.571194663 +0000 UTC m=+0.146312525 container died 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, GIT_CLEAN=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph)
Feb 20 07:39:31 np0005625204.localdomain podman[29596]: 2026-02-20 07:39:31.665736071 +0000 UTC m=+0.080158604 container remove 57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_khorana, name=rhceph, distribution-scope=public, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main)
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: libpod-conmon-57e439aeb5a6a9b11b4e5135636c6ddacd5db2df1717b9a4d1f53c5d68579a64.scope: Deactivated successfully.
Feb 20 07:39:31 np0005625204.localdomain podman[29617]: 
Feb 20 07:39:31 np0005625204.localdomain podman[29617]: 2026-02-20 07:39:31.878905839 +0000 UTC m=+0.079086328 container create f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main)
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: Started libpod-conmon-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope.
Feb 20 07:39:31 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:31 np0005625204.localdomain podman[29617]: 2026-02-20 07:39:31.842185949 +0000 UTC m=+0.042366448 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:31 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:31 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:31 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:31 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:31 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:31 np0005625204.localdomain podman[29617]: 2026-02-20 07:39:31.997045875 +0000 UTC m=+0.197226344 container init f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container)
Feb 20 07:39:32 np0005625204.localdomain podman[29617]: 2026-02-20 07:39:32.007335804 +0000 UTC m=+0.207516293 container start f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, RELEASE=main, version=7, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public)
Feb 20 07:39:32 np0005625204.localdomain podman[29617]: 2026-02-20 07:39:32.007593703 +0000 UTC m=+0.207774192 container attach f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, version=7, description=Red Hat Ceph Storage 7, ceph=True, io.openshift.tags=rhceph ceph, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public)
Feb 20 07:39:32 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> passed data devices: 0 physical, 2 LVM
Feb 20 07:39:32 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> relative data size: 1.0
Feb 20 07:39:32 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 07:39:32 np0005625204.localdomain systemd[1]: tmp-crun.ZYF5rj.mount: Deactivated successfully.
Feb 20 07:39:32 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-12aef165a6b94b5a43546700d80dfe1cf5541d2ba7c6202746b560edbf0420bc-merged.mount: Deactivated successfully.
Feb 20 07:39:32 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 246e60bc-5fa8-45c8-b746-372a7c540a58
Feb 20 07:39:33 np0005625204.localdomain lvm[29688]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 07:39:33 np0005625204.localdomain lvm[29688]: VG ceph_vg0 finished
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]:  stderr: got monmap epoch 3
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> Creating keyring file for osd.0
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Feb 20 07:39:33 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 246e60bc-5fa8-45c8-b746-372a7c540a58 --setuser ceph --setgroup ceph
Feb 20 07:39:34 np0005625204.localdomain sshd[29506]: Invalid user tazos from 185.246.128.171 port 6889
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]:  stderr: 2026-02-20T07:39:33.652+0000 7f0743d42a80 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]:  stderr: 2026-02-20T07:39:33.653+0000 7f0743d42a80 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> ceph-volume lvm activate successful for osd ID: 0
Feb 20 07:39:35 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 1635aa65-16b7-4b42-b3ab-efa9a5fbb750
Feb 20 07:39:36 np0005625204.localdomain lvm[30637]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 20 07:39:36 np0005625204.localdomain lvm[30637]: VG ceph_vg1 finished
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-3
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:36 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-3/activate.monmap
Feb 20 07:39:36 np0005625204.localdomain sshd[29506]: Disconnecting invalid user tazos 185.246.128.171 port 6889: Change of username or service not allowed: (tazos,ssh-connection) -> (p,ssh-connection) [preauth]
Feb 20 07:39:37 np0005625204.localdomain hopeful_chandrasekhar[29632]:  stderr: got monmap epoch 3
Feb 20 07:39:37 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> Creating keyring file for osd.3
Feb 20 07:39:37 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/keyring
Feb 20 07:39:37 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3/
Feb 20 07:39:37 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 3 --monmap /var/lib/ceph/osd/ceph-3/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-3/ --osd-uuid 1635aa65-16b7-4b42-b3ab-efa9a5fbb750 --setuser ceph --setgroup ceph
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]:  stderr: 2026-02-20T07:39:37.240+0000 7f5b6ae59a80 -1 bluestore(/var/lib/ceph/osd/ceph-3//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]:  stderr: 2026-02-20T07:39:37.240+0000 7f5b6ae59a80 -1 bluestore(/var/lib/ceph/osd/ceph-3/) _read_fsid unparsable uuid
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-3 --no-mon-config
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> ceph-volume lvm activate successful for osd ID: 3
Feb 20 07:39:39 np0005625204.localdomain hopeful_chandrasekhar[29632]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Feb 20 07:39:39 np0005625204.localdomain systemd[1]: libpod-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope: Deactivated successfully.
Feb 20 07:39:39 np0005625204.localdomain systemd[1]: libpod-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope: Consumed 3.675s CPU time.
Feb 20 07:39:39 np0005625204.localdomain podman[31555]: 2026-02-20 07:39:39.635213312 +0000 UTC m=+0.038139349 container died f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, version=7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 07:39:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-17f88e85752956434d7f090471dec4595df6aa95d245382b5628371817bf9541-merged.mount: Deactivated successfully.
Feb 20 07:39:39 np0005625204.localdomain podman[31555]: 2026-02-20 07:39:39.669824393 +0000 UTC m=+0.072750390 container remove f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_chandrasekhar, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:39:39 np0005625204.localdomain systemd[1]: libpod-conmon-f21b15117d73faa42d5fd489add5c739f0ce7427f331cb4ae1280ba46e25c80f.scope: Deactivated successfully.
Feb 20 07:39:39 np0005625204.localdomain sudo[29522]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:39 np0005625204.localdomain sudo[31571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:39 np0005625204.localdomain sudo[31571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:39 np0005625204.localdomain sudo[31571]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:39 np0005625204.localdomain sudo[31586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- lvm list --format json
Feb 20 07:39:39 np0005625204.localdomain sudo[31586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:39 np0005625204.localdomain sshd[31601]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 2026-02-20 07:39:40.337371743 +0000 UTC m=+0.107400792 container create 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, release=1770267347, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 2026-02-20 07:39:40.256588489 +0000 UTC m=+0.026617578 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:40 np0005625204.localdomain systemd[1]: Started libpod-conmon-89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289.scope.
Feb 20 07:39:40 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 2026-02-20 07:39:40.402057145 +0000 UTC m=+0.172086204 container init 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 2026-02-20 07:39:40.412044965 +0000 UTC m=+0.182074024 container start 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_BRANCH=main, release=1770267347, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.expose-services=)
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 2026-02-20 07:39:40.412337635 +0000 UTC m=+0.182366684 container attach 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 07:39:40 np0005625204.localdomain determined_franklin[31654]: 167 167
Feb 20 07:39:40 np0005625204.localdomain systemd[1]: libpod-89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289.scope: Deactivated successfully.
Feb 20 07:39:40 np0005625204.localdomain podman[31638]: 2026-02-20 07:39:40.415353394 +0000 UTC m=+0.185382463 container died 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, GIT_CLEAN=True, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=)
Feb 20 07:39:40 np0005625204.localdomain podman[31659]: 2026-02-20 07:39:40.500988248 +0000 UTC m=+0.076916838 container remove 89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_franklin, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z)
Feb 20 07:39:40 np0005625204.localdomain systemd[1]: libpod-conmon-89dd5fe4b6a3eb73224b659b6f6941f9ce44982c11ab8cb1963646a4d7a22289.scope: Deactivated successfully.
Feb 20 07:39:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-88fd5bb677778f9d42b1999f16126bd7fc0445005e779092c3691ea9909bd16f-merged.mount: Deactivated successfully.
Feb 20 07:39:40 np0005625204.localdomain podman[31679]: 
Feb 20 07:39:40 np0005625204.localdomain podman[31679]: 2026-02-20 07:39:40.674851561 +0000 UTC m=+0.038811481 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:40 np0005625204.localdomain podman[31679]: 2026-02-20 07:39:40.989229286 +0000 UTC m=+0.353189166 container create cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2)
Feb 20 07:39:41 np0005625204.localdomain systemd[1]: Started libpod-conmon-cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308.scope.
Feb 20 07:39:41 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:41 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:41 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:41 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:41 np0005625204.localdomain podman[31679]: 2026-02-20 07:39:41.059238894 +0000 UTC m=+0.423198774 container init cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, version=7, RELEASE=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64)
Feb 20 07:39:41 np0005625204.localdomain systemd[1]: tmp-crun.oapkBT.mount: Deactivated successfully.
Feb 20 07:39:41 np0005625204.localdomain podman[31679]: 2026-02-20 07:39:41.07336577 +0000 UTC m=+0.437325660 container start cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main)
Feb 20 07:39:41 np0005625204.localdomain podman[31679]: 2026-02-20 07:39:41.073636549 +0000 UTC m=+0.437596439 container attach cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]: {
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:     "0": [
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:         {
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "devices": [
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "/dev/loop3"
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             ],
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_name": "ceph_lv0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_size": "7511998464",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=z1sb4p-FPG9-bE4e-guP3-EYfP-SVIc-HQbUof,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=a8557ee9-b55d-5519-942c-cf8f6172f1d8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=246e60bc-5fa8-45c8-b746-372a7c540a58,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_uuid": "z1sb4p-FPG9-bE4e-guP3-EYfP-SVIc-HQbUof",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "name": "ceph_lv0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "path": "/dev/ceph_vg0/ceph_lv0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "tags": {
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.block_uuid": "z1sb4p-FPG9-bE4e-guP3-EYfP-SVIc-HQbUof",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.cephx_lockbox_secret": "",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.cluster_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.cluster_name": "ceph",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.crush_device_class": "",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.encrypted": "0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.osd_fsid": "246e60bc-5fa8-45c8-b746-372a7c540a58",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.osd_id": "0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.type": "block",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.vdo": "0"
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             },
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "type": "block",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "vg_name": "ceph_vg0"
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:         }
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:     ],
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:     "3": [
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:         {
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "devices": [
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "/dev/loop4"
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             ],
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_name": "ceph_lv1",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_size": "7511998464",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=ci6Jyp-kDDl-Vyqq-NknI-f3us-8bH1-9NWith,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=a8557ee9-b55d-5519-942c-cf8f6172f1d8,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=1635aa65-16b7-4b42-b3ab-efa9a5fbb750,ceph.osd_id=3,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "lv_uuid": "ci6Jyp-kDDl-Vyqq-NknI-f3us-8bH1-9NWith",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "name": "ceph_lv1",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "path": "/dev/ceph_vg1/ceph_lv1",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "tags": {
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.block_uuid": "ci6Jyp-kDDl-Vyqq-NknI-f3us-8bH1-9NWith",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.cephx_lockbox_secret": "",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.cluster_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.cluster_name": "ceph",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.crush_device_class": "",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.encrypted": "0",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.osd_fsid": "1635aa65-16b7-4b42-b3ab-efa9a5fbb750",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.osd_id": "3",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.type": "block",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:                 "ceph.vdo": "0"
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             },
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "type": "block",
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:             "vg_name": "ceph_vg1"
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:         }
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]:     ]
Feb 20 07:39:41 np0005625204.localdomain laughing_hamilton[31694]: }
Feb 20 07:39:41 np0005625204.localdomain systemd[1]: libpod-cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308.scope: Deactivated successfully.
Feb 20 07:39:41 np0005625204.localdomain podman[31679]: 2026-02-20 07:39:41.40821469 +0000 UTC m=+0.772174630 container died cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:41 np0005625204.localdomain podman[31704]: 2026-02-20 07:39:41.480814674 +0000 UTC m=+0.065773259 container remove cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=laughing_hamilton, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, ceph=True, architecture=x86_64, io.buildah.version=1.42.2)
Feb 20 07:39:41 np0005625204.localdomain systemd[1]: libpod-conmon-cbf611063e11939ed30cb2c91c69730f775eea61a47aa38607a305dc33cb5308.scope: Deactivated successfully.
Feb 20 07:39:41 np0005625204.localdomain sudo[31586]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:41 np0005625204.localdomain sudo[31719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:41 np0005625204.localdomain sudo[31719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:41 np0005625204.localdomain sudo[31719]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-b4de34948a604837cd4aadbee0dab07955e3e1a07e0ead3294f35725c305c7e1-merged.mount: Deactivated successfully.
Feb 20 07:39:41 np0005625204.localdomain sudo[31734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:39:41 np0005625204.localdomain sudo[31734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 2026-02-20 07:39:42.175660825 +0000 UTC m=+0.056614018 container create 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, io.openshift.expose-services=, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_BRANCH=main, vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: Started libpod-conmon-7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049.scope.
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 2026-02-20 07:39:42.234379811 +0000 UTC m=+0.115333014 container init 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 2026-02-20 07:39:42.243306756 +0000 UTC m=+0.124259979 container start 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 2026-02-20 07:39:42.243679698 +0000 UTC m=+0.124632891 container attach 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, ceph=True, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 07:39:42 np0005625204.localdomain youthful_visvesvaraya[31806]: 167 167
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: libpod-7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049.scope: Deactivated successfully.
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 2026-02-20 07:39:42.246983357 +0000 UTC m=+0.127936600 container died 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, ceph=True, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, version=7, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:42 np0005625204.localdomain podman[31791]: 2026-02-20 07:39:42.148961475 +0000 UTC m=+0.029914708 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:42 np0005625204.localdomain podman[31811]: 2026-02-20 07:39:42.335120032 +0000 UTC m=+0.074676582 container remove 7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=youthful_visvesvaraya, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, release=1770267347, vcs-type=git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7)
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: libpod-conmon-7641a1c35183c61d167f39f4e53d9997c5ddce37ca83234dff30e9153236e049.scope: Deactivated successfully.
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: tmp-crun.hw6ubF.mount: Deactivated successfully.
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-9b0dadab9abd311e939ddbceacdf99b9c066f88d556eb5a95407461778d365e2-merged.mount: Deactivated successfully.
Feb 20 07:39:42 np0005625204.localdomain podman[31838]: 
Feb 20 07:39:42 np0005625204.localdomain podman[31838]: 2026-02-20 07:39:42.655764585 +0000 UTC m=+0.071581861 container create 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True)
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: Started libpod-conmon-21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9.scope.
Feb 20 07:39:42 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:42 np0005625204.localdomain podman[31838]: 2026-02-20 07:39:42.627587556 +0000 UTC m=+0.043404842 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:42 np0005625204.localdomain podman[31838]: 2026-02-20 07:39:42.775977839 +0000 UTC m=+0.191795115 container init 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.42.2, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=)
Feb 20 07:39:42 np0005625204.localdomain podman[31838]: 2026-02-20 07:39:42.791518911 +0000 UTC m=+0.207336187 container start 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.expose-services=, vcs-type=git)
Feb 20 07:39:42 np0005625204.localdomain podman[31838]: 2026-02-20 07:39:42.791838391 +0000 UTC m=+0.207655677 container attach 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Feb 20 07:39:42 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test[31853]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 20 07:39:42 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test[31853]:                             [--no-systemd] [--no-tmpfs]
Feb 20 07:39:42 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test[31853]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: libpod-21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9.scope: Deactivated successfully.
Feb 20 07:39:43 np0005625204.localdomain podman[31838]: 2026-02-20 07:39:43.007244684 +0000 UTC m=+0.423061970 container died 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, version=7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 07:39:43 np0005625204.localdomain podman[31858]: 2026-02-20 07:39:43.097400146 +0000 UTC m=+0.077393502 container remove 21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate-test, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2)
Feb 20 07:39:43 np0005625204.localdomain systemd-journald[618]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 20 07:39:43 np0005625204.localdomain systemd-journald[618]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 07:39:43 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: libpod-conmon-21a22e3c6a62085b08ef01db25a28d051f1f2ca79e8c5ff97bacdaad9bd26cc9.scope: Deactivated successfully.
Feb 20 07:39:43 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:43 np0005625204.localdomain systemd-rc-local-generator[31913]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:43 np0005625204.localdomain systemd-sysv-generator[31917]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-a3d3d7040125500c847cce3699e454850f0c4ef5f665ca57e837e1e764793950-merged.mount: Deactivated successfully.
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: tmp-crun.KO4bMA.mount: Deactivated successfully.
Feb 20 07:39:43 np0005625204.localdomain sshd[31601]: Invalid user p from 185.246.128.171 port 55167
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:43 np0005625204.localdomain systemd-rc-local-generator[31954]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:43 np0005625204.localdomain systemd-sysv-generator[31960]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:43 np0005625204.localdomain systemd[1]: Starting Ceph osd.0 for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 07:39:44 np0005625204.localdomain podman[32018]: 
Feb 20 07:39:44 np0005625204.localdomain podman[32018]: 2026-02-20 07:39:44.201573114 +0000 UTC m=+0.068910994 container create b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public, release=1770267347, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git)
Feb 20 07:39:44 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:44 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:44 np0005625204.localdomain podman[32018]: 2026-02-20 07:39:44.173498898 +0000 UTC m=+0.040836778 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:44 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:44 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:44 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:44 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:44 np0005625204.localdomain podman[32018]: 2026-02-20 07:39:44.316216904 +0000 UTC m=+0.183554784 container init b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.expose-services=, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container)
Feb 20 07:39:44 np0005625204.localdomain podman[32018]: 2026-02-20 07:39:44.322428889 +0000 UTC m=+0.189766769 container start b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z)
Feb 20 07:39:44 np0005625204.localdomain podman[32018]: 2026-02-20 07:39:44.322638936 +0000 UTC m=+0.189976816 container attach b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, GIT_CLEAN=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7)
Feb 20 07:39:44 np0005625204.localdomain sshd[31601]: Disconnecting invalid user p 185.246.128.171 port 55167: Change of username or service not allowed: (p,ssh-connection) -> (oozie,ssh-connection) [preauth]
Feb 20 07:39:44 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 07:39:44 np0005625204.localdomain bash[32018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 07:39:44 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 20 07:39:44 np0005625204.localdomain bash[32018]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-0 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 20 07:39:44 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 20 07:39:44 np0005625204.localdomain bash[32018]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 20 07:39:44 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 20 07:39:44 np0005625204.localdomain bash[32018]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 20 07:39:44 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:44 np0005625204.localdomain bash[32018]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:44 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 07:39:44 np0005625204.localdomain bash[32018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Feb 20 07:39:45 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate[32032]: --> ceph-volume raw activate successful for osd ID: 0
Feb 20 07:39:45 np0005625204.localdomain bash[32018]: --> ceph-volume raw activate successful for osd ID: 0
Feb 20 07:39:45 np0005625204.localdomain systemd[1]: libpod-b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0.scope: Deactivated successfully.
Feb 20 07:39:45 np0005625204.localdomain podman[32146]: 2026-02-20 07:39:45.085451467 +0000 UTC m=+0.038125278 container died b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Feb 20 07:39:45 np0005625204.localdomain systemd[1]: tmp-crun.N3mrgR.mount: Deactivated successfully.
Feb 20 07:39:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-46c64cccb981ffcdb8413d7881683fbf4e05579e8ed50df34c073449accefe2c-merged.mount: Deactivated successfully.
Feb 20 07:39:45 np0005625204.localdomain podman[32146]: 2026-02-20 07:39:45.136643954 +0000 UTC m=+0.089317685 container remove b381ac1742a33219024ee5334eb046e612cccb63705acc2fc8cf21f73f66d8b0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0-activate, GIT_BRANCH=main, release=1770267347, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph)
Feb 20 07:39:45 np0005625204.localdomain podman[32207]: 
Feb 20 07:39:45 np0005625204.localdomain podman[32207]: 2026-02-20 07:39:45.433240594 +0000 UTC m=+0.069225533 container create ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0, RELEASE=main, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, version=7, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Feb 20 07:39:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:45 np0005625204.localdomain podman[32207]: 2026-02-20 07:39:45.406152381 +0000 UTC m=+0.042137340 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0b76a8bdff0d38702c817391a0cadb61451aafe81aeec004bd8b4ea381d1c80/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:45 np0005625204.localdomain podman[32207]: 2026-02-20 07:39:45.550885403 +0000 UTC m=+0.186870342 container init ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, name=rhceph, io.buildah.version=1.42.2)
Feb 20 07:39:45 np0005625204.localdomain podman[32207]: 2026-02-20 07:39:45.558825195 +0000 UTC m=+0.194810134 container start ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Feb 20 07:39:45 np0005625204.localdomain bash[32207]: ced4780c50d845341e762bdcc6bd66af77b06dafe5ec206731ae918e72f08b86
Feb 20 07:39:45 np0005625204.localdomain systemd[1]: Started Ceph osd.0 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 20 07:39:45 np0005625204.localdomain sudo[31734]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: pidfile_write: ignore empty --pid-file
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) close
Feb 20 07:39:45 np0005625204.localdomain sudo[32239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:45 np0005625204.localdomain sudo[32239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:45 np0005625204.localdomain sudo[32239]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:45 np0005625204.localdomain sudo[32254]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 07:39:45 np0005625204.localdomain sudo[32254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:45 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) close
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: load: jerasure load: lrc 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) close
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 2026-02-20 07:39:46.34601191 +0000 UTC m=+0.068117157 container create 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, distribution-scope=public, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Feb 20 07:39:46 np0005625204.localdomain systemd[1]: Started libpod-conmon-7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306.scope.
Feb 20 07:39:46 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 2026-02-20 07:39:46.412049838 +0000 UTC m=+0.134155075 container init 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, release=1770267347, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main)
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 2026-02-20 07:39:46.318813343 +0000 UTC m=+0.040918610 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 2026-02-20 07:39:46.42276455 +0000 UTC m=+0.144869787 container start 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, io.buildah.version=1.42.2, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vendor=Red Hat, Inc., GIT_CLEAN=True)
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 2026-02-20 07:39:46.423738983 +0000 UTC m=+0.145844240 container attach 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, build-date=2026-02-09T10:25:24Z, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True)
Feb 20 07:39:46 np0005625204.localdomain funny_carver[32334]: 167 167
Feb 20 07:39:46 np0005625204.localdomain systemd[1]: libpod-7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306.scope: Deactivated successfully.
Feb 20 07:39:46 np0005625204.localdomain podman[32319]: 2026-02-20 07:39:46.426928858 +0000 UTC m=+0.149034125 container died 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, GIT_CLEAN=True)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) close
Feb 20 07:39:46 np0005625204.localdomain podman[32341]: 2026-02-20 07:39:46.510968499 +0000 UTC m=+0.072302395 container remove 7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_carver, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, vcs-type=git, RELEASE=main, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.42.2, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 07:39:46 np0005625204.localdomain systemd[1]: libpod-conmon-7e819f4c5cefa48d4466740f9a94110c3c49153b3dbaf4db50daf21615bef306.scope: Deactivated successfully.
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd80e00 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs mount
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs mount shared_bdev_used = 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: RocksDB version: 7.9.2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Git sha 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: DB SUMMARY
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: DB Session ID:  B1RZZQUR7VWFY9T1SVAY
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: CURRENT file:  CURRENT
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: IDENTITY file:  IDENTITY
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.error_if_exists: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.create_if_missing: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.paranoid_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                     Options.env: 0x55bf8e014c40
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                Options.info_log: 0x55bf8ed1c7c0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_file_opening_threads: 16
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                              Options.statistics: (nil)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.use_fsync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.max_log_file_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.allow_fallocate: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.use_direct_reads: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.create_missing_column_families: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                              Options.db_log_dir: 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                 Options.wal_dir: db.wal
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.advise_random_on_open: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.write_buffer_manager: 0x55bf8dd6a140
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                            Options.rate_limiter: (nil)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.unordered_write: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.row_cache: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                              Options.wal_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.allow_ingest_behind: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.two_write_queues: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.manual_wal_flush: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.wal_compression: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.atomic_flush: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.log_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.allow_data_in_errors: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.db_host_id: __hostname__
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_background_jobs: 4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_background_compactions: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_subcompactions: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.max_open_files: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.bytes_per_sync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.max_background_flushes: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Compression algorithms supported:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kZSTD supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kXpressCompression supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kBZip2Compression supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kLZ4Compression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kZlibCompression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kLZ4HCCompression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kSnappyCompression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1c980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd58850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1cba0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1cba0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed1cba0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 58a199a0-9c9f-484d-9a27-dda744c2ce19
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573186732885, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573186733097, "job": 1, "event": "recovery_finished"}
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: freelist init
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: freelist _read_cfg
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs umount
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) close
Feb 20 07:39:46 np0005625204.localdomain podman[32568]: 2026-02-20 07:39:46.832343046 +0000 UTC m=+0.073770344 container create f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 20 07:39:46 np0005625204.localdomain systemd[1]: Started libpod-conmon-f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477.scope.
Feb 20 07:39:46 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:46 np0005625204.localdomain podman[32568]: 2026-02-20 07:39:46.808642394 +0000 UTC m=+0.050069752 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:46 np0005625204.localdomain podman[32568]: 2026-02-20 07:39:46.960555702 +0000 UTC m=+0.201983030 container init f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, release=1770267347, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=)
Feb 20 07:39:46 np0005625204.localdomain podman[32568]: 2026-02-20 07:39:46.971499113 +0000 UTC m=+0.212926431 container start f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, io.openshift.tags=rhceph ceph, release=1770267347, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.buildah.version=1.42.2, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 07:39:46 np0005625204.localdomain podman[32568]: 2026-02-20 07:39:46.971860675 +0000 UTC m=+0.213288133 container attach f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_CLEAN=True, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vcs-type=git, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bdev(0x55bf8dd81180 /var/lib/ceph/osd/ceph-0/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 7.0 GiB
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs mount
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluefs mount shared_bdev_used = 4718592
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: RocksDB version: 7.9.2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Git sha 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: DB SUMMARY
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: DB Session ID:  B1RZZQUR7VWFY9T1SVAZ
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: CURRENT file:  CURRENT
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: IDENTITY file:  IDENTITY
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.error_if_exists: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.create_if_missing: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.paranoid_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                     Options.env: 0x55bf8ddc4690
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                Options.info_log: 0x55bf8ed883c0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_file_opening_threads: 16
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                              Options.statistics: (nil)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.use_fsync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.max_log_file_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.allow_fallocate: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.use_direct_reads: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.create_missing_column_families: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                              Options.db_log_dir: 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                                 Options.wal_dir: db.wal
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.advise_random_on_open: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.write_buffer_manager: 0x55bf8dd6b5e0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                            Options.rate_limiter: (nil)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.unordered_write: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.row_cache: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                              Options.wal_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.allow_ingest_behind: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.two_write_queues: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.manual_wal_flush: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.wal_compression: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.atomic_flush: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.log_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.allow_data_in_errors: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.db_host_id: __hostname__
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_background_jobs: 4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_background_compactions: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_subcompactions: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.max_open_files: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.bytes_per_sync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.max_background_flushes: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Compression algorithms supported:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kZSTD supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kXpressCompression supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kBZip2Compression supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kLZ4Compression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kZlibCompression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kLZ4HCCompression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         kSnappyCompression supported: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: 
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:46 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed88620)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd582d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed89980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd59610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed89980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd59610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bf8ed89980)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55bf8dd59610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 58a199a0-9c9f-484d-9a27-dda744c2ce19
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187009304, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187015153, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573187, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "58a199a0-9c9f-484d-9a27-dda744c2ce19", "db_session_id": "B1RZZQUR7VWFY9T1SVAZ", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187019626, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573187, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "58a199a0-9c9f-484d-9a27-dda744c2ce19", "db_session_id": "B1RZZQUR7VWFY9T1SVAZ", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187023584, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573187, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "58a199a0-9c9f-484d-9a27-dda744c2ce19", "db_session_id": "B1RZZQUR7VWFY9T1SVAZ", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573187028190, "job": 1, "event": "recovery_finished"}
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bf8de1a700
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: DB pointer 0x55bf8ec73a00
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.6e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: _get_class not permitted to load lua
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: _get_class not permitted to load sdk
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: _get_class not permitted to load test_remote_reads
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 load_pgs
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 load_pgs opened 0 pgs
Feb 20 07:39:47 np0005625204.localdomain ceph-osd[32226]: osd.0 0 log_to_monitors true
Feb 20 07:39:47 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0[32221]: 2026-02-20T07:39:47.067+0000 7fd036886a80 -1 osd.0 0 log_to_monitors true
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-88af1a01f39f8f316dcc9d82ed7d17047230f7f134add5e5b1e51207e3286387-merged.mount: Deactivated successfully.
Feb 20 07:39:47 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test[32583]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 20 07:39:47 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test[32583]:                             [--no-systemd] [--no-tmpfs]
Feb 20 07:39:47 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test[32583]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: libpod-f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477.scope: Deactivated successfully.
Feb 20 07:39:47 np0005625204.localdomain podman[32568]: 2026-02-20 07:39:47.188809639 +0000 UTC m=+0.430236957 container died f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-90dc0c2971019d6c646ad1bada8ad39e2bd373082d0234fed4f8203a76badce5-merged.mount: Deactivated successfully.
Feb 20 07:39:47 np0005625204.localdomain podman[32803]: 2026-02-20 07:39:47.262479178 +0000 UTC m=+0.067979602 container remove f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate-test, ceph=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, version=7)
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: libpod-conmon-f828d04cbbea06ebf94d21ad45ec374cdab17e46dec2409af6e1ede7366c5477.scope: Deactivated successfully.
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:47 np0005625204.localdomain systemd-sysv-generator[32857]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:47 np0005625204.localdomain systemd-rc-local-generator[32850]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:39:47 np0005625204.localdomain systemd-rc-local-generator[32900]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:39:47 np0005625204.localdomain systemd-sysv-generator[32905]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:39:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 20 07:39:48 np0005625204.localdomain systemd[1]: Starting Ceph osd.3 for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 07:39:48 np0005625204.localdomain sshd[32965]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:48 np0005625204.localdomain podman[32959]: 
Feb 20 07:39:48 np0005625204.localdomain podman[32959]: 2026-02-20 07:39:48.461268954 +0000 UTC m=+0.072977997 container create 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 20 07:39:48 np0005625204.localdomain systemd[1]: tmp-crun.QEeBDS.mount: Deactivated successfully.
Feb 20 07:39:48 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:48 np0005625204.localdomain podman[32959]: 2026-02-20 07:39:48.432209607 +0000 UTC m=+0.043918650 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:48 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:48 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:48 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:48 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:48 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:48 np0005625204.localdomain podman[32959]: 2026-02-20 07:39:48.594352083 +0000 UTC m=+0.206061136 container init 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc.)
Feb 20 07:39:48 np0005625204.localdomain podman[32959]: 2026-02-20 07:39:48.603087891 +0000 UTC m=+0.214796944 container start 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, version=7, GIT_CLEAN=True, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, name=rhceph, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public)
Feb 20 07:39:48 np0005625204.localdomain podman[32959]: 2026-02-20 07:39:48.603291167 +0000 UTC m=+0.215000220 container attach 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0 done with init, starting boot process
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0 start_boot
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 20 07:39:48 np0005625204.localdomain ceph-osd[32226]: osd.0 0  bench count 12288000 bsize 4 KiB
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-3 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3
Feb 20 07:39:49 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate[32973]: --> ceph-volume raw activate successful for osd ID: 3
Feb 20 07:39:49 np0005625204.localdomain bash[32959]: --> ceph-volume raw activate successful for osd ID: 3
Feb 20 07:39:49 np0005625204.localdomain systemd[1]: libpod-734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9.scope: Deactivated successfully.
Feb 20 07:39:49 np0005625204.localdomain podman[32959]: 2026-02-20 07:39:49.306229035 +0000 UTC m=+0.917938078 container died 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, vcs-type=git, io.openshift.expose-services=, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7)
Feb 20 07:39:49 np0005625204.localdomain podman[33095]: 2026-02-20 07:39:49.41073929 +0000 UTC m=+0.094895770 container remove 734c5288eeb483d2c32d8c741c144682f81ddea135ae401087305cdc346d7fb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3-activate, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, release=1770267347, io.openshift.expose-services=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, name=rhceph)
Feb 20 07:39:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-73617776f9e5a9cc5d703c1e623c73b48d758ddef93d5b4b722f4efc33c80925-merged.mount: Deactivated successfully.
Feb 20 07:39:49 np0005625204.localdomain podman[33159]: 
Feb 20 07:39:49 np0005625204.localdomain podman[33159]: 2026-02-20 07:39:49.736729299 +0000 UTC m=+0.088112016 container create bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=1770267347, version=7, name=rhceph)
Feb 20 07:39:49 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:49 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:49 np0005625204.localdomain podman[33159]: 2026-02-20 07:39:49.705096916 +0000 UTC m=+0.056479693 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:49 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:49 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:49 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e35ab3f357633a6efd70c4e65e07713639c500d3fb221f683d0eea4ba060100/merged/var/lib/ceph/osd/ceph-3 supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:49 np0005625204.localdomain podman[33159]: 2026-02-20 07:39:49.834082889 +0000 UTC m=+0.185465636 container init bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 07:39:49 np0005625204.localdomain podman[33159]: 2026-02-20 07:39:49.851194713 +0000 UTC m=+0.202577430 container start bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, build-date=2026-02-09T10:25:24Z, name=rhceph, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 07:39:49 np0005625204.localdomain bash[33159]: bdc3228407c0c36f5a92fd5f014341a45c6fa1b30fe114cac39db6f5a19a0d35
Feb 20 07:39:49 np0005625204.localdomain systemd[1]: Started Ceph osd.3 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-osd, pid 2
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: pidfile_write: ignore empty --pid-file
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Feb 20 07:39:49 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 07:39:49 np0005625204.localdomain sudo[32254]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:50 np0005625204.localdomain sudo[33190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:50 np0005625204.localdomain sudo[33190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:50 np0005625204.localdomain sudo[33190]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:50 np0005625204.localdomain sudo[33205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- raw list --format json
Feb 20 07:39:50 np0005625204.localdomain sudo[33205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: starting osd.3 osd_data /var/lib/ceph/osd/ceph-3 /var/lib/ceph/osd/ceph-3/journal
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: load: jerasure load: lrc 
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 2026-02-20 07:39:50.661060136 +0000 UTC m=+0.077328480 container create b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, vcs-type=git)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 07:39:50 np0005625204.localdomain systemd[1]: Started libpod-conmon-b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862.scope.
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 2026-02-20 07:39:50.624725988 +0000 UTC m=+0.040994332 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:50 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 2026-02-20 07:39:50.750289718 +0000 UTC m=+0.166558032 container init b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 07:39:50 np0005625204.localdomain systemd[1]: tmp-crun.otriDV.mount: Deactivated successfully.
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 2026-02-20 07:39:50.76246215 +0000 UTC m=+0.178730474 container start b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1770267347, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7)
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 2026-02-20 07:39:50.762643106 +0000 UTC m=+0.178911500 container attach b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc.)
Feb 20 07:39:50 np0005625204.localdomain systemd[1]: libpod-b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862.scope: Deactivated successfully.
Feb 20 07:39:50 np0005625204.localdomain adoring_benz[33283]: 167 167
Feb 20 07:39:50 np0005625204.localdomain podman[33264]: 2026-02-20 07:39:50.766749871 +0000 UTC m=+0.183018195 container died b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, ceph=True, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, release=1770267347)
Feb 20 07:39:50 np0005625204.localdomain podman[33288]: 2026-02-20 07:39:50.919075844 +0000 UTC m=+0.137617779 container remove b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_benz, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.expose-services=, release=1770267347)
Feb 20 07:39:50 np0005625204.localdomain systemd[1]: libpod-conmon-b07321a0a22abb734f6aa7ad896fbbdb16de080574b455b2d5b72e4f805e9862.scope: Deactivated successfully.
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: osd.3:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6972e00 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluefs mount
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluefs mount shared_bdev_used = 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: RocksDB version: 7.9.2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Git sha 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: DB SUMMARY
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: DB Session ID:  O6R872HUA9PK85WK15LU
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: CURRENT file:  CURRENT
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: IDENTITY file:  IDENTITY
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.error_if_exists: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.create_if_missing: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.paranoid_checks: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                     Options.env: 0x55cba6c06cb0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                Options.info_log: 0x55cba7910d00
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_file_opening_threads: 16
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                              Options.statistics: (nil)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.use_fsync: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.max_log_file_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.allow_fallocate: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.use_direct_reads: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.create_missing_column_families: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                              Options.db_log_dir: 
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                 Options.wal_dir: db.wal
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.advise_random_on_open: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.write_buffer_manager: 0x55cba695c140
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                            Options.rate_limiter: (nil)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.unordered_write: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.row_cache: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                              Options.wal_filter: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.allow_ingest_behind: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.two_write_queues: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.manual_wal_flush: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.wal_compression: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.atomic_flush: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.log_readahead_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.allow_data_in_errors: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.db_host_id: __hostname__
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_background_jobs: 4
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_background_compactions: -1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_subcompactions: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.max_open_files: -1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.bytes_per_sync: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.max_background_flushes: -1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Compression algorithms supported:
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kZSTD supported: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kXpressCompression supported: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kBZip2Compression supported: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kLZ4Compression supported: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kZlibCompression supported: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kLZ4HCCompression supported: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kSnappyCompression supported: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: 
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7910ec0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba79110e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba79110e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba79110e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 06dd77e8-a884-4c07-8af2-d9bd01a9e776
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191000172, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191000358, "job": 1, "event": "recovery_finished"}
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old nid_max 1025
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta old blobid_max 10240
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_super_meta min_alloc_size 0x1000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: freelist init
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: freelist _read_cfg
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluefs umount
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) close
Feb 20 07:39:51 np0005625204.localdomain podman[33503]: 2026-02-20 07:39:51.12702054 +0000 UTC m=+0.076320448 container create a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-02-09T10:25:24Z, name=rhceph, vcs-type=git, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, ceph=True)
Feb 20 07:39:51 np0005625204.localdomain podman[33503]: 2026-02-20 07:39:51.084132725 +0000 UTC m=+0.033432643 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:51 np0005625204.localdomain systemd[1]: Started libpod-conmon-a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892.scope.
Feb 20 07:39:51 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open path /var/lib/ceph/osd/ceph-3/block
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-3/block failed: (22) Invalid argument
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bdev(0x55cba6973180 /var/lib/ceph/osd/ceph-3/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-3/block size 7.0 GiB
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluefs mount
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluefs mount shared_bdev_used = 4718592
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: RocksDB version: 7.9.2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Git sha 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: DB SUMMARY
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: DB Session ID:  O6R872HUA9PK85WK15LV
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: CURRENT file:  CURRENT
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: IDENTITY file:  IDENTITY
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.error_if_exists: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.create_if_missing: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.paranoid_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                     Options.env: 0x55cba6a98690
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                Options.info_log: 0x55cba79288a0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_file_opening_threads: 16
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                              Options.statistics: (nil)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.use_fsync: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.max_log_file_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.allow_fallocate: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.use_direct_reads: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.create_missing_column_families: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                              Options.db_log_dir: 
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                                 Options.wal_dir: db.wal
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.advise_random_on_open: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.write_buffer_manager: 0x55cba695d5e0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                            Options.rate_limiter: (nil)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.unordered_write: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.row_cache: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                              Options.wal_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.allow_ingest_behind: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.two_write_queues: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.manual_wal_flush: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.wal_compression: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.atomic_flush: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.log_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.allow_data_in_errors: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.db_host_id: __hostname__
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_background_jobs: 4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_background_compactions: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_subcompactions: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.max_open_files: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.bytes_per_sync: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.max_background_flushes: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Compression algorithms supported:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kZSTD supported: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kXpressCompression supported: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kBZip2Compression supported: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kLZ4Compression supported: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kZlibCompression supported: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kLZ4HCCompression supported: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         kSnappyCompression supported: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: 
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7928940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694a2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7929920)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7929920)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:           Options.merge_operator: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55cba7929920)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55cba694b610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.write_buffer_size: 16777216
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.max_write_buffer_number: 64
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.compression: LZ4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.num_levels: 7
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.bloom_locality: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                               Options.ttl: 2592000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                       Options.enable_blob_files: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                           Options.min_blob_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 06dd77e8-a884-4c07-8af2-d9bd01a9e776
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191283678, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191297060, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573191, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06dd77e8-a884-4c07-8af2-d9bd01a9e776", "db_session_id": "O6R872HUA9PK85WK15LV", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191309314, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1607, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 466, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573191, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06dd77e8-a884-4c07-8af2-d9bd01a9e776", "db_session_id": "O6R872HUA9PK85WK15LV", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 20 07:39:51 np0005625204.localdomain podman[33503]: 2026-02-20 07:39:51.30991227 +0000 UTC m=+0.259212168 container init a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191315169, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771573191, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "06dd77e8-a884-4c07-8af2-d9bd01a9e776", "db_session_id": "O6R872HUA9PK85WK15LV", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 20 07:39:51 np0005625204.localdomain podman[33503]: 2026-02-20 07:39:51.322513596 +0000 UTC m=+0.271813484 container start a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git)
Feb 20 07:39:51 np0005625204.localdomain podman[33503]: 2026-02-20 07:39:51.325437533 +0000 UTC m=+0.274737421 container attach a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, release=1770267347, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, ceph=True)
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771573191327512, "job": 1, "event": "recovery_finished"}
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55cba69b2700
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: DB pointer 0x55cba786fa00
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super from 4, latest 4
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _upgrade_super done
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.01 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 5e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: _get_class not permitted to load lua
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: _get_class not permitted to load sdk
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: _get_class not permitted to load test_remote_reads
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 load_pgs
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 load_pgs opened 0 pgs
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[33177]: osd.3 0 log_to_monitors true
Feb 20 07:39:51 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3[33173]: 2026-02-20T07:39:51.374+0000 7fcb116d6a80 -1 osd.3 0 log_to_monitors true
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 16.388 iops: 4195.397 elapsed_sec: 0.715
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [WRN] : OSD bench result of 4195.396697 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 0 waiting for initial osdmap
Feb 20 07:39:51 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0[32221]: 2026-02-20T07:39:51.380+0000 7fd032805640 -1 osd.0 0 waiting for initial osdmap
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 check_osdmap_features require_osd_release unknown -> reef
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 set_numa_affinity not setting numa affinity
Feb 20 07:39:51 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-0[32221]: 2026-02-20T07:39:51.395+0000 7fd02de2f640 -1 osd.0 10 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 10 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Feb 20 07:39:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8624e117190d60c41c582dbf41f5a1c10e803c6c47b932fd4c755d9c01a0e2-merged.mount: Deactivated successfully.
Feb 20 07:39:51 np0005625204.localdomain ceph-osd[32226]: osd.0 11 state: booting -> active
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]: {
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:     "1635aa65-16b7-4b42-b3ab-efa9a5fbb750": {
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "ceph_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8",
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "osd_id": 3,
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "osd_uuid": "1635aa65-16b7-4b42-b3ab-efa9a5fbb750",
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "type": "bluestore"
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:     },
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:     "246e60bc-5fa8-45c8-b746-372a7c540a58": {
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "ceph_fsid": "a8557ee9-b55d-5519-942c-cf8f6172f1d8",
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "osd_id": 0,
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "osd_uuid": "246e60bc-5fa8-45c8-b746-372a7c540a58",
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:         "type": "bluestore"
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]:     }
Feb 20 07:39:51 np0005625204.localdomain vibrant_dubinsky[33519]: }
Feb 20 07:39:51 np0005625204.localdomain systemd[1]: libpod-a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892.scope: Deactivated successfully.
Feb 20 07:39:51 np0005625204.localdomain podman[33503]: 2026-02-20 07:39:51.922960094 +0000 UTC m=+0.872259982 container died a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, version=7, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main)
Feb 20 07:39:51 np0005625204.localdomain systemd[1]: tmp-crun.zo0Sm6.mount: Deactivated successfully.
Feb 20 07:39:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-59e5a885b05ebb4140d5a3db78aad18887a69bf3e60bc75b8105f640c4d5cdd6-merged.mount: Deactivated successfully.
Feb 20 07:39:52 np0005625204.localdomain podman[33773]: 2026-02-20 07:39:52.025144893 +0000 UTC m=+0.091303381 container remove a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_dubinsky, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7)
Feb 20 07:39:52 np0005625204.localdomain systemd[1]: libpod-conmon-a59d7c63a810c61f66417d8f176ffe2139e40e729fe6790f0c9b3f844c557892.scope: Deactivated successfully.
Feb 20 07:39:52 np0005625204.localdomain sudo[33205]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 20 07:39:52 np0005625204.localdomain sshd[32965]: Invalid user oozie from 185.246.128.171 port 32533
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0 done with init, starting boot process
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0 start_boot
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 20 07:39:52 np0005625204.localdomain ceph-osd[33177]: osd.3 0  bench count 12288000 bsize 4 KiB
Feb 20 07:39:53 np0005625204.localdomain sshd[32965]: Disconnecting invalid user oozie 185.246.128.171 port 32533: Change of username or service not allowed: (oozie,ssh-connection) -> (caja,ssh-connection) [preauth]
Feb 20 07:39:53 np0005625204.localdomain sudo[33788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:39:53 np0005625204.localdomain sudo[33788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:53 np0005625204.localdomain sudo[33788]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:53 np0005625204.localdomain sudo[33803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:53 np0005625204.localdomain sudo[33803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:53 np0005625204.localdomain sudo[33803]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:53 np0005625204.localdomain sudo[33818]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 07:39:53 np0005625204.localdomain sudo[33818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:53 np0005625204.localdomain ceph-osd[32226]: osd.0 13 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 20 07:39:53 np0005625204.localdomain ceph-osd[32226]: osd.0 13 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Feb 20 07:39:53 np0005625204.localdomain ceph-osd[32226]: osd.0 13 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 20 07:39:54 np0005625204.localdomain podman[33901]: 2026-02-20 07:39:54.036007285 +0000 UTC m=+0.089001776 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, release=1770267347, version=7, CEPH_POINT_RELEASE=)
Feb 20 07:39:54 np0005625204.localdomain podman[33901]: 2026-02-20 07:39:54.138439523 +0000 UTC m=+0.191434044 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=)
Feb 20 07:39:54 np0005625204.localdomain sudo[33818]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:54 np0005625204.localdomain sudo[33969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:54 np0005625204.localdomain sudo[33969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:54 np0005625204.localdomain sudo[33969]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:54 np0005625204.localdomain sudo[33984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:39:54 np0005625204.localdomain sudo[33984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 31.913 iops: 8169.697 elapsed_sec: 0.367
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [WRN] : OSD bench result of 8169.696602 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.3. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 0 waiting for initial osdmap
Feb 20 07:39:55 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3[33173]: 2026-02-20T07:39:55.141+0000 7fcb0de6a640 -1 osd.3 0 waiting for initial osdmap
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 check_osdmap_features require_osd_release unknown -> reef
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 set_numa_affinity not setting numa affinity
Feb 20 07:39:55 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-osd-3[33173]: 2026-02-20T07:39:55.159+0000 7fcb08c7f640 -1 osd.3 14 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 14 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Feb 20 07:39:55 np0005625204.localdomain sudo[33984]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:55 np0005625204.localdomain sudo[34032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:39:55 np0005625204.localdomain sudo[34032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:55 np0005625204.localdomain sudo[34032]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:55 np0005625204.localdomain sudo[34047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- inventory --format=json-pretty --filter-for-batch
Feb 20 07:39:55 np0005625204.localdomain sudo[34047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:55 np0005625204.localdomain sshd[34062]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:39:55 np0005625204.localdomain ceph-osd[33177]: osd.3 15 state: booting -> active
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 2026-02-20 07:39:56.029541606 +0000 UTC m=+0.059467091 container create 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True)
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: Started libpod-conmon-29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264.scope.
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 2026-02-20 07:39:56.101033533 +0000 UTC m=+0.130959018 container init 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vendor=Red Hat, Inc., release=1770267347, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True)
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 2026-02-20 07:39:56.01025445 +0000 UTC m=+0.040179925 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 2026-02-20 07:39:56.111157867 +0000 UTC m=+0.141083362 container start 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vcs-type=git, version=7, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.42.2, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 07:39:56 np0005625204.localdomain wizardly_blackburn[34116]: 167 167
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: libpod-29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264.scope: Deactivated successfully.
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 2026-02-20 07:39:56.111416146 +0000 UTC m=+0.141341681 container attach 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, io.k8s.description=Red Hat Ceph Storage 7, version=7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, GIT_CLEAN=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347)
Feb 20 07:39:56 np0005625204.localdomain podman[34100]: 2026-02-20 07:39:56.118012813 +0000 UTC m=+0.147938298 container died 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-950dc5cbe605b0669c02a8a224b7d97add694902b56109e9ba2a7af24623ea27-merged.mount: Deactivated successfully.
Feb 20 07:39:56 np0005625204.localdomain podman[34121]: 2026-02-20 07:39:56.204460033 +0000 UTC m=+0.074951842 container remove 29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_blackburn, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: libpod-conmon-29f6087a53a1f39c3d9583d6aad1a35904f78c72b3befb2f51bb2a76bc872264.scope: Deactivated successfully.
Feb 20 07:39:56 np0005625204.localdomain podman[34143]: 
Feb 20 07:39:56 np0005625204.localdomain podman[34143]: 2026-02-20 07:39:56.391377147 +0000 UTC m=+0.060435724 container create f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: Started libpod-conmon-f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39.scope.
Feb 20 07:39:56 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:39:56 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:56 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:56 np0005625204.localdomain podman[34143]: 2026-02-20 07:39:56.3614511 +0000 UTC m=+0.030509717 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 07:39:56 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 07:39:56 np0005625204.localdomain podman[34143]: 2026-02-20 07:39:56.477895469 +0000 UTC m=+0.146954096 container init f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, distribution-scope=public, ceph=True, GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=)
Feb 20 07:39:56 np0005625204.localdomain podman[34143]: 2026-02-20 07:39:56.488738887 +0000 UTC m=+0.157797474 container start f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 07:39:56 np0005625204.localdomain podman[34143]: 2026-02-20 07:39:56.489110689 +0000 UTC m=+0.158169286 container attach f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_BRANCH=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.component=rhceph-container)
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]: [
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:     {
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "available": false,
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "ceph_device": false,
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "lsm_data": {},
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "lvs": [],
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "path": "/dev/sr0",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "rejected_reasons": [
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "Has a FileSystem",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "Insufficient space (<5GB)"
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         ],
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         "sys_api": {
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "actuators": null,
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "device_nodes": "sr0",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "human_readable_size": "482.00 KB",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "id_bus": "ata",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "model": "QEMU DVD-ROM",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "nr_requests": "2",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "partitions": {},
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "path": "/dev/sr0",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "removable": "1",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "rev": "2.5+",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "ro": "0",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "rotational": "1",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "sas_address": "",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "sas_device_handle": "",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "scheduler_mode": "mq-deadline",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "sectors": 0,
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "sectorsize": "2048",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "size": 493568.0,
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "support_discard": "0",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "type": "disk",
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:             "vendor": "QEMU"
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:         }
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]:     }
Feb 20 07:39:57 np0005625204.localdomain optimistic_black[34159]: ]
Feb 20 07:39:57 np0005625204.localdomain systemd[1]: libpod-f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39.scope: Deactivated successfully.
Feb 20 07:39:57 np0005625204.localdomain podman[34143]: 2026-02-20 07:39:57.313225472 +0000 UTC m=+0.982284089 container died f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.42.2, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, ceph=True, build-date=2026-02-09T10:25:24Z, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1770267347, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 07:39:57 np0005625204.localdomain systemd[1]: tmp-crun.4cMZNW.mount: Deactivated successfully.
Feb 20 07:39:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f17d37de071d78a94b5c31c4f795f633b996934185049a4e499965d957885e9e-merged.mount: Deactivated successfully.
Feb 20 07:39:57 np0005625204.localdomain podman[35633]: 2026-02-20 07:39:57.401453971 +0000 UTC m=+0.079817773 container remove f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=optimistic_black, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, release=1770267347, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main)
Feb 20 07:39:57 np0005625204.localdomain systemd[1]: libpod-conmon-f5df760f1939e8525dd7d0574c725efcb65bff5f3738d3fda78c9dab37a5ae39.scope: Deactivated successfully.
Feb 20 07:39:57 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 15 pg[1.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=15) [1,5,3] r=2 lpr=15 pi=[13,15)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:39:57 np0005625204.localdomain sudo[34047]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:59 np0005625204.localdomain sudo[35647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:39:59 np0005625204.localdomain sudo[35647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:39:59 np0005625204.localdomain sudo[35647]: pam_unix(sudo:session): session closed for user root
Feb 20 07:39:59 np0005625204.localdomain sshd[34062]: Invalid user caja from 185.246.128.171 port 5553
Feb 20 07:40:00 np0005625204.localdomain sshd[34062]: Disconnecting invalid user caja 185.246.128.171 port 5553: Change of username or service not allowed: (caja,ssh-connection) -> (zhao,ssh-connection) [preauth]
Feb 20 07:40:03 np0005625204.localdomain sshd[35662]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:05 np0005625204.localdomain sudo[35664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:40:05 np0005625204.localdomain sudo[35664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:40:05 np0005625204.localdomain sudo[35664]: pam_unix(sudo:session): session closed for user root
Feb 20 07:40:05 np0005625204.localdomain sudo[35679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 07:40:05 np0005625204.localdomain sudo[35679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:40:06 np0005625204.localdomain podman[35762]: 2026-02-20 07:40:06.35963873 +0000 UTC m=+0.075651105 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 07:40:06 np0005625204.localdomain podman[35762]: 2026-02-20 07:40:06.432064718 +0000 UTC m=+0.148077133 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., version=7, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.42.2, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 07:40:06 np0005625204.localdomain sudo[35679]: pam_unix(sudo:session): session closed for user root
Feb 20 07:40:07 np0005625204.localdomain sshd[35662]: Invalid user zhao from 185.246.128.171 port 46400
Feb 20 07:40:07 np0005625204.localdomain sudo[35827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:40:07 np0005625204.localdomain sudo[35827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:40:07 np0005625204.localdomain sudo[35827]: pam_unix(sudo:session): session closed for user root
Feb 20 07:40:07 np0005625204.localdomain sshd[35662]: Disconnecting invalid user zhao 185.246.128.171 port 46400: Change of username or service not allowed: (zhao,ssh-connection) -> (tester,ssh-connection) [preauth]
Feb 20 07:40:08 np0005625204.localdomain sshd[35842]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:08 np0005625204.localdomain sshd[35843]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:09 np0005625204.localdomain sshd[35843]: Invalid user httpd from 151.252.84.225 port 57114
Feb 20 07:40:09 np0005625204.localdomain sshd[35843]: Received disconnect from 151.252.84.225 port 57114:11: Bye Bye [preauth]
Feb 20 07:40:09 np0005625204.localdomain sshd[35843]: Disconnected from invalid user httpd 151.252.84.225 port 57114 [preauth]
Feb 20 07:40:12 np0005625204.localdomain sshd[35842]: Invalid user tester from 185.246.128.171 port 4282
Feb 20 07:40:15 np0005625204.localdomain sshd[35847]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:15 np0005625204.localdomain sshd[35847]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:40:16 np0005625204.localdomain sshd[35842]: Disconnecting invalid user tester 185.246.128.171 port 4282: Change of username or service not allowed: (tester,ssh-connection) -> (ftpguest,ssh-connection) [preauth]
Feb 20 07:40:18 np0005625204.localdomain sshd[35849]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:20 np0005625204.localdomain sshd[35849]: Invalid user ftpguest from 185.246.128.171 port 53156
Feb 20 07:40:21 np0005625204.localdomain sshd[35849]: Disconnecting invalid user ftpguest 185.246.128.171 port 53156: Change of username or service not allowed: (ftpguest,ssh-connection) -> (sshd,ssh-connection) [preauth]
Feb 20 07:40:25 np0005625204.localdomain sshd[35851]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:31 np0005625204.localdomain sshd[35851]: Disconnecting authenticating user sshd 185.246.128.171 port 25174: Change of username or service not allowed: (sshd,ssh-connection) -> (vboxuser,ssh-connection) [preauth]
Feb 20 07:40:34 np0005625204.localdomain sshd[35853]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:40 np0005625204.localdomain sshd[35853]: Invalid user vboxuser from 185.246.128.171 port 10428
Feb 20 07:40:40 np0005625204.localdomain sshd[35853]: Disconnecting invalid user vboxuser 185.246.128.171 port 10428: Change of username or service not allowed: (vboxuser,ssh-connection) -> (factory,ssh-connection) [preauth]
Feb 20 07:40:40 np0005625204.localdomain sshd[35855]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:41 np0005625204.localdomain sshd[35855]: Received disconnect from 189.143.72.189 port 46136:11: Bye Bye [preauth]
Feb 20 07:40:41 np0005625204.localdomain sshd[35855]: Disconnected from authenticating user root 189.143.72.189 port 46136 [preauth]
Feb 20 07:40:43 np0005625204.localdomain sshd[35857]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:46 np0005625204.localdomain sshd[35857]: Invalid user factory from 185.246.128.171 port 55090
Feb 20 07:40:47 np0005625204.localdomain sshd[35857]: Disconnecting invalid user factory 185.246.128.171 port 55090: Change of username or service not allowed: (factory,ssh-connection) -> (timothy,ssh-connection) [preauth]
Feb 20 07:40:49 np0005625204.localdomain sshd[35859]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:40:53 np0005625204.localdomain sshd[35859]: Invalid user timothy from 185.246.128.171 port 18657
Feb 20 07:40:54 np0005625204.localdomain sshd[35859]: Disconnecting invalid user timothy 185.246.128.171 port 18657: Change of username or service not allowed: (timothy,ssh-connection) -> (oracle,ssh-connection) [preauth]
Feb 20 07:40:56 np0005625204.localdomain systemd[26592]: Starting Mark boot as successful...
Feb 20 07:40:56 np0005625204.localdomain systemd[26592]: Finished Mark boot as successful.
Feb 20 07:40:57 np0005625204.localdomain sshd[35862]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:02 np0005625204.localdomain sshd[35862]: Invalid user oracle from 185.246.128.171 port 60482
Feb 20 07:41:05 np0005625204.localdomain sshd[35864]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:05 np0005625204.localdomain sshd[35864]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:41:07 np0005625204.localdomain sudo[35866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:41:07 np0005625204.localdomain sudo[35866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:41:07 np0005625204.localdomain sudo[35866]: pam_unix(sudo:session): session closed for user root
Feb 20 07:41:07 np0005625204.localdomain sudo[35881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 07:41:07 np0005625204.localdomain sudo[35881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:41:08 np0005625204.localdomain podman[35964]: 2026-02-20 07:41:08.262277884 +0000 UTC m=+0.083647745 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, ceph=True, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347)
Feb 20 07:41:08 np0005625204.localdomain podman[35964]: 2026-02-20 07:41:08.395382946 +0000 UTC m=+0.216752797 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Feb 20 07:41:08 np0005625204.localdomain sudo[35881]: pam_unix(sudo:session): session closed for user root
Feb 20 07:41:08 np0005625204.localdomain sudo[36030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:41:08 np0005625204.localdomain sudo[36030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:41:08 np0005625204.localdomain sudo[36030]: pam_unix(sudo:session): session closed for user root
Feb 20 07:41:08 np0005625204.localdomain sudo[36045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:41:08 np0005625204.localdomain sudo[36045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:41:09 np0005625204.localdomain sudo[36045]: pam_unix(sudo:session): session closed for user root
Feb 20 07:41:10 np0005625204.localdomain sudo[36091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:41:10 np0005625204.localdomain sudo[36091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:41:10 np0005625204.localdomain sudo[36091]: pam_unix(sudo:session): session closed for user root
Feb 20 07:41:11 np0005625204.localdomain sshd[35862]: error: maximum authentication attempts exceeded for invalid user oracle from 185.246.128.171 port 60482 ssh2 [preauth]
Feb 20 07:41:11 np0005625204.localdomain sshd[35862]: Disconnecting invalid user oracle 185.246.128.171 port 60482: Too many authentication failures [preauth]
Feb 20 07:41:12 np0005625204.localdomain sshd[36106]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:14 np0005625204.localdomain sshd[36106]: Invalid user oracle from 185.246.128.171 port 13314
Feb 20 07:41:18 np0005625204.localdomain sshd[25045]: Received disconnect from 192.168.122.100 port 34020:11: disconnected by user
Feb 20 07:41:18 np0005625204.localdomain sshd[25045]: Disconnected from user zuul 192.168.122.100 port 34020
Feb 20 07:41:18 np0005625204.localdomain sshd[25042]: pam_unix(sshd:session): session closed for user zuul
Feb 20 07:41:18 np0005625204.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Feb 20 07:41:18 np0005625204.localdomain systemd[1]: session-14.scope: Consumed 21.687s CPU time.
Feb 20 07:41:18 np0005625204.localdomain systemd-logind[759]: Session 14 logged out. Waiting for processes to exit.
Feb 20 07:41:18 np0005625204.localdomain systemd-logind[759]: Removed session 14.
Feb 20 07:41:21 np0005625204.localdomain sshd[36106]: error: maximum authentication attempts exceeded for invalid user oracle from 185.246.128.171 port 13314 ssh2 [preauth]
Feb 20 07:41:21 np0005625204.localdomain sshd[36106]: Disconnecting invalid user oracle 185.246.128.171 port 13314: Too many authentication failures [preauth]
Feb 20 07:41:23 np0005625204.localdomain sshd[36108]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:28 np0005625204.localdomain sshd[36108]: Invalid user oracle from 185.246.128.171 port 5733
Feb 20 07:41:31 np0005625204.localdomain sshd[36108]: error: maximum authentication attempts exceeded for invalid user oracle from 185.246.128.171 port 5733 ssh2 [preauth]
Feb 20 07:41:31 np0005625204.localdomain sshd[36108]: Disconnecting invalid user oracle 185.246.128.171 port 5733: Too many authentication failures [preauth]
Feb 20 07:41:32 np0005625204.localdomain sshd[36110]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:38 np0005625204.localdomain sshd[36110]: Invalid user oracle from 185.246.128.171 port 53006
Feb 20 07:41:44 np0005625204.localdomain sshd[36110]: error: maximum authentication attempts exceeded for invalid user oracle from 185.246.128.171 port 53006 ssh2 [preauth]
Feb 20 07:41:44 np0005625204.localdomain sshd[36110]: Disconnecting invalid user oracle 185.246.128.171 port 53006: Too many authentication failures [preauth]
Feb 20 07:41:48 np0005625204.localdomain sshd[36112]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:52 np0005625204.localdomain sshd[36112]: Invalid user oracle from 185.246.128.171 port 6337
Feb 20 07:41:54 np0005625204.localdomain sshd[36114]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:54 np0005625204.localdomain sshd[36112]: Disconnecting invalid user oracle 185.246.128.171 port 6337: Change of username or service not allowed: (oracle,ssh-connection) -> (omar,ssh-connection) [preauth]
Feb 20 07:41:55 np0005625204.localdomain sshd[36114]: Received disconnect from 202.165.22.246 port 57844:11: Bye Bye [preauth]
Feb 20 07:41:55 np0005625204.localdomain sshd[36114]: Disconnected from authenticating user root 202.165.22.246 port 57844 [preauth]
Feb 20 07:41:55 np0005625204.localdomain sshd[36116]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:41:56 np0005625204.localdomain sshd[36116]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:41:57 np0005625204.localdomain sshd[36118]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:01 np0005625204.localdomain sshd[36118]: Invalid user omar from 185.246.128.171 port 59117
Feb 20 07:42:02 np0005625204.localdomain sshd[36118]: Disconnecting invalid user omar 185.246.128.171 port 59117: Change of username or service not allowed: (omar,ssh-connection) -> (vagrant,ssh-connection) [preauth]
Feb 20 07:42:05 np0005625204.localdomain sshd[36120]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:08 np0005625204.localdomain sshd[36120]: Invalid user vagrant from 185.246.128.171 port 37470
Feb 20 07:42:09 np0005625204.localdomain sshd[36120]: Disconnecting invalid user vagrant 185.246.128.171 port 37470: Change of username or service not allowed: (vagrant,ssh-connection) -> (devops,ssh-connection) [preauth]
Feb 20 07:42:10 np0005625204.localdomain sudo[36122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:42:10 np0005625204.localdomain sudo[36122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:42:10 np0005625204.localdomain sudo[36122]: pam_unix(sudo:session): session closed for user root
Feb 20 07:42:10 np0005625204.localdomain sudo[36137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:42:10 np0005625204.localdomain sudo[36137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:42:10 np0005625204.localdomain sudo[36137]: pam_unix(sudo:session): session closed for user root
Feb 20 07:42:11 np0005625204.localdomain sshd[36183]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:11 np0005625204.localdomain sudo[36184]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:42:11 np0005625204.localdomain sudo[36184]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:42:11 np0005625204.localdomain sudo[36184]: pam_unix(sudo:session): session closed for user root
Feb 20 07:42:14 np0005625204.localdomain sshd[36183]: Invalid user devops from 185.246.128.171 port 7665
Feb 20 07:42:16 np0005625204.localdomain sshd[36183]: Disconnecting invalid user devops 185.246.128.171 port 7665: Change of username or service not allowed: (devops,ssh-connection) -> (ceshi,ssh-connection) [preauth]
Feb 20 07:42:19 np0005625204.localdomain sshd[36200]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:23 np0005625204.localdomain sshd[36200]: Invalid user ceshi from 185.246.128.171 port 54965
Feb 20 07:42:25 np0005625204.localdomain sshd[36200]: Disconnecting invalid user ceshi 185.246.128.171 port 54965: Change of username or service not allowed: (ceshi,ssh-connection) -> (user07,ssh-connection) [preauth]
Feb 20 07:42:27 np0005625204.localdomain sshd[36202]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:31 np0005625204.localdomain sshd[36202]: Invalid user user07 from 185.246.128.171 port 36358
Feb 20 07:42:33 np0005625204.localdomain sshd[36202]: Disconnecting invalid user user07 185.246.128.171 port 36358: Change of username or service not allowed: (user07,ssh-connection) -> (erp,ssh-connection) [preauth]
Feb 20 07:42:35 np0005625204.localdomain sshd[36204]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:40 np0005625204.localdomain sshd[36204]: Invalid user erp from 185.246.128.171 port 17840
Feb 20 07:42:40 np0005625204.localdomain sshd[36204]: Disconnecting invalid user erp 185.246.128.171 port 17840: Change of username or service not allowed: (erp,ssh-connection) -> (support1,ssh-connection) [preauth]
Feb 20 07:42:43 np0005625204.localdomain sshd[36206]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:45 np0005625204.localdomain sshd[36208]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:45 np0005625204.localdomain sshd[36208]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:42:47 np0005625204.localdomain sshd[36206]: Invalid user support1 from 185.246.128.171 port 64792
Feb 20 07:42:49 np0005625204.localdomain sshd[36206]: Disconnecting invalid user support1 185.246.128.171 port 64792: Change of username or service not allowed: (support1,ssh-connection) -> (ftptest,ssh-connection) [preauth]
Feb 20 07:42:49 np0005625204.localdomain sshd[36210]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:42:51 np0005625204.localdomain sshd[36210]: Invalid user ftptest from 185.246.128.171 port 34203
Feb 20 07:42:52 np0005625204.localdomain sshd[36210]: Disconnecting invalid user ftptest 185.246.128.171 port 34203: Change of username or service not allowed: (ftptest,ssh-connection) -> (riad,ssh-connection) [preauth]
Feb 20 07:42:56 np0005625204.localdomain sshd[36212]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:00 np0005625204.localdomain sshd[36212]: Invalid user riad from 185.246.128.171 port 11499
Feb 20 07:43:01 np0005625204.localdomain sshd[36212]: Disconnecting invalid user riad 185.246.128.171 port 11499: Change of username or service not allowed: (riad,ssh-connection) -> (testuser,ssh-connection) [preauth]
Feb 20 07:43:02 np0005625204.localdomain sshd[36214]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:05 np0005625204.localdomain sshd[36214]: Invalid user testuser from 185.246.128.171 port 42395
Feb 20 07:43:08 np0005625204.localdomain sshd[36214]: error: maximum authentication attempts exceeded for invalid user testuser from 185.246.128.171 port 42395 ssh2 [preauth]
Feb 20 07:43:08 np0005625204.localdomain sshd[36214]: Disconnecting invalid user testuser 185.246.128.171 port 42395: Too many authentication failures [preauth]
Feb 20 07:43:09 np0005625204.localdomain sshd[36216]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:11 np0005625204.localdomain sshd[36216]: Invalid user testuser from 185.246.128.171 port 19859
Feb 20 07:43:11 np0005625204.localdomain sshd[36216]: Disconnecting invalid user testuser 185.246.128.171 port 19859: Change of username or service not allowed: (testuser,ssh-connection) -> (mysql,ssh-connection) [preauth]
Feb 20 07:43:11 np0005625204.localdomain sudo[36218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:43:11 np0005625204.localdomain sudo[36218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:43:11 np0005625204.localdomain sudo[36218]: pam_unix(sudo:session): session closed for user root
Feb 20 07:43:11 np0005625204.localdomain sudo[36233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:43:11 np0005625204.localdomain sudo[36233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:43:12 np0005625204.localdomain sshd[36266]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:12 np0005625204.localdomain sudo[36233]: pam_unix(sudo:session): session closed for user root
Feb 20 07:43:13 np0005625204.localdomain sudo[36282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:43:13 np0005625204.localdomain sudo[36282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:43:13 np0005625204.localdomain sudo[36282]: pam_unix(sudo:session): session closed for user root
Feb 20 07:43:13 np0005625204.localdomain sshd[36266]: Invalid user mysql from 185.246.128.171 port 33847
Feb 20 07:43:16 np0005625204.localdomain sshd[36266]: Disconnecting invalid user mysql 185.246.128.171 port 33847: Change of username or service not allowed: (mysql,ssh-connection) -> (user03,ssh-connection) [preauth]
Feb 20 07:43:20 np0005625204.localdomain sshd[36297]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:23 np0005625204.localdomain sshd[36297]: Invalid user user03 from 185.246.128.171 port 19090
Feb 20 07:43:24 np0005625204.localdomain sshd[36297]: Disconnecting invalid user user03 185.246.128.171 port 19090: Change of username or service not allowed: (user03,ssh-connection) -> (ftpadmin,ssh-connection) [preauth]
Feb 20 07:43:27 np0005625204.localdomain sshd[36299]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:32 np0005625204.localdomain sshd[36299]: Invalid user ftpadmin from 185.246.128.171 port 57676
Feb 20 07:43:34 np0005625204.localdomain sshd[36301]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:34 np0005625204.localdomain sshd[36299]: Disconnecting invalid user ftpadmin 185.246.128.171 port 57676: Change of username or service not allowed: (ftpadmin,ssh-connection) -> (isaac,ssh-connection) [preauth]
Feb 20 07:43:34 np0005625204.localdomain sshd[36301]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:43:35 np0005625204.localdomain sshd[36303]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:37 np0005625204.localdomain sshd[36303]: Invalid user isaac from 185.246.128.171 port 39525
Feb 20 07:43:37 np0005625204.localdomain sshd[36303]: Disconnecting invalid user isaac 185.246.128.171 port 39525: Change of username or service not allowed: (isaac,ssh-connection) -> (hive,ssh-connection) [preauth]
Feb 20 07:43:41 np0005625204.localdomain sshd[36305]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:45 np0005625204.localdomain sshd[36305]: Invalid user hive from 185.246.128.171 port 11191
Feb 20 07:43:46 np0005625204.localdomain sshd[36305]: Disconnecting invalid user hive 185.246.128.171 port 11191: Change of username or service not allowed: (hive,ssh-connection) -> (manish,ssh-connection) [preauth]
Feb 20 07:43:49 np0005625204.localdomain sshd[36307]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:43:52 np0005625204.localdomain sshd[36307]: Invalid user manish from 185.246.128.171 port 54867
Feb 20 07:43:53 np0005625204.localdomain sshd[36307]: Disconnecting invalid user manish 185.246.128.171 port 54867: Change of username or service not allowed: (manish,ssh-connection) -> (invite,ssh-connection) [preauth]
Feb 20 07:43:56 np0005625204.localdomain systemd[26592]: Created slice User Background Tasks Slice.
Feb 20 07:43:56 np0005625204.localdomain systemd[26592]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 07:43:56 np0005625204.localdomain systemd[26592]: Finished Cleanup of User's Temporary Files and Directories.
Feb 20 07:43:58 np0005625204.localdomain sshd[36310]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:02 np0005625204.localdomain sshd[36310]: Invalid user invite from 185.246.128.171 port 42368
Feb 20 07:44:02 np0005625204.localdomain sshd[36310]: Disconnecting invalid user invite 185.246.128.171 port 42368: Change of username or service not allowed: (invite,ssh-connection) -> (qbtuser,ssh-connection) [preauth]
Feb 20 07:44:04 np0005625204.localdomain sshd[36312]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:06 np0005625204.localdomain sshd[36312]: Invalid user qbtuser from 185.246.128.171 port 12975
Feb 20 07:44:06 np0005625204.localdomain sshd[36312]: Disconnecting invalid user qbtuser 185.246.128.171 port 12975: Change of username or service not allowed: (qbtuser,ssh-connection) -> (Grace,ssh-connection) [preauth]
Feb 20 07:44:06 np0005625204.localdomain sshd[36314]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:09 np0005625204.localdomain sshd[36314]: Invalid user Grace from 185.246.128.171 port 23888
Feb 20 07:44:10 np0005625204.localdomain sshd[36314]: Disconnecting invalid user Grace 185.246.128.171 port 23888: Change of username or service not allowed: (Grace,ssh-connection) -> (aziz,ssh-connection) [preauth]
Feb 20 07:44:13 np0005625204.localdomain sudo[36316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:44:13 np0005625204.localdomain sudo[36316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:44:13 np0005625204.localdomain sudo[36316]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:13 np0005625204.localdomain sudo[36331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:44:13 np0005625204.localdomain sudo[36331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:44:13 np0005625204.localdomain sshd[36346]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:13 np0005625204.localdomain sudo[36331]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:14 np0005625204.localdomain sudo[36379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:44:14 np0005625204.localdomain sudo[36379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:44:14 np0005625204.localdomain sudo[36379]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:16 np0005625204.localdomain sshd[36346]: Invalid user aziz from 185.246.128.171 port 61573
Feb 20 07:44:16 np0005625204.localdomain sshd[36346]: Disconnecting invalid user aziz 185.246.128.171 port 61573: Change of username or service not allowed: (aziz,ssh-connection) -> (oscar,ssh-connection) [preauth]
Feb 20 07:44:18 np0005625204.localdomain sshd[36395]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:21 np0005625204.localdomain sshd[36397]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:21 np0005625204.localdomain sshd[36397]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:44:21 np0005625204.localdomain sshd[36395]: Invalid user oscar from 185.246.128.171 port 26394
Feb 20 07:44:24 np0005625204.localdomain sshd[36395]: Disconnecting invalid user oscar 185.246.128.171 port 26394: Change of username or service not allowed: (oscar,ssh-connection) -> (utente,ssh-connection) [preauth]
Feb 20 07:44:26 np0005625204.localdomain sshd[36399]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:31 np0005625204.localdomain sshd[36399]: Invalid user utente from 185.246.128.171 port 8361
Feb 20 07:44:32 np0005625204.localdomain sshd[36399]: Disconnecting invalid user utente 185.246.128.171 port 8361: Change of username or service not allowed: (utente,ssh-connection) -> (finance,ssh-connection) [preauth]
Feb 20 07:44:35 np0005625204.localdomain sshd[36401]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:37 np0005625204.localdomain sshd[36401]: Invalid user finance from 185.246.128.171 port 56610
Feb 20 07:44:38 np0005625204.localdomain sshd[36401]: Disconnecting invalid user finance 185.246.128.171 port 56610: Change of username or service not allowed: (finance,ssh-connection) -> (sc,ssh-connection) [preauth]
Feb 20 07:44:39 np0005625204.localdomain sshd[36403]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:39 np0005625204.localdomain sshd[36403]: Invalid user n8n from 189.143.72.189 port 57608
Feb 20 07:44:40 np0005625204.localdomain sshd[36403]: Received disconnect from 189.143.72.189 port 57608:11: Bye Bye [preauth]
Feb 20 07:44:40 np0005625204.localdomain sshd[36403]: Disconnected from invalid user n8n 189.143.72.189 port 57608 [preauth]
Feb 20 07:44:40 np0005625204.localdomain sshd[36405]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:44 np0005625204.localdomain sshd[36405]: Invalid user sc from 185.246.128.171 port 21770
Feb 20 07:44:44 np0005625204.localdomain sshd[36405]: Disconnecting invalid user sc 185.246.128.171 port 21770: Change of username or service not allowed: (sc,ssh-connection) -> (ui,ssh-connection) [preauth]
Feb 20 07:44:46 np0005625204.localdomain sshd[36407]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:47 np0005625204.localdomain sshd[36407]: Invalid user ui from 185.246.128.171 port 53942
Feb 20 07:44:48 np0005625204.localdomain sshd[36407]: Disconnecting invalid user ui 185.246.128.171 port 53942: Change of username or service not allowed: (ui,ssh-connection) -> (Daniel,ssh-connection) [preauth]
Feb 20 07:44:49 np0005625204.localdomain sshd[36409]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:49 np0005625204.localdomain sshd[36409]: Accepted publickey for zuul from 192.168.122.100 port 60408 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:44:49 np0005625204.localdomain systemd-logind[759]: New session 28 of user zuul.
Feb 20 07:44:49 np0005625204.localdomain systemd[1]: Started Session 28 of User zuul.
Feb 20 07:44:49 np0005625204.localdomain sshd[36409]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 07:44:50 np0005625204.localdomain sudo[36455]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbhjulpbcxpfhbplgiykcxcrhovuuhyb ; /usr/bin/python3
Feb 20 07:44:50 np0005625204.localdomain sudo[36455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:50 np0005625204.localdomain python3[36457]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 20 07:44:50 np0005625204.localdomain sudo[36455]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:50 np0005625204.localdomain sudo[36500]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpsqkbqwkkjrsvoicdrzxziqqhurhnps ; /usr/bin/python3
Feb 20 07:44:50 np0005625204.localdomain sudo[36500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:50 np0005625204.localdomain sshd[36502]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:44:51 np0005625204.localdomain python3[36503]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 07:44:51 np0005625204.localdomain sudo[36500]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:51 np0005625204.localdomain sudo[36521]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zapmzwfyyaggkwmlqvsokyxtmelxviks ; /usr/bin/python3
Feb 20 07:44:51 np0005625204.localdomain sudo[36521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:51 np0005625204.localdomain python3[36523]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 20 07:44:51 np0005625204.localdomain useradd[36525]: new group: name=tripleo-admin, GID=1003
Feb 20 07:44:51 np0005625204.localdomain useradd[36525]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Feb 20 07:44:51 np0005625204.localdomain sudo[36521]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:51 np0005625204.localdomain sudo[36577]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puwrgmyvgyznyxezyyjawfwrpvbnazqt ; /usr/bin/python3
Feb 20 07:44:51 np0005625204.localdomain sudo[36577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:52 np0005625204.localdomain python3[36579]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:44:52 np0005625204.localdomain sudo[36577]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:52 np0005625204.localdomain sudo[36620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgwqvjmpciyuwaofopgxrgxfhdvmlrxc ; /usr/bin/python3
Feb 20 07:44:52 np0005625204.localdomain sudo[36620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:52 np0005625204.localdomain python3[36622]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771573491.7957625-66446-198631752352309/source _original_basename=tmp_6yb4k5j follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:44:52 np0005625204.localdomain sudo[36620]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:52 np0005625204.localdomain sudo[36650]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgvqlqfuaartjmplmqmvidffnlifzwir ; /usr/bin/python3
Feb 20 07:44:52 np0005625204.localdomain sudo[36650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:52 np0005625204.localdomain python3[36653]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:44:52 np0005625204.localdomain sudo[36650]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:53 np0005625204.localdomain sudo[36667]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqytdsbhtzwkfkxtodvbedajmuyitqpt ; /usr/bin/python3
Feb 20 07:44:53 np0005625204.localdomain sudo[36667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:53 np0005625204.localdomain python3[36669]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:44:53 np0005625204.localdomain sudo[36667]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:53 np0005625204.localdomain sudo[36683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrnggfruwjzidmalylruhdmxetqgawlb ; /usr/bin/python3
Feb 20 07:44:53 np0005625204.localdomain sudo[36683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:53 np0005625204.localdomain python3[36685]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:44:53 np0005625204.localdomain sudo[36683]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:54 np0005625204.localdomain sudo[36699]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vahrtxwxuvbadjxqiffwseyaoblimalp ; /usr/bin/python3
Feb 20 07:44:54 np0005625204.localdomain sudo[36699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 07:44:54 np0005625204.localdomain python3[36701]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDMsWl2iDqidUqzeXIedhPUnMaIwstb6f7CXhkhrGMXj2w21P81bPYesLJBAmIee3BaCMGYsrUpWu7u92b6J2nM7sdG3Lo3YZqie0IUZG/NpfJ+OEJsLIQ5zFT7jGNI77bmiCBX4jibhRJuDVtuombA5bly/q1x7yqRZ5UuwmTahAz49MPmun69R00RpbzOjt8OgQQ6ZdjDclItely6T7y+tDL3UemjBYcKGNckzXB1nWJEjviTyIXiLLqJdcxWarPXzxewMUoyRiTPQU6uRQ1VhuBRFyiofrACRWtFHfLEuWMVqQAL1OtJD5P+KSuVKHkT/MAFSdo0OptGxMmFN5ZvbruOyab6MovPK/9sTaFrhzT6tE78w+fEIFe4kPvM2bw80sKzO9yrkvqs1LL9ToVf9r2TufxRZVCibTFSa9Cw0U9yILFdCrlCZ2GMFNTCFuM7vQDAKwrKndvATAxKrm+D8Xme2+SVCv3oKJtx0m4D6gc9/IMzFhC7Ibh6bJaSj2s= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:44:54 np0005625204.localdomain sudo[36699]: pam_unix(sudo:session): session closed for user root
Feb 20 07:44:55 np0005625204.localdomain python3[36715]: ansible-ping Invoked with data=pong
Feb 20 07:44:55 np0005625204.localdomain sshd[36502]: Invalid user Daniel from 185.246.128.171 port 14894
Feb 20 07:44:56 np0005625204.localdomain sshd[36502]: Disconnecting invalid user Daniel 185.246.128.171 port 14894: Change of username or service not allowed: (Daniel,ssh-connection) -> (admin2,ssh-connection) [preauth]
Feb 20 07:44:58 np0005625204.localdomain sshd[36716]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:02 np0005625204.localdomain sshd[36716]: Invalid user admin2 from 185.246.128.171 port 57671
Feb 20 07:45:02 np0005625204.localdomain sshd[36716]: Disconnecting invalid user admin2 185.246.128.171 port 57671: Change of username or service not allowed: (admin2,ssh-connection) -> (intern,ssh-connection) [preauth]
Feb 20 07:45:04 np0005625204.localdomain sshd[36718]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:06 np0005625204.localdomain sshd[36720]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:06 np0005625204.localdomain sshd[36720]: Accepted publickey for tripleo-admin from 192.168.122.100 port 46784 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 07:45:06 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 20 07:45:06 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 20 07:45:06 np0005625204.localdomain systemd-logind[759]: New session 29 of user tripleo-admin.
Feb 20 07:45:06 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 20 07:45:06 np0005625204.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Queued start job for default target Main User Target.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Created slice User Application Slice.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Reached target Paths.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Reached target Timers.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Starting D-Bus User Message Bus Socket...
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Starting Create User's Volatile Files and Directories...
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Listening on D-Bus User Message Bus Socket.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Reached target Sockets.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Finished Create User's Volatile Files and Directories.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Reached target Basic System.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Reached target Main User Target.
Feb 20 07:45:06 np0005625204.localdomain systemd[36724]: Startup finished in 123ms.
Feb 20 07:45:06 np0005625204.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 20 07:45:06 np0005625204.localdomain systemd[1]: Started Session 29 of User tripleo-admin.
Feb 20 07:45:06 np0005625204.localdomain sshd[36720]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 20 07:45:06 np0005625204.localdomain sudo[36783]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evlnprnovspygrddizsusspzgysyxgsd ; /usr/bin/python3
Feb 20 07:45:06 np0005625204.localdomain sudo[36783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:07 np0005625204.localdomain python3[36785]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 07:45:07 np0005625204.localdomain sudo[36783]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:07 np0005625204.localdomain sshd[36718]: Invalid user intern from 185.246.128.171 port 25649
Feb 20 07:45:08 np0005625204.localdomain sshd[36790]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:08 np0005625204.localdomain sshd[36718]: Disconnecting invalid user intern 185.246.128.171 port 25649: Change of username or service not allowed: (intern,ssh-connection) -> (weewx,ssh-connection) [preauth]
Feb 20 07:45:08 np0005625204.localdomain sshd[36790]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:45:10 np0005625204.localdomain sshd[36792]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:11 np0005625204.localdomain sudo[36806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uauuytghxabctllwpoojprbuzcelmpfk ; /usr/bin/python3
Feb 20 07:45:11 np0005625204.localdomain sudo[36806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:12 np0005625204.localdomain python3[36808]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Feb 20 07:45:12 np0005625204.localdomain sudo[36806]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:12 np0005625204.localdomain sudo[36823]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khrnztyejayuascuplxfylwzjkgghuzu ; /usr/bin/python3
Feb 20 07:45:12 np0005625204.localdomain sudo[36823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:12 np0005625204.localdomain python3[36825]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 20 07:45:12 np0005625204.localdomain sudo[36823]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:13 np0005625204.localdomain sudo[36871]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sksvusyemhjbjmqohaqlbtujbfransve ; /usr/bin/python3
Feb 20 07:45:13 np0005625204.localdomain sudo[36871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:13 np0005625204.localdomain python3[36873]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.tazbhho9tmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:45:13 np0005625204.localdomain sudo[36871]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:13 np0005625204.localdomain sudo[36901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kismxzuniggvrunugfzraedhmlelgkey ; /usr/bin/python3
Feb 20 07:45:13 np0005625204.localdomain sudo[36901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:13 np0005625204.localdomain python3[36903]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.tazbhho9tmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:45:13 np0005625204.localdomain sshd[36792]: Invalid user weewx from 185.246.128.171 port 55697
Feb 20 07:45:13 np0005625204.localdomain sudo[36901]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:14 np0005625204.localdomain sudo[36904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:45:14 np0005625204.localdomain sudo[36904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:45:14 np0005625204.localdomain sudo[36904]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:14 np0005625204.localdomain sudo[36919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:45:14 np0005625204.localdomain sudo[36919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:45:14 np0005625204.localdomain sudo[36947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plrornbidrslbojzsrheakizccmynxiv ; /usr/bin/python3
Feb 20 07:45:14 np0005625204.localdomain sudo[36947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:15 np0005625204.localdomain python3[36949]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.tazbhho9tmphosts insertbefore=BOF block=172.17.0.106 np0005625202.localdomain np0005625202
                                                         172.18.0.106 np0005625202.storage.localdomain np0005625202.storage
                                                         172.20.0.106 np0005625202.storagemgmt.localdomain np0005625202.storagemgmt
                                                         172.17.0.106 np0005625202.internalapi.localdomain np0005625202.internalapi
                                                         172.19.0.106 np0005625202.tenant.localdomain np0005625202.tenant
                                                         192.168.122.106 np0005625202.ctlplane.localdomain np0005625202.ctlplane
                                                         172.17.0.107 np0005625203.localdomain np0005625203
                                                         172.18.0.107 np0005625203.storage.localdomain np0005625203.storage
                                                         172.20.0.107 np0005625203.storagemgmt.localdomain np0005625203.storagemgmt
                                                         172.17.0.107 np0005625203.internalapi.localdomain np0005625203.internalapi
                                                         172.19.0.107 np0005625203.tenant.localdomain np0005625203.tenant
                                                         192.168.122.107 np0005625203.ctlplane.localdomain np0005625203.ctlplane
                                                         172.17.0.108 np0005625204.localdomain np0005625204
                                                         172.18.0.108 np0005625204.storage.localdomain np0005625204.storage
                                                         172.20.0.108 np0005625204.storagemgmt.localdomain np0005625204.storagemgmt
                                                         172.17.0.108 np0005625204.internalapi.localdomain np0005625204.internalapi
                                                         172.19.0.108 np0005625204.tenant.localdomain np0005625204.tenant
                                                         192.168.122.108 np0005625204.ctlplane.localdomain np0005625204.ctlplane
                                                         172.17.0.103 np0005625199.localdomain np0005625199
                                                         172.18.0.103 np0005625199.storage.localdomain np0005625199.storage
                                                         172.20.0.103 np0005625199.storagemgmt.localdomain np0005625199.storagemgmt
                                                         172.17.0.103 np0005625199.internalapi.localdomain np0005625199.internalapi
                                                         172.19.0.103 np0005625199.tenant.localdomain np0005625199.tenant
                                                         192.168.122.103 np0005625199.ctlplane.localdomain np0005625199.ctlplane
                                                         172.17.0.104 np0005625200.localdomain np0005625200
                                                         172.18.0.104 np0005625200.storage.localdomain np0005625200.storage
                                                         172.20.0.104 np0005625200.storagemgmt.localdomain np0005625200.storagemgmt
                                                         172.17.0.104 np0005625200.internalapi.localdomain np0005625200.internalapi
                                                         172.19.0.104 np0005625200.tenant.localdomain np0005625200.tenant
                                                         192.168.122.104 np0005625200.ctlplane.localdomain np0005625200.ctlplane
                                                         172.17.0.105 np0005625201.localdomain np0005625201
                                                         172.18.0.105 np0005625201.storage.localdomain np0005625201.storage
                                                         172.20.0.105 np0005625201.storagemgmt.localdomain np0005625201.storagemgmt
                                                         172.17.0.105 np0005625201.internalapi.localdomain np0005625201.internalapi
                                                         172.19.0.105 np0005625201.tenant.localdomain np0005625201.tenant
                                                         192.168.122.105 np0005625201.ctlplane.localdomain np0005625201.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.217  overcloud.storage.localdomain
                                                         172.20.0.250  overcloud.storagemgmt.localdomain
                                                         172.17.0.130  overcloud.internalapi.localdomain
                                                         172.21.0.142  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:45:15 np0005625204.localdomain sudo[36947]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:15 np0005625204.localdomain sudo[36919]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:15 np0005625204.localdomain sudo[36995]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggpsawlkketdhpavrqijmvldmxfgpzai ; /usr/bin/python3
Feb 20 07:45:15 np0005625204.localdomain sudo[36995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:15 np0005625204.localdomain python3[36997]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.tazbhho9tmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:45:15 np0005625204.localdomain sudo[36995]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:15 np0005625204.localdomain sudo[37012]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxuuckmzagtmumevafbcosmjrtdeylon ; /usr/bin/python3
Feb 20 07:45:15 np0005625204.localdomain sudo[37012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:15 np0005625204.localdomain sshd[36792]: Disconnecting invalid user weewx 185.246.128.171 port 55697: Change of username or service not allowed: (weewx,ssh-connection) -> (tempuser,ssh-connection) [preauth]
Feb 20 07:45:15 np0005625204.localdomain python3[37014]: ansible-file Invoked with path=/tmp/ansible.tazbhho9tmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:45:15 np0005625204.localdomain sudo[37012]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:15 np0005625204.localdomain sudo[37015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:45:15 np0005625204.localdomain sudo[37015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:45:15 np0005625204.localdomain sudo[37015]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:16 np0005625204.localdomain sudo[37043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddileqsesrtzyhqcuewrvglmekwubiyd ; /usr/bin/python3
Feb 20 07:45:16 np0005625204.localdomain sudo[37043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:16 np0005625204.localdomain python3[37045]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:45:16 np0005625204.localdomain sudo[37043]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:17 np0005625204.localdomain sudo[37060]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grhgyrpvmrsmdbwlmrvlrxxmxwigejjn ; /usr/bin/python3
Feb 20 07:45:17 np0005625204.localdomain sudo[37060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:17 np0005625204.localdomain python3[37062]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:45:19 np0005625204.localdomain sshd[37064]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:21 np0005625204.localdomain sudo[37060]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:21 np0005625204.localdomain sudo[37081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnfrcpiwxlsodbzesisesizpjazgqnyk ; /usr/bin/python3
Feb 20 07:45:21 np0005625204.localdomain sudo[37081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:22 np0005625204.localdomain python3[37083]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:45:22 np0005625204.localdomain sudo[37081]: pam_unix(sudo:session): session closed for user root
Feb 20 07:45:22 np0005625204.localdomain sudo[37098]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pntfohvyaxuoalfuyrigaanzfbyvfdsg ; /usr/bin/python3
Feb 20 07:45:22 np0005625204.localdomain sudo[37098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:45:22 np0005625204.localdomain python3[37100]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:45:25 np0005625204.localdomain sshd[37102]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:25 np0005625204.localdomain sshd[37064]: Invalid user tempuser from 185.246.128.171 port 35954
Feb 20 07:45:26 np0005625204.localdomain sshd[37102]: Invalid user claude from 202.165.22.246 port 36258
Feb 20 07:45:26 np0005625204.localdomain sshd[37102]: Received disconnect from 202.165.22.246 port 36258:11: Bye Bye [preauth]
Feb 20 07:45:26 np0005625204.localdomain sshd[37102]: Disconnected from invalid user claude 202.165.22.246 port 36258 [preauth]
Feb 20 07:45:26 np0005625204.localdomain sshd[37064]: Disconnecting invalid user tempuser 185.246.128.171 port 35954: Change of username or service not allowed: (tempuser,ssh-connection) -> (shutdown,ssh-connection) [preauth]
Feb 20 07:45:29 np0005625204.localdomain sshd[37173]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:34 np0005625204.localdomain sshd[37173]: Disconnecting authenticating user shutdown 185.246.128.171 port 24379: Change of username or service not allowed: (shutdown,ssh-connection) -> (wss,ssh-connection) [preauth]
Feb 20 07:45:36 np0005625204.localdomain groupadd[37276]: group added to /etc/group: name=puppet, GID=52
Feb 20 07:45:36 np0005625204.localdomain groupadd[37276]: group added to /etc/gshadow: name=puppet
Feb 20 07:45:36 np0005625204.localdomain groupadd[37276]: new group: name=puppet, GID=52
Feb 20 07:45:36 np0005625204.localdomain useradd[37283]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Feb 20 07:45:39 np0005625204.localdomain sshd[37293]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:43 np0005625204.localdomain sshd[37293]: Invalid user wss from 185.246.128.171 port 12745
Feb 20 07:45:44 np0005625204.localdomain sshd[37293]: Disconnecting invalid user wss 185.246.128.171 port 12745: Change of username or service not allowed: (wss,ssh-connection) -> (portal,ssh-connection) [preauth]
Feb 20 07:45:46 np0005625204.localdomain sshd[37590]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:48 np0005625204.localdomain sshd[37590]: Invalid user portal from 185.246.128.171 port 50662
Feb 20 07:45:49 np0005625204.localdomain sshd[37590]: Disconnecting invalid user portal 185.246.128.171 port 50662: Change of username or service not allowed: (portal,ssh-connection) -> (wilson,ssh-connection) [preauth]
Feb 20 07:45:50 np0005625204.localdomain sshd[37609]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:54 np0005625204.localdomain sshd[37609]: Invalid user wilson from 185.246.128.171 port 10460
Feb 20 07:45:55 np0005625204.localdomain sshd[37636]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:45:55 np0005625204.localdomain sshd[37636]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:45:56 np0005625204.localdomain sshd[37609]: Disconnecting invalid user wilson 185.246.128.171 port 10460: Change of username or service not allowed: (wilson,ssh-connection) -> (scpuser,ssh-connection) [preauth]
Feb 20 07:45:58 np0005625204.localdomain sshd[37657]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:01 np0005625204.localdomain sshd[37657]: Invalid user scpuser from 185.246.128.171 port 58263
Feb 20 07:46:02 np0005625204.localdomain sshd[37657]: Disconnecting invalid user scpuser 185.246.128.171 port 58263: Change of username or service not allowed: (scpuser,ssh-connection) -> (uftp,ssh-connection) [preauth]
Feb 20 07:46:03 np0005625204.localdomain sshd[37694]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:06 np0005625204.localdomain sshd[37694]: Invalid user uftp from 185.246.128.171 port 20702
Feb 20 07:46:07 np0005625204.localdomain sshd[37694]: Disconnecting invalid user uftp 185.246.128.171 port 20702: Change of username or service not allowed: (uftp,ssh-connection) -> (satya,ssh-connection) [preauth]
Feb 20 07:46:08 np0005625204.localdomain sshd[37729]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:12 np0005625204.localdomain sshd[37729]: Invalid user satya from 185.246.128.171 port 49666
Feb 20 07:46:13 np0005625204.localdomain sshd[37729]: Disconnecting invalid user satya 185.246.128.171 port 49666: Change of username or service not allowed: (satya,ssh-connection) -> (aaa,ssh-connection) [preauth]
Feb 20 07:46:16 np0005625204.localdomain sudo[37744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:46:16 np0005625204.localdomain sudo[37744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:46:16 np0005625204.localdomain sudo[37744]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:16 np0005625204.localdomain sshd[37768]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:16 np0005625204.localdomain sudo[37759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:46:16 np0005625204.localdomain sudo[37759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:46:16 np0005625204.localdomain sudo[37759]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:17 np0005625204.localdomain sudo[37808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:46:17 np0005625204.localdomain sudo[37808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:46:17 np0005625204.localdomain sudo[37808]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:19 np0005625204.localdomain sshd[37768]: Invalid user aaa from 185.246.128.171 port 25784
Feb 20 07:46:20 np0005625204.localdomain sshd[37768]: Disconnecting invalid user aaa 185.246.128.171 port 25784: Change of username or service not allowed: (aaa,ssh-connection) -> (anthony,ssh-connection) [preauth]
Feb 20 07:46:23 np0005625204.localdomain sshd[37854]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:28 np0005625204.localdomain sshd[37854]: Invalid user anthony from 185.246.128.171 port 4748
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  Converting 2700 SID table entries...
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:46:29 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:46:29 np0005625204.localdomain sshd[37854]: Disconnecting invalid user anthony 185.246.128.171 port 4748: Change of username or service not allowed: (anthony,ssh-connection) -> (netlink,ssh-connection) [preauth]
Feb 20 07:46:29 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:46:30 np0005625204.localdomain systemd-rc-local-generator[37993]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:46:30 np0005625204.localdomain systemd-sysv-generator[37996]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 07:46:30 np0005625204.localdomain systemd[1]: run-ra020798fb2eb49aaaa2a151af8915f20.service: Deactivated successfully.
Feb 20 07:46:31 np0005625204.localdomain sudo[37098]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:31 np0005625204.localdomain sshd[38413]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:33 np0005625204.localdomain sudo[38428]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkdvxhmuefywsboaghflyevjtahyzszb ; /usr/bin/python3
Feb 20 07:46:33 np0005625204.localdomain sudo[38428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:33 np0005625204.localdomain python3[38430]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:34 np0005625204.localdomain sudo[38428]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:35 np0005625204.localdomain sudo[38567]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flgcnytznewkozeocoskyohlyiiwkctx ; /usr/bin/python3
Feb 20 07:46:35 np0005625204.localdomain sudo[38567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:35 np0005625204.localdomain python3[38569]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:46:35 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:46:35 np0005625204.localdomain systemd-rc-local-generator[38595]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:46:35 np0005625204.localdomain systemd-sysv-generator[38600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:46:35 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:46:35 np0005625204.localdomain sudo[38567]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:36 np0005625204.localdomain sshd[38413]: Invalid user netlink from 185.246.128.171 port 47618
Feb 20 07:46:36 np0005625204.localdomain sudo[38621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stiunhjucqzxgomyehmcpchprxobcnuz ; /usr/bin/python3
Feb 20 07:46:36 np0005625204.localdomain sudo[38621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:37 np0005625204.localdomain python3[38623]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:46:37 np0005625204.localdomain sudo[38621]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:37 np0005625204.localdomain sudo[38637]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhwtzptnqtzkxraztvbqszbkutgeeais ; /usr/bin/python3
Feb 20 07:46:37 np0005625204.localdomain sudo[38637]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:37 np0005625204.localdomain python3[38639]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:37 np0005625204.localdomain sudo[38637]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:37 np0005625204.localdomain sudo[38654]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubkmulyifeucucqeruxzhiazyzjimkom ; /usr/bin/python3
Feb 20 07:46:37 np0005625204.localdomain sudo[38654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:38 np0005625204.localdomain python3[38656]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 07:46:38 np0005625204.localdomain sshd[38413]: Disconnecting invalid user netlink 185.246.128.171 port 47618: Change of username or service not allowed: (netlink,ssh-connection) -> (mina,ssh-connection) [preauth]
Feb 20 07:46:38 np0005625204.localdomain sudo[38654]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:38 np0005625204.localdomain sudo[38672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbxqmskizpyzfiilxgobkewzvbytdwww ; /usr/bin/python3
Feb 20 07:46:38 np0005625204.localdomain sudo[38672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:38 np0005625204.localdomain python3[38674]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:46:38 np0005625204.localdomain sudo[38672]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:38 np0005625204.localdomain sudo[38690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpnfugugysgcgcquddaffmcsultzhjyv ; /usr/bin/python3
Feb 20 07:46:38 np0005625204.localdomain sudo[38690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:39 np0005625204.localdomain python3[38692]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:46:39 np0005625204.localdomain sudo[38690]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:39 np0005625204.localdomain sudo[38708]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qusmrnkgrnjsbssbkfdynnjtwxkxnuof ; /usr/bin/python3
Feb 20 07:46:39 np0005625204.localdomain sudo[38708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:39 np0005625204.localdomain python3[38710]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:46:39 np0005625204.localdomain systemd[1]: Reloading Network Manager...
Feb 20 07:46:39 np0005625204.localdomain NetworkManager[5988]: <info>  [1771573599.7518] audit: op="reload" arg="0" pid=38713 uid=0 result="success"
Feb 20 07:46:39 np0005625204.localdomain NetworkManager[5988]: <info>  [1771573599.7528] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Feb 20 07:46:39 np0005625204.localdomain NetworkManager[5988]: <info>  [1771573599.7528] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Feb 20 07:46:39 np0005625204.localdomain systemd[1]: Reloaded Network Manager.
Feb 20 07:46:39 np0005625204.localdomain sudo[38708]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:40 np0005625204.localdomain sudo[38727]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oealsvwpwgztecqhqtrtlkxvoztirmvh ; /usr/bin/python3
Feb 20 07:46:40 np0005625204.localdomain sudo[38727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:40 np0005625204.localdomain python3[38729]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:40 np0005625204.localdomain sudo[38727]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:40 np0005625204.localdomain sudo[38744]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkboocuzkidgrzzyjlyvjzkmogiaoqqr ; /usr/bin/python3
Feb 20 07:46:40 np0005625204.localdomain sudo[38744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:40 np0005625204.localdomain python3[38746]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:46:40 np0005625204.localdomain sudo[38744]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:40 np0005625204.localdomain sshd[38749]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:40 np0005625204.localdomain sudo[38763]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozcptbaekwpmtfdzqxlnqdzvfaereyuu ; /usr/bin/python3
Feb 20 07:46:40 np0005625204.localdomain sudo[38763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:41 np0005625204.localdomain python3[38765]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:46:41 np0005625204.localdomain sudo[38763]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:41 np0005625204.localdomain sudo[38779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqhtswqwenkkwlhatsbfsiwmmhxhiaji ; /usr/bin/python3
Feb 20 07:46:41 np0005625204.localdomain sudo[38779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:41 np0005625204.localdomain python3[38781]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:46:41 np0005625204.localdomain sudo[38779]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:42 np0005625204.localdomain sudo[38795]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekekrylanrrsfxsionxbgfinrxjsihih ; /usr/bin/python3
Feb 20 07:46:42 np0005625204.localdomain sudo[38795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:42 np0005625204.localdomain python3[38797]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 20 07:46:42 np0005625204.localdomain sudo[38795]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:42 np0005625204.localdomain sudo[38811]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfhfqdsadsqoglukrcdpciqxxrdxjffg ; /usr/bin/python3
Feb 20 07:46:42 np0005625204.localdomain sudo[38811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:42 np0005625204.localdomain python3[38813]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:46:42 np0005625204.localdomain sudo[38811]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:43 np0005625204.localdomain sudo[38828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtaehrnadtvmhavgbsdrwbafneedkszf ; /usr/bin/python3
Feb 20 07:46:43 np0005625204.localdomain sudo[38828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:43 np0005625204.localdomain python3[38830]: ansible-blockinfile Invoked with path=/tmp/ansible.j_w9c8fi block=[192.168.122.106]*,[np0005625202.ctlplane.localdomain]*,[172.17.0.106]*,[np0005625202.internalapi.localdomain]*,[172.18.0.106]*,[np0005625202.storage.localdomain]*,[172.20.0.106]*,[np0005625202.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005625202.tenant.localdomain]*,[np0005625202.localdomain]*,[np0005625202]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDr8sejencX7nSCX6AegGtTuiZL3yclu/L7ZVN4B6dKPdmHqVr33QJD40sEk28GHpx8BrkPU2Qj1de9H6mGtrlwhmJr7Pccg/YqzKoTCQD5rZQ4youU8H70As6YX5ZlXyulwI1SH70XjMm37x4ptKALFOjRnHg0WIXah/tAmzrY/orh+/eCcns7APVjN9B1o+MqP4r47WrWrGU/KxtsHc6dflWxZW7BWUCCNS0e3C4yWLRjy8Hhj7Qkpssv/UBcj+olVHadUUOYiaQZ5Y33MjxwIg8o1MuC7C1dNIn8eXOXXiA8jd/lJd9kImrCGUtkVqj8VQgsMh4vRYMD+0SNLYRDVwxdemOzJYgwQhgiWZ0G+cVhnTBpMmXyIws2OpOKU8R3HjTC3jz+BxvjwEvMDoQfpGgsHB9NCXnkQzs2F8EA8LpA823Ef1SMgPdDCaQzvN5oQPZkWAPMVHvq31xpN9q+KXg/bg0uDaIZXUxW2rGnem7pFS78rRUGL6MfSMn1zs=
                                                         [192.168.122.107]*,[np0005625203.ctlplane.localdomain]*,[172.17.0.107]*,[np0005625203.internalapi.localdomain]*,[172.18.0.107]*,[np0005625203.storage.localdomain]*,[172.20.0.107]*,[np0005625203.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005625203.tenant.localdomain]*,[np0005625203.localdomain]*,[np0005625203]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtf1NXQ3EGQGdpLLLxuODKBdTGwqsiHL2QZ6zcfpGAa7EhDIxuEcLboqOGjQO0FM3u+kl2gIgKF0UsY5Vjcv4mDCMp7A7srq7TVo5lE5cCppbbXr0/PH2L/naHU3W+W83aT5RE17XPJ0Acn3W51WFBoICCCc4jjWTGmkNEgurKBJmdr0n8NeIcUWZ7Abrs/N2xzNftEFIjAPwebxgEwgCx0hMbdjTFhKbB/V7CjKaCU/UjirWMW5aDQJQEfrCM9u4NHuGaWKzJgar4/shNHaRvkCDbVrRPTCyfNebE04J/R42X3yWmvww4TMZVpRROd/u6Pgg1P2tbPGfQ0XvS0rfY6W4/VnHcyRDqxILH5BoeCAbTuVFmR0hbQu9fNbNxTP+o+na9mHEbNxbhcREnkal8+M0l11YftCRkr4132JITxe7y93gN/dwxE3nJLHLXRuRskWc3GTDT2MVU2Sj64yizD9KOM3oiMBXdPbNbgZywu3hqQvpO00GVg6QRjEJoiFc=
                                                         [192.168.122.108]*,[np0005625204.ctlplane.localdomain]*,[172.17.0.108]*,[np0005625204.internalapi.localdomain]*,[172.18.0.108]*,[np0005625204.storage.localdomain]*,[172.20.0.108]*,[np0005625204.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005625204.tenant.localdomain]*,[np0005625204.localdomain]*,[np0005625204]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAo6exxFtNk/Y5qEGYenJyhnCsS7iZmCGsFaQtJElNSeTTX9a1P0P2EmjtHolRxnljCZ2X8HgWx/irhJvWLoS+dzF5l+KcyQy83+048h51mbnj7zV2uG9i8LkO0egs1uBBp5E+hauHMsuf0nIDFl45W86ZXuf+MfFEKCInhjB5gfE9tTjwmKwKhgO1DE7Vpx3OYy1FHkq0YDBCqQHuuhYPrLZPjfVv3vGOaHH/XCsxX3h8/ixsZbobD56dDBKF/8CFyC/guH8pNUhZHG0dEhz5BT8PcE2Q/M9pPttzmRQksfg9+q7lVy9eCoOVpzqfTgjE1cm5yISwuMZzaNxwjJKB54EWpfl5xxnkC14B+xdvowxpl1PcMNZ0q1fWofJF4TrJAwWCUYZf45aUV2yb5R8WavUT0pX32xmd4zFbXusoafiw2FcgnxoGz3N4ZgIxTPPmgUe13blr1SK44huXWPioaolFBo82xVVFHc+01vfLF3xvs86d6EpqpLH+yaCeUjE=
                                                         [192.168.122.103]*,[np0005625199.ctlplane.localdomain]*,[172.17.0.103]*,[np0005625199.internalapi.localdomain]*,[172.18.0.103]*,[np0005625199.storage.localdomain]*,[172.20.0.103]*,[np0005625199.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005625199.tenant.localdomain]*,[np0005625199.localdomain]*,[np0005625199]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrnsozeOPJKYg9sx2Tj6QOLRhujK5RVh5RZQ3sb0pk+DbWHQKqS1YvJUg2hV4WxbxPnNUCBtJ+RZ8lVm6RLM+hc3ffe2sOMOz5upO/hTlIpBSfJpQORkiNW+XIXdDVxgE418veFd2hASFmiCmKoFSKXsvnmFU9oTEpja1plcXSqCobFMVYKlhcRo66O0ySlGOR+o3Ar2yNJQjFErEGvZLoDEa/VlA6zreYmTaIsnlUDie0gbm5teTlsCcEYkvWcTzcfOEX2kXQRQbS5qlPtGg7c+KMv5e40rE+2QOigLmOOPVGwNYuLuhb/EHT0C8hK8otW4tiXxBlSZ5ONKY6YYQOpy7krNkWRxNXzK0LfXo2bt2apDaMzebPOvuBj1YyBiLpa6/aLvS/dtGolQNPDpFivPbP/mSpat1qTs0W3/2HyBovwWSGJDW8MMYxbZJ0Z6tnuOwdrPTdkhIibfW9wxgL7EHrDYrGx5CvA2vUM4KDKRntz/cCMGE/zKacSJ48nNk=
                                                         [192.168.122.104]*,[np0005625200.ctlplane.localdomain]*,[172.17.0.104]*,[np0005625200.internalapi.localdomain]*,[172.18.0.104]*,[np0005625200.storage.localdomain]*,[172.20.0.104]*,[np0005625200.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005625200.tenant.localdomain]*,[np0005625200.localdomain]*,[np0005625200]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDW88346W6zU6nxCpqapHtIr5nRG8Jn9LFit3r5klBfauCkmAGONb4X8IwKjo8MD9etebUVbo6aX9gBMBMSs7bSoHzsEQuMLpBDrweSbahQj+gqZ5TmQ/xvwbhws04z3/IJxapAk2xWu7khVGjvOPUE1CROkP+1LiGktQ6Xj1ar1TbLNud2Dq/R5ZalbpK0OT3+no3x0oAJT3W649tW4nmCWcNaxykPsLREsUlH2qVoceAzLEDCSde9/1TONc/URyB4acVqmEwJDHeX51bh31tpQwp/WSe0vKQ6eUw63Tmpn+dRI9xbnFhc6mgGAPcEw7cAUkM7oM6bYMSvVxYDmzMhuXUU/9i3mdMnDBkMyZ5Oed6ZSmFQIJe5k7cz3783d35ZXfl/HsYMqoZ3lmDgbeS59pQrI+BldKyv3sTnoCDahfcmzmiHssxqa7tT5KOuR444q7Nj6wJEIZMEEJEHtMlh1iSBRJZOEOaKjo7h+jV7KMe75aPRasvu9K1v0dqyG6U=
                                                         [192.168.122.105]*,[np0005625201.ctlplane.localdomain]*,[172.17.0.105]*,[np0005625201.internalapi.localdomain]*,[172.18.0.105]*,[np0005625201.storage.localdomain]*,[172.20.0.105]*,[np0005625201.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005625201.tenant.localdomain]*,[np0005625201.localdomain]*,[np0005625201]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyGkX26ECIsvqnvJegedSF6KicDAAqjaifawEd//OuK9zdHIWqO3XmlEszZqWPsdQhPFkelfzXR+sy3gbPNv+yjT7phsw1sq7zHXeogQFlP5iOQZrf6hCnfXxVk2ckIXMT0UJVZ8FCTwsQi+HKkR/IEj08pR7EjrXGWxHkjv5wNj76spF3FJxtwycS4+KzY3UFy7gYWVn2jB0ha966YgjHMPhzQnT33W9myxGH33M1L5ZCGlfH19hLnqTUNMfzIfw3afxHkL5BFZbhthUPmIfLdLtKmZEkpSTBO/CrNA6CmMfY6xnT78hmwXytEQ+jeiRdKXdr9xQ2j6wVmPzckFKBsBYRe4DprKGt93fnKS9Z6A3Sv626DyZgDa8/NXbtAaBxtyix5Vdt872hYvCzYyB/OuSV6PR5DOq8z3fquOwgtka3rA6qL5gxhFJcO5TqtBM76DzOLd9OLM9bIO1yK9sCmbYynMojkXylzhDfcI8kytS5xs9FJEfwTElZRHkEIQE=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:46:43 np0005625204.localdomain sudo[38828]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:43 np0005625204.localdomain sshd[38831]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:43 np0005625204.localdomain sshd[38831]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:46:43 np0005625204.localdomain sudo[38846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrzljgldmmzldcoumxehnfriboxoraqj ; /usr/bin/python3
Feb 20 07:46:43 np0005625204.localdomain sudo[38846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:43 np0005625204.localdomain python3[38848]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.j_w9c8fi' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:43 np0005625204.localdomain sudo[38846]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:44 np0005625204.localdomain sudo[38864]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phfddeqbflcudkookspwynsgumwlipix ; /usr/bin/python3
Feb 20 07:46:44 np0005625204.localdomain sudo[38864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:44 np0005625204.localdomain python3[38866]: ansible-file Invoked with path=/tmp/ansible.j_w9c8fi state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:46:44 np0005625204.localdomain sudo[38864]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:44 np0005625204.localdomain sudo[38880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvcmrftvuazbtwpnpkjgdksxmgxnyhfz ; /usr/bin/python3
Feb 20 07:46:44 np0005625204.localdomain sudo[38880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:45 np0005625204.localdomain sshd[38749]: Invalid user mina from 185.246.128.171 port 33172
Feb 20 07:46:45 np0005625204.localdomain python3[38882]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:46:45 np0005625204.localdomain sudo[38880]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:45 np0005625204.localdomain sudo[38896]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttkjmcikhygyhldmfkggsekllqulqbuk ; /usr/bin/python3
Feb 20 07:46:45 np0005625204.localdomain sudo[38896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:45 np0005625204.localdomain python3[38898]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:45 np0005625204.localdomain sudo[38896]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:45 np0005625204.localdomain sudo[38914]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apuaogajrdpzmgygysieyhkuqqqgiulv ; /usr/bin/python3
Feb 20 07:46:45 np0005625204.localdomain sudo[38914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:45 np0005625204.localdomain sshd[38749]: Disconnecting invalid user mina 185.246.128.171 port 33172: Change of username or service not allowed: (mina,ssh-connection) -> (scsadmin,ssh-connection) [preauth]
Feb 20 07:46:45 np0005625204.localdomain python3[38916]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:45 np0005625204.localdomain sudo[38914]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:46 np0005625204.localdomain sudo[38933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eobawpjfhinbiqnnckeksrcfbguxdcfa ; /usr/bin/python3
Feb 20 07:46:46 np0005625204.localdomain sudo[38933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:46 np0005625204.localdomain python3[38935]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Feb 20 07:46:46 np0005625204.localdomain sudo[38933]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:46 np0005625204.localdomain sudo[38949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cteztsxktxrybsfspwwnwrnicqiswuef ; /usr/bin/python3
Feb 20 07:46:46 np0005625204.localdomain sudo[38949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:46 np0005625204.localdomain sudo[38949]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:46 np0005625204.localdomain sudo[38997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szgdabwvoyhpzyrrqgpupitivqzpnqyb ; /usr/bin/python3
Feb 20 07:46:46 np0005625204.localdomain sudo[38997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:47 np0005625204.localdomain sudo[38997]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:47 np0005625204.localdomain sudo[39040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfgakravkyswucrggffkyifjhrpscuuu ; /usr/bin/python3
Feb 20 07:46:47 np0005625204.localdomain sudo[39040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:47 np0005625204.localdomain sudo[39040]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:47 np0005625204.localdomain sshd[39057]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:48 np0005625204.localdomain sudo[39071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnfyteplemvndoecaodkbiosjqzmtkux ; /usr/bin/python3
Feb 20 07:46:48 np0005625204.localdomain sudo[39071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:48 np0005625204.localdomain python3[39073]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:48 np0005625204.localdomain sudo[39071]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:48 np0005625204.localdomain sudo[39088]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgtvztidsvbyruskrtlounmrnalxojrw ; /usr/bin/python3
Feb 20 07:46:48 np0005625204.localdomain sudo[39088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:49 np0005625204.localdomain python3[39090]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:46:52 np0005625204.localdomain sshd[39057]: Invalid user scsadmin from 185.246.128.171 port 6910
Feb 20 07:46:52 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 07:46:52 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:46:52 np0005625204.localdomain systemd-rc-local-generator[39206]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:46:52 np0005625204.localdomain systemd-sysv-generator[39213]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: tuned.service: Consumed 1.819s CPU time.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 07:46:52 np0005625204.localdomain systemd[1]: run-rddb0f232165f4e7caf1030d5981721e4.service: Deactivated successfully.
Feb 20 07:46:53 np0005625204.localdomain sshd[39057]: Disconnecting invalid user scsadmin 185.246.128.171 port 6910: Change of username or service not allowed: (scsadmin,ssh-connection) -> (operator,ssh-connection) [preauth]
Feb 20 07:46:54 np0005625204.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 20 07:46:54 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:46:54 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 07:46:54 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 07:46:54 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 07:46:54 np0005625204.localdomain systemd[1]: run-radcda4c2443e4fd58d37bb998d45cc0e.service: Deactivated successfully.
Feb 20 07:46:55 np0005625204.localdomain sudo[39088]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:55 np0005625204.localdomain sudo[39527]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knthgfxmanmwiltduamlmqlejyotblgk ; /usr/bin/python3
Feb 20 07:46:55 np0005625204.localdomain sudo[39527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:55 np0005625204.localdomain python3[39529]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:46:55 np0005625204.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 20 07:46:55 np0005625204.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 20 07:46:55 np0005625204.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 20 07:46:55 np0005625204.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 20 07:46:56 np0005625204.localdomain sshd[39706]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:46:57 np0005625204.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 20 07:46:57 np0005625204.localdomain sudo[39527]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:57 np0005625204.localdomain sudo[39723]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cabfkwdviqmvyinutngaetarojzmyyfp ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 20 07:46:57 np0005625204.localdomain sudo[39723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:57 np0005625204.localdomain python3[39725]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:46:57 np0005625204.localdomain sudo[39723]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:57 np0005625204.localdomain sudo[39740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnkvuwpyhvqldemqhlgqnodflgfpiubq ; /usr/bin/python3
Feb 20 07:46:57 np0005625204.localdomain sudo[39740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:58 np0005625204.localdomain python3[39742]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 20 07:46:58 np0005625204.localdomain sudo[39740]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:58 np0005625204.localdomain sudo[39757]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epkkxtuygxgiinaodlzxnvwxolhvaybu ; /usr/bin/python3
Feb 20 07:46:58 np0005625204.localdomain sudo[39757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:58 np0005625204.localdomain python3[39759]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:46:58 np0005625204.localdomain sudo[39757]: pam_unix(sudo:session): session closed for user root
Feb 20 07:46:58 np0005625204.localdomain sudo[39773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrxhjhkhaippswrenxaybqrnhlnivkdb ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 20 07:46:58 np0005625204.localdomain sudo[39773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:46:59 np0005625204.localdomain python3[39775]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:00 np0005625204.localdomain sudo[39773]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:00 np0005625204.localdomain sudo[39793]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqgoidlbybwxhcoothyvglrgzsnxhsij ; /usr/bin/python3
Feb 20 07:47:00 np0005625204.localdomain sudo[39793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:00 np0005625204.localdomain python3[39795]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:00 np0005625204.localdomain sudo[39793]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:00 np0005625204.localdomain sudo[39810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aieunlavzwhsjroclmvjzodfzcbicuzy ; /usr/bin/python3
Feb 20 07:47:00 np0005625204.localdomain sudo[39810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:01 np0005625204.localdomain python3[39812]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:47:01 np0005625204.localdomain sudo[39810]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:03 np0005625204.localdomain sudo[39826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfbwzbrqwwcmangdzyklwlhnnoipihbk ; /usr/bin/python3
Feb 20 07:47:03 np0005625204.localdomain sudo[39826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:03 np0005625204.localdomain python3[39828]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:03 np0005625204.localdomain sudo[39826]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:05 np0005625204.localdomain sshd[39706]: Disconnecting authenticating user operator 185.246.128.171 port 52311: Change of username or service not allowed: (operator,ssh-connection) -> (dev,ssh-connection) [preauth]
Feb 20 07:47:06 np0005625204.localdomain sshd[39829]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:07 np0005625204.localdomain sshd[39829]: Invalid user dev from 185.246.128.171 port 41689
Feb 20 07:47:09 np0005625204.localdomain sudo[39844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbpbvkatlxeildxtdyhmtahguxbaaiud ; /usr/bin/python3
Feb 20 07:47:09 np0005625204.localdomain sudo[39844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:09 np0005625204.localdomain python3[39846]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:09 np0005625204.localdomain sudo[39844]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:09 np0005625204.localdomain sudo[39892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krmwlztkmgsgbasjrpprjqfcjkdyxfjg ; /usr/bin/python3
Feb 20 07:47:09 np0005625204.localdomain sudo[39892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:09 np0005625204.localdomain python3[39894]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:09 np0005625204.localdomain sudo[39892]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:10 np0005625204.localdomain sudo[39937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvqguefwpgsqmxbckrcpdktzdndbpalm ; /usr/bin/python3
Feb 20 07:47:10 np0005625204.localdomain sudo[39937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:10 np0005625204.localdomain python3[39939]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573629.5354686-70985-5671078970862/source _original_basename=tmphuc8ixg9 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:10 np0005625204.localdomain sudo[39937]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:10 np0005625204.localdomain sudo[39967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncbqifxpmoajhwhxewkzswnuspdpmobp ; /usr/bin/python3
Feb 20 07:47:10 np0005625204.localdomain sudo[39967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:10 np0005625204.localdomain python3[39969]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:10 np0005625204.localdomain sudo[39967]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:11 np0005625204.localdomain sudo[40015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhlmbutbhvtwxepafzbyinpbmmdnduxc ; /usr/bin/python3
Feb 20 07:47:11 np0005625204.localdomain sudo[40015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:11 np0005625204.localdomain python3[40017]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:11 np0005625204.localdomain sudo[40015]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:11 np0005625204.localdomain sshd[39829]: Disconnecting invalid user dev 185.246.128.171 port 41689: Change of username or service not allowed: (dev,ssh-connection) -> (wms,ssh-connection) [preauth]
Feb 20 07:47:11 np0005625204.localdomain sudo[40058]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxvaclqwgoplgnqicznktwqnxcmfkjfr ; /usr/bin/python3
Feb 20 07:47:11 np0005625204.localdomain sudo[40058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:11 np0005625204.localdomain python3[40060]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573631.1330435-71087-215878468385212/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=5387ef5e5a4b3d23a203db65b8a130e906dc0536 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:11 np0005625204.localdomain sudo[40058]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:12 np0005625204.localdomain sudo[40120]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnukvorzryrzgqmgzydzxbthrtzpnldb ; /usr/bin/python3
Feb 20 07:47:12 np0005625204.localdomain sudo[40120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:12 np0005625204.localdomain python3[40122]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:12 np0005625204.localdomain sudo[40120]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:12 np0005625204.localdomain sudo[40163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfdzbnodqcimvvrhrkieehezhxthwujn ; /usr/bin/python3
Feb 20 07:47:12 np0005625204.localdomain sudo[40163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:12 np0005625204.localdomain python3[40165]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573631.9541054-71141-186975358819900/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=b3e2a3c34ad78c32d8298bcfb96fa0bd48de4c29 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:12 np0005625204.localdomain sudo[40163]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:13 np0005625204.localdomain sudo[40225]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huxvcsdetwfpzjyuvgjtnijrzvudessv ; /usr/bin/python3
Feb 20 07:47:13 np0005625204.localdomain sudo[40225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:13 np0005625204.localdomain python3[40227]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:13 np0005625204.localdomain sudo[40225]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:13 np0005625204.localdomain sudo[40268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qguxebiqjjogjbbmhrvrslxkmvkkgbhz ; /usr/bin/python3
Feb 20 07:47:13 np0005625204.localdomain sudo[40268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:13 np0005625204.localdomain python3[40270]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573632.8013954-71141-13360655153265/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=9360c8b01c30dc9677a403a9f11e562b9309fb54 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:13 np0005625204.localdomain sudo[40268]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:14 np0005625204.localdomain sudo[40330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffpemxbhypksesbcbbycshsncnifogfv ; /usr/bin/python3
Feb 20 07:47:14 np0005625204.localdomain sudo[40330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:14 np0005625204.localdomain python3[40332]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:14 np0005625204.localdomain sudo[40330]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:14 np0005625204.localdomain sudo[40373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kehmlcprifeuzvwgeqttuiavuosqhkgs ; /usr/bin/python3
Feb 20 07:47:14 np0005625204.localdomain sudo[40373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:14 np0005625204.localdomain python3[40375]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573633.7584717-71141-24362054386575/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=1bd75eeb71ad8a06f7ad5bd2e02e7279e09e867f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:14 np0005625204.localdomain sudo[40373]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:14 np0005625204.localdomain sshd[40386]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:14 np0005625204.localdomain sudo[40436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jovbebiaphwnkhqbswhczegstbnwqrle ; /usr/bin/python3
Feb 20 07:47:14 np0005625204.localdomain sudo[40436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:15 np0005625204.localdomain python3[40438]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:15 np0005625204.localdomain sudo[40436]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:15 np0005625204.localdomain sudo[40479]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttndvvoswcxiklvtxxrnncraqifvrfso ; /usr/bin/python3
Feb 20 07:47:15 np0005625204.localdomain sudo[40479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:15 np0005625204.localdomain python3[40481]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573634.7654712-71141-8181946489592/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:15 np0005625204.localdomain sudo[40479]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:15 np0005625204.localdomain sudo[40542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljfprhkwaetqccxstpgygypuihxkntxr ; /usr/bin/python3
Feb 20 07:47:15 np0005625204.localdomain sudo[40542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:15 np0005625204.localdomain python3[40544]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:15 np0005625204.localdomain sudo[40542]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:16 np0005625204.localdomain sudo[40585]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phwaadvfsohxgmqibzacfviikpnukfas ; /usr/bin/python3
Feb 20 07:47:16 np0005625204.localdomain sudo[40585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:16 np0005625204.localdomain python3[40587]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573635.6510413-71141-92921810720287/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=cbfe5bf2a17b805f6637cedd456b7bd33893a9e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:16 np0005625204.localdomain sudo[40585]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:16 np0005625204.localdomain sudo[40647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yskmqnsnojceitpbdfnkhchzvaifllun ; /usr/bin/python3
Feb 20 07:47:16 np0005625204.localdomain sudo[40647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:16 np0005625204.localdomain python3[40649]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:16 np0005625204.localdomain sudo[40647]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:16 np0005625204.localdomain sudo[40690]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpwqnwaqbtrbmmzrlhgcnxwqaxffddjk ; /usr/bin/python3
Feb 20 07:47:17 np0005625204.localdomain sudo[40690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:17 np0005625204.localdomain python3[40692]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573636.49638-71141-32355966377171/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:17 np0005625204.localdomain sudo[40690]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:17 np0005625204.localdomain sudo[40752]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmffolksdohkchavfzcixvkxuanfudtt ; /usr/bin/python3
Feb 20 07:47:17 np0005625204.localdomain sudo[40752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:17 np0005625204.localdomain sudo[40755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:47:17 np0005625204.localdomain sudo[40755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:47:17 np0005625204.localdomain sudo[40755]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:17 np0005625204.localdomain python3[40754]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:17 np0005625204.localdomain sudo[40752]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:17 np0005625204.localdomain sudo[40770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:47:17 np0005625204.localdomain sudo[40770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:47:17 np0005625204.localdomain sshd[40386]: Invalid user wms from 185.246.128.171 port 21784
Feb 20 07:47:17 np0005625204.localdomain sudo[40825]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkqqqwcvejtyfjtmwmftprfjxbrqrtad ; /usr/bin/python3
Feb 20 07:47:17 np0005625204.localdomain sudo[40825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:17 np0005625204.localdomain python3[40827]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573637.3079348-71141-137780338074535/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=105f529004e67673ca4edd886c338642e88dedf6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:18 np0005625204.localdomain sudo[40825]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:18 np0005625204.localdomain sudo[40770]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:18 np0005625204.localdomain sshd[40386]: Disconnecting invalid user wms 185.246.128.171 port 21784: Change of username or service not allowed: (wms,ssh-connection) -> (ventas01,ssh-connection) [preauth]
Feb 20 07:47:18 np0005625204.localdomain sudo[40918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mywnjbzvibozsswbckkqxnqglmdhkivf ; /usr/bin/python3
Feb 20 07:47:18 np0005625204.localdomain sudo[40918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:18 np0005625204.localdomain python3[40920]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:18 np0005625204.localdomain sudo[40918]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:18 np0005625204.localdomain sudo[40961]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmdxqtyklnbyxvqobnzlnelwznkoiszv ; /usr/bin/python3
Feb 20 07:47:18 np0005625204.localdomain sudo[40961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:18 np0005625204.localdomain python3[40963]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573638.1331708-71141-212860132408312/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:18 np0005625204.localdomain sudo[40961]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:18 np0005625204.localdomain sudo[40973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:47:18 np0005625204.localdomain sudo[40973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:47:18 np0005625204.localdomain sudo[40973]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:19 np0005625204.localdomain sudo[41038]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riyjuifjoqvzfpnzjtfvizadmnpfpptq ; /usr/bin/python3
Feb 20 07:47:19 np0005625204.localdomain sudo[41038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:19 np0005625204.localdomain python3[41040]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:19 np0005625204.localdomain sudo[41038]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:19 np0005625204.localdomain sudo[41081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpngwaocsjttigknumdaksrqqrspmzfk ; /usr/bin/python3
Feb 20 07:47:19 np0005625204.localdomain sudo[41081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:19 np0005625204.localdomain python3[41083]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573638.9639196-71141-60813725389109/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:19 np0005625204.localdomain sudo[41081]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:19 np0005625204.localdomain sudo[41143]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxsbpgixdrlqpsnxwrpkemfmewihnntz ; /usr/bin/python3
Feb 20 07:47:19 np0005625204.localdomain sudo[41143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:20 np0005625204.localdomain python3[41145]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:20 np0005625204.localdomain sudo[41143]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:20 np0005625204.localdomain sudo[41186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqcvvwvarbxctchplvcelndsyblcfoen ; /usr/bin/python3
Feb 20 07:47:20 np0005625204.localdomain sudo[41186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:20 np0005625204.localdomain python3[41188]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573639.808891-71141-33791909834788/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=1a54ea8224417f04a01b19de6c5231a702bdb41b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:20 np0005625204.localdomain sudo[41186]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:21 np0005625204.localdomain sshd[41203]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:21 np0005625204.localdomain sudo[41217]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbemkcocbafsusetuxkxlyjntnekroda ; /usr/bin/python3
Feb 20 07:47:21 np0005625204.localdomain sudo[41217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:21 np0005625204.localdomain python3[41219]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:47:21 np0005625204.localdomain sudo[41217]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:21 np0005625204.localdomain sudo[41266]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zosgynmimdujxtavkervediwbjkqwmge ; /usr/bin/python3
Feb 20 07:47:21 np0005625204.localdomain sudo[41266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:22 np0005625204.localdomain python3[41268]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:47:22 np0005625204.localdomain sudo[41266]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:22 np0005625204.localdomain sudo[41309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkmentcqzwafwoeyputfcwrrvdwjbnsf ; /usr/bin/python3
Feb 20 07:47:22 np0005625204.localdomain sudo[41309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:22 np0005625204.localdomain python3[41311]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573641.833584-71995-12021929881542/source _original_basename=tmpcp_tb351 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:47:22 np0005625204.localdomain sudo[41309]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:24 np0005625204.localdomain sshd[41203]: Invalid user ventas01 from 185.246.128.171 port 53686
Feb 20 07:47:25 np0005625204.localdomain sshd[41203]: Disconnecting invalid user ventas01 185.246.128.171 port 53686: Change of username or service not allowed: (ventas01,ssh-connection) -> (httpadmin,ssh-connection) [preauth]
Feb 20 07:47:26 np0005625204.localdomain sudo[41339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niqkfsvsijrykeycomwbbicgjybjhiau ; /usr/bin/python3
Feb 20 07:47:26 np0005625204.localdomain sudo[41339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:26 np0005625204.localdomain python3[41341]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 07:47:27 np0005625204.localdomain sudo[41339]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:27 np0005625204.localdomain sudo[41400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyhbgbzdxtjiqcxerrqswanodaitnxem ; /usr/bin/python3
Feb 20 07:47:27 np0005625204.localdomain sudo[41400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:27 np0005625204.localdomain python3[41402]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:28 np0005625204.localdomain sshd[41404]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:31 np0005625204.localdomain sshd[41404]: Invalid user httpadmin from 185.246.128.171 port 27064
Feb 20 07:47:31 np0005625204.localdomain sudo[41400]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:31 np0005625204.localdomain sudo[41419]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efzkcatkuvywrffrzgpprhwydmvfnftj ; /usr/bin/python3
Feb 20 07:47:31 np0005625204.localdomain sudo[41419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:32 np0005625204.localdomain python3[41421]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:32 np0005625204.localdomain sshd[41423]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:32 np0005625204.localdomain sshd[41423]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:47:32 np0005625204.localdomain sshd[41404]: Disconnecting invalid user httpadmin 185.246.128.171 port 27064: Change of username or service not allowed: (httpadmin,ssh-connection) -> (xandeum,ssh-connection) [preauth]
Feb 20 07:47:36 np0005625204.localdomain sudo[41419]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:36 np0005625204.localdomain sshd[41425]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:37 np0005625204.localdomain sudo[41439]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqlscxxlzvmzcggvkhnvdmkjcgpwfqhu ; /usr/bin/python3
Feb 20 07:47:37 np0005625204.localdomain sudo[41439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:37 np0005625204.localdomain python3[41441]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:37 np0005625204.localdomain sudo[41439]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:37 np0005625204.localdomain sudo[41462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwgpkloooylirzydsccmxcmcubskemmm ; /usr/bin/python3
Feb 20 07:47:37 np0005625204.localdomain sudo[41462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:37 np0005625204.localdomain python3[41464]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:41 np0005625204.localdomain sshd[41425]: Invalid user xandeum from 185.246.128.171 port 5408
Feb 20 07:47:41 np0005625204.localdomain sshd[41425]: Disconnecting invalid user xandeum 185.246.128.171 port 5408: Change of username or service not allowed: (xandeum,ssh-connection) -> (morteza,ssh-connection) [preauth]
Feb 20 07:47:41 np0005625204.localdomain sudo[41462]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:42 np0005625204.localdomain sudo[41480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lredvavpoinrpldlwwaobeewpzlumryk ; /usr/bin/python3
Feb 20 07:47:42 np0005625204.localdomain sudo[41480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:42 np0005625204.localdomain python3[41482]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:42 np0005625204.localdomain sudo[41480]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:42 np0005625204.localdomain sudo[41503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpqbdatnxjcvvlvyevtcaqmlalcqfudk ; /usr/bin/python3
Feb 20 07:47:42 np0005625204.localdomain sudo[41503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:42 np0005625204.localdomain python3[41505]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:43 np0005625204.localdomain sshd[41507]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:46 np0005625204.localdomain sshd[41507]: Invalid user morteza from 185.246.128.171 port 35761
Feb 20 07:47:46 np0005625204.localdomain sshd[41507]: Disconnecting invalid user morteza 185.246.128.171 port 35761: Change of username or service not allowed: (morteza,ssh-connection) -> (user2,ssh-connection) [preauth]
Feb 20 07:47:47 np0005625204.localdomain sudo[41503]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:47 np0005625204.localdomain sudo[41522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjvczcrmxvyzmchgvumsdkcvptspintt ; /usr/bin/python3
Feb 20 07:47:47 np0005625204.localdomain sudo[41522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:47 np0005625204.localdomain python3[41524]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:48 np0005625204.localdomain sshd[41526]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:51 np0005625204.localdomain sudo[41522]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:51 np0005625204.localdomain sshd[41526]: Invalid user user2 from 185.246.128.171 port 62284
Feb 20 07:47:51 np0005625204.localdomain sudo[41541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tphdbgjujmqplpqjgncjmvbchivzvxky ; /usr/bin/python3
Feb 20 07:47:51 np0005625204.localdomain sudo[41541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:51 np0005625204.localdomain python3[41543]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:51 np0005625204.localdomain sudo[41541]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:52 np0005625204.localdomain sudo[41564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uidrfvgpguzpbgttkjsrxgmvojizizmk ; /usr/bin/python3
Feb 20 07:47:52 np0005625204.localdomain sudo[41564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:52 np0005625204.localdomain python3[41566]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:53 np0005625204.localdomain sshd[41526]: Disconnecting invalid user user2 185.246.128.171 port 62284: Change of username or service not allowed: (user2,ssh-connection) -> (helpdesk,ssh-connection) [preauth]
Feb 20 07:47:55 np0005625204.localdomain sshd[41568]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:47:56 np0005625204.localdomain systemd[36724]: Starting Mark boot as successful...
Feb 20 07:47:56 np0005625204.localdomain systemd[36724]: Finished Mark boot as successful.
Feb 20 07:47:56 np0005625204.localdomain sudo[41564]: pam_unix(sudo:session): session closed for user root
Feb 20 07:47:56 np0005625204.localdomain sudo[41584]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxeuwtwrvjnvdsgeubchwrlsoacqhdje ; /usr/bin/python3
Feb 20 07:47:56 np0005625204.localdomain sudo[41584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:47:56 np0005625204.localdomain python3[41586]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:47:58 np0005625204.localdomain sshd[41568]: Invalid user helpdesk from 185.246.128.171 port 29181
Feb 20 07:48:00 np0005625204.localdomain sshd[41568]: Disconnecting invalid user helpdesk 185.246.128.171 port 29181: Change of username or service not allowed: (helpdesk,ssh-connection) -> (git,ssh-connection) [preauth]
Feb 20 07:48:00 np0005625204.localdomain sudo[41584]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:01 np0005625204.localdomain sudo[41601]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgcphhpbqhmfvodjulrdqlxktttrsobx ; /usr/bin/python3
Feb 20 07:48:01 np0005625204.localdomain sudo[41601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:01 np0005625204.localdomain python3[41603]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:01 np0005625204.localdomain sudo[41601]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:01 np0005625204.localdomain sudo[41624]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-seyfnaznmiqwbkvardjrnqfffkogvnsy ; /usr/bin/python3
Feb 20 07:48:01 np0005625204.localdomain sudo[41624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:01 np0005625204.localdomain python3[41626]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:05 np0005625204.localdomain sshd[41628]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:05 np0005625204.localdomain sudo[41624]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:05 np0005625204.localdomain sudo[41642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lytgplsayxbwehimdezfgarndwbcnvdu ; /usr/bin/python3
Feb 20 07:48:05 np0005625204.localdomain sudo[41642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:06 np0005625204.localdomain python3[41644]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:09 np0005625204.localdomain sshd[41628]: Invalid user git from 185.246.128.171 port 15489
Feb 20 07:48:10 np0005625204.localdomain sudo[41642]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:10 np0005625204.localdomain sudo[41660]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeuwdzwcawqgwdbwsuekwdbiosgynuye ; /usr/bin/python3
Feb 20 07:48:10 np0005625204.localdomain sudo[41660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:10 np0005625204.localdomain python3[41662]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:10 np0005625204.localdomain sudo[41660]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:10 np0005625204.localdomain sudo[41683]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mibaolzaucsbpdtkjqesrleujnreyhmx ; /usr/bin/python3
Feb 20 07:48:10 np0005625204.localdomain sudo[41683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:10 np0005625204.localdomain python3[41685]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:14 np0005625204.localdomain sshd[41628]: Disconnecting invalid user git 185.246.128.171 port 15489: Change of username or service not allowed: (git,ssh-connection) -> (csgo,ssh-connection) [preauth]
Feb 20 07:48:15 np0005625204.localdomain sudo[41683]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:15 np0005625204.localdomain sudo[41700]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibdegkzdwpljevytdyguhpcrpecqeren ; /usr/bin/python3
Feb 20 07:48:15 np0005625204.localdomain sudo[41700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:15 np0005625204.localdomain python3[41702]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:18 np0005625204.localdomain sshd[41704]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:19 np0005625204.localdomain sudo[41706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:48:19 np0005625204.localdomain sudo[41706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:48:19 np0005625204.localdomain sudo[41706]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:19 np0005625204.localdomain sudo[41721]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:48:19 np0005625204.localdomain sudo[41721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:48:19 np0005625204.localdomain sudo[41700]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:19 np0005625204.localdomain sudo[41721]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:21 np0005625204.localdomain sudo[41781]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zywfdzpmibsbnhnksumghykbpmzapxct ; /usr/bin/python3
Feb 20 07:48:21 np0005625204.localdomain sudo[41781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:21 np0005625204.localdomain sshd[41783]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:21 np0005625204.localdomain python3[41784]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:21 np0005625204.localdomain sudo[41781]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:21 np0005625204.localdomain sshd[41783]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:48:21 np0005625204.localdomain sudo[41831]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mktzthtebfqilzgkzwwttnxotanpqznb ; /usr/bin/python3
Feb 20 07:48:21 np0005625204.localdomain sudo[41831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:21 np0005625204.localdomain sshd[41704]: Invalid user csgo from 185.246.128.171 port 16778
Feb 20 07:48:21 np0005625204.localdomain python3[41833]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:21 np0005625204.localdomain sudo[41831]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:21 np0005625204.localdomain sudo[41849]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geosinkfgtkmrpjxvkkokztcukiubvjj ; /usr/bin/python3
Feb 20 07:48:21 np0005625204.localdomain sudo[41849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:21 np0005625204.localdomain python3[41851]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpewy50165 recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:21 np0005625204.localdomain sudo[41849]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:22 np0005625204.localdomain sudo[41852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:48:22 np0005625204.localdomain sudo[41852]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:48:22 np0005625204.localdomain sudo[41852]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:22 np0005625204.localdomain sudo[41894]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tphpykhcaajxpprxbxanzhiejvbczidx ; /usr/bin/python3
Feb 20 07:48:22 np0005625204.localdomain sudo[41894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:22 np0005625204.localdomain python3[41896]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:22 np0005625204.localdomain sudo[41894]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:22 np0005625204.localdomain sshd[41704]: Disconnecting invalid user csgo 185.246.128.171 port 16778: Change of username or service not allowed: (csgo,ssh-connection) -> (siddharth,ssh-connection) [preauth]
Feb 20 07:48:22 np0005625204.localdomain sudo[41942]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfcrklisjxyvprneuktavpxstgcnqqcv ; /usr/bin/python3
Feb 20 07:48:22 np0005625204.localdomain sudo[41942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:23 np0005625204.localdomain python3[41944]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:23 np0005625204.localdomain sudo[41942]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:23 np0005625204.localdomain sudo[41960]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzwstsbrmebwrxidtieawttdnikmltxp ; /usr/bin/python3
Feb 20 07:48:23 np0005625204.localdomain sudo[41960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:23 np0005625204.localdomain python3[41962]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:23 np0005625204.localdomain sudo[41960]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:23 np0005625204.localdomain sudo[42022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thlqwqbuidakzmpjrctljpqguencyiha ; /usr/bin/python3
Feb 20 07:48:23 np0005625204.localdomain sudo[42022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:23 np0005625204.localdomain python3[42024]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:23 np0005625204.localdomain sudo[42022]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:24 np0005625204.localdomain sudo[42040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akhtekswehyvsfglacxgvxmpmozfbtig ; /usr/bin/python3
Feb 20 07:48:24 np0005625204.localdomain sudo[42040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:25 np0005625204.localdomain python3[42042]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:25 np0005625204.localdomain sshd[42043]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:25 np0005625204.localdomain sudo[42040]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:25 np0005625204.localdomain sshd[42044]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:26 np0005625204.localdomain sudo[42105]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivbxhbkxvoeapwitpuavkcrjkrtdyanw ; /usr/bin/python3
Feb 20 07:48:26 np0005625204.localdomain sudo[42105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:26 np0005625204.localdomain python3[42107]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:26 np0005625204.localdomain sudo[42105]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:26 np0005625204.localdomain sudo[42123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsyuwxjuyqlipvzlsyuntgmkcphqmgbp ; /usr/bin/python3
Feb 20 07:48:26 np0005625204.localdomain sudo[42123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:26 np0005625204.localdomain python3[42125]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:26 np0005625204.localdomain sudo[42123]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:26 np0005625204.localdomain sshd[42044]: Received disconnect from 189.143.72.189 port 40816:11: Bye Bye [preauth]
Feb 20 07:48:26 np0005625204.localdomain sshd[42044]: Disconnected from authenticating user root 189.143.72.189 port 40816 [preauth]
Feb 20 07:48:27 np0005625204.localdomain sudo[42186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niuhiqojrondqfxcrlcjblmmihhglofv ; /usr/bin/python3
Feb 20 07:48:27 np0005625204.localdomain sudo[42186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:27 np0005625204.localdomain python3[42188]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:27 np0005625204.localdomain sudo[42186]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:27 np0005625204.localdomain sudo[42204]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmqbvmafkcljaptyldjnseqoygchavpq ; /usr/bin/python3
Feb 20 07:48:27 np0005625204.localdomain sudo[42204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:27 np0005625204.localdomain python3[42206]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:27 np0005625204.localdomain sudo[42204]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:27 np0005625204.localdomain sudo[42266]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcufscmfpoluffxbboviqpzljttzygfn ; /usr/bin/python3
Feb 20 07:48:27 np0005625204.localdomain sudo[42266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:27 np0005625204.localdomain python3[42268]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:27 np0005625204.localdomain sudo[42266]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:28 np0005625204.localdomain sudo[42284]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lomonbvhxkqmlvqhbrnugjxefpkehvzt ; /usr/bin/python3
Feb 20 07:48:28 np0005625204.localdomain sudo[42284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:28 np0005625204.localdomain python3[42286]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:28 np0005625204.localdomain sudo[42284]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:28 np0005625204.localdomain sudo[42346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyadgmxhiuvdfermbifvyoxsglqgguwa ; /usr/bin/python3
Feb 20 07:48:28 np0005625204.localdomain sudo[42346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:28 np0005625204.localdomain sshd[42043]: Invalid user siddharth from 185.246.128.171 port 50555
Feb 20 07:48:28 np0005625204.localdomain python3[42348]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:28 np0005625204.localdomain sudo[42346]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:28 np0005625204.localdomain sudo[42364]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzeftxfmpdqhwxrftlcyrwbnsnanbqnm ; /usr/bin/python3
Feb 20 07:48:28 np0005625204.localdomain sudo[42364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:28 np0005625204.localdomain python3[42366]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:29 np0005625204.localdomain sudo[42364]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:29 np0005625204.localdomain sudo[42426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebfwcxzasjmzneqcffgfxmbhyctllupf ; /usr/bin/python3
Feb 20 07:48:29 np0005625204.localdomain sudo[42426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:29 np0005625204.localdomain sshd[42043]: Disconnecting invalid user siddharth 185.246.128.171 port 50555: Change of username or service not allowed: (siddharth,ssh-connection) -> (chenwang,ssh-connection) [preauth]
Feb 20 07:48:29 np0005625204.localdomain python3[42428]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:29 np0005625204.localdomain sudo[42426]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:29 np0005625204.localdomain sudo[42444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-korhjahwxququupqyvntylfzqvhtuprl ; /usr/bin/python3
Feb 20 07:48:29 np0005625204.localdomain sudo[42444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:29 np0005625204.localdomain python3[42446]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:29 np0005625204.localdomain sudo[42444]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:30 np0005625204.localdomain sudo[42506]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjsaunanwjqnzloxkhvkeaxvfxfyxfex ; /usr/bin/python3
Feb 20 07:48:30 np0005625204.localdomain sudo[42506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:30 np0005625204.localdomain python3[42508]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:30 np0005625204.localdomain sudo[42506]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:30 np0005625204.localdomain sudo[42524]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xastxxknioajyonkbjhuevodbouzwuuy ; /usr/bin/python3
Feb 20 07:48:30 np0005625204.localdomain sudo[42524]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:30 np0005625204.localdomain python3[42526]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:30 np0005625204.localdomain sudo[42524]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:30 np0005625204.localdomain sshd[42573]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:30 np0005625204.localdomain sudo[42587]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiyomhmelkameyjgubaqwakzkvzjgjtm ; /usr/bin/python3
Feb 20 07:48:30 np0005625204.localdomain sudo[42587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:31 np0005625204.localdomain python3[42589]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:31 np0005625204.localdomain sudo[42587]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:31 np0005625204.localdomain sudo[42605]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lglpyhidptkbvmdkpvhatclmsdtekuxk ; /usr/bin/python3
Feb 20 07:48:31 np0005625204.localdomain sudo[42605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:31 np0005625204.localdomain python3[42607]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:31 np0005625204.localdomain sudo[42605]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:31 np0005625204.localdomain sudo[42668]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exqewbybpszkofdnjwlyqzwidjppzqhn ; /usr/bin/python3
Feb 20 07:48:31 np0005625204.localdomain sudo[42668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:31 np0005625204.localdomain python3[42670]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:31 np0005625204.localdomain sudo[42668]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:31 np0005625204.localdomain sudo[42686]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfwlmyaofabcznfruktdjuvujdpsxvax ; /usr/bin/python3
Feb 20 07:48:31 np0005625204.localdomain sudo[42686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:32 np0005625204.localdomain python3[42688]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:32 np0005625204.localdomain sudo[42686]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:32 np0005625204.localdomain sudo[42748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzzprzxhcuzwvfvhrhwrevxhafgetprf ; /usr/bin/python3
Feb 20 07:48:32 np0005625204.localdomain sudo[42748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:32 np0005625204.localdomain python3[42750]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:32 np0005625204.localdomain sudo[42748]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:32 np0005625204.localdomain sudo[42766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rixmcovolmngrlknakwtzyjdbhybdnml ; /usr/bin/python3
Feb 20 07:48:32 np0005625204.localdomain sudo[42766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:32 np0005625204.localdomain python3[42768]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:32 np0005625204.localdomain sudo[42766]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:33 np0005625204.localdomain sudo[42796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgqpkbrigqktfzgxnjuoadfyujygjkwm ; /usr/bin/python3
Feb 20 07:48:33 np0005625204.localdomain sudo[42796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:33 np0005625204.localdomain python3[42798]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:48:33 np0005625204.localdomain sudo[42796]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:33 np0005625204.localdomain sudo[42844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlcgscfpvhdnffpugyodtsfgarnplofd ; /usr/bin/python3
Feb 20 07:48:33 np0005625204.localdomain sudo[42844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:33 np0005625204.localdomain python3[42846]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:33 np0005625204.localdomain sudo[42844]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:34 np0005625204.localdomain sudo[42862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmtnpwqejqebhhtyqobbaiydonzgalsq ; /usr/bin/python3
Feb 20 07:48:34 np0005625204.localdomain sudo[42862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:34 np0005625204.localdomain python3[42864]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpxnjlkiz1 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:34 np0005625204.localdomain sudo[42862]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:35 np0005625204.localdomain sshd[42573]: Invalid user chenwang from 185.246.128.171 port 14244
Feb 20 07:48:35 np0005625204.localdomain sshd[42573]: Disconnecting invalid user chenwang 185.246.128.171 port 14244: Change of username or service not allowed: (chenwang,ssh-connection) -> (trx,ssh-connection) [preauth]
Feb 20 07:48:36 np0005625204.localdomain sudo[42892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqfncihrxxodlcdjhsbtbasvugfqydwk ; /usr/bin/python3
Feb 20 07:48:36 np0005625204.localdomain sudo[42892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:36 np0005625204.localdomain python3[42894]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:48:38 np0005625204.localdomain sshd[42896]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:39 np0005625204.localdomain sudo[42892]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:41 np0005625204.localdomain sudo[42911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyjiminmdrlgoajcxgndiofcpeixyjcv ; /usr/bin/python3
Feb 20 07:48:41 np0005625204.localdomain sudo[42911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:41 np0005625204.localdomain python3[42913]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:48:41 np0005625204.localdomain sudo[42911]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:41 np0005625204.localdomain sudo[42929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyabrspeqojipgulczfuevderjehoksx ; /usr/bin/python3
Feb 20 07:48:41 np0005625204.localdomain sudo[42929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:41 np0005625204.localdomain python3[42931]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:48:42 np0005625204.localdomain sshd[42896]: Invalid user trx from 185.246.128.171 port 50785
Feb 20 07:48:42 np0005625204.localdomain sudo[42929]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:43 np0005625204.localdomain sudo[42947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgahbseqgglchwfxkrqfzxhtkmsqleby ; /usr/bin/python3
Feb 20 07:48:43 np0005625204.localdomain sudo[42947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:43 np0005625204.localdomain python3[42949]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:48:43 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:48:43 np0005625204.localdomain systemd-sysv-generator[42981]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:48:43 np0005625204.localdomain systemd-rc-local-generator[42975]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:48:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:48:43 np0005625204.localdomain systemd[1]: Starting Netfilter Tables...
Feb 20 07:48:43 np0005625204.localdomain systemd[1]: Finished Netfilter Tables.
Feb 20 07:48:43 np0005625204.localdomain sudo[42947]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:44 np0005625204.localdomain sudo[43037]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwamgghixnzeozevqyxuqnladjvxjknq ; /usr/bin/python3
Feb 20 07:48:44 np0005625204.localdomain sudo[43037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:44 np0005625204.localdomain python3[43039]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:44 np0005625204.localdomain sudo[43037]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:44 np0005625204.localdomain sudo[43080]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcpomlifewqgazyragspnrskozhzpicw ; /usr/bin/python3
Feb 20 07:48:44 np0005625204.localdomain sudo[43080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:44 np0005625204.localdomain python3[43082]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573724.157667-74834-258154693612869/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:44 np0005625204.localdomain sudo[43080]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:44 np0005625204.localdomain sshd[42896]: Disconnecting invalid user trx 185.246.128.171 port 50785: Change of username or service not allowed: (trx,ssh-connection) -> (maria,ssh-connection) [preauth]
Feb 20 07:48:45 np0005625204.localdomain sudo[43110]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaxkpzeumaptvexdclyphihrssvhjxvb ; /usr/bin/python3
Feb 20 07:48:45 np0005625204.localdomain sudo[43110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:45 np0005625204.localdomain python3[43112]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:45 np0005625204.localdomain sudo[43110]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:45 np0005625204.localdomain sudo[43128]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aujwfwpkdvhghzsdhpwvmmcdmudgooaz ; /usr/bin/python3
Feb 20 07:48:45 np0005625204.localdomain sudo[43128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:45 np0005625204.localdomain python3[43130]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:45 np0005625204.localdomain sudo[43128]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:46 np0005625204.localdomain sudo[43177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmkhsginwfaekhxdnanwkvlcfvwldzlf ; /usr/bin/python3
Feb 20 07:48:46 np0005625204.localdomain sudo[43177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:46 np0005625204.localdomain python3[43179]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:46 np0005625204.localdomain sudo[43177]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:46 np0005625204.localdomain sudo[43220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsocbmkxwjewexzreyfscdjtnmtbroll ; /usr/bin/python3
Feb 20 07:48:46 np0005625204.localdomain sudo[43220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:46 np0005625204.localdomain python3[43222]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573725.875214-74947-86404261163006/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:46 np0005625204.localdomain sudo[43220]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:47 np0005625204.localdomain sudo[43282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqdknycmmflssonhcxznkrxjyykuwnul ; /usr/bin/python3
Feb 20 07:48:47 np0005625204.localdomain sudo[43282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:47 np0005625204.localdomain python3[43284]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:47 np0005625204.localdomain sudo[43282]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:47 np0005625204.localdomain sudo[43325]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uladlruatenopjwehvoxaamuuriolhan ; /usr/bin/python3
Feb 20 07:48:47 np0005625204.localdomain sudo[43325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:47 np0005625204.localdomain python3[43327]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573726.7924778-75153-33929115001382/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:47 np0005625204.localdomain sudo[43325]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:47 np0005625204.localdomain sshd[43363]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:48 np0005625204.localdomain sudo[43388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kokstcpzytzfvoxbhhkgrgmfmpcwdjuf ; /usr/bin/python3
Feb 20 07:48:48 np0005625204.localdomain sudo[43388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:48 np0005625204.localdomain python3[43390]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:48 np0005625204.localdomain sudo[43388]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:48 np0005625204.localdomain sudo[43431]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcdiecnnrgvoirapuhjqwhkitymnpbcy ; /usr/bin/python3
Feb 20 07:48:48 np0005625204.localdomain sudo[43431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:48 np0005625204.localdomain python3[43433]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573727.826416-75217-194572242159236/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:48 np0005625204.localdomain sudo[43431]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:48 np0005625204.localdomain sudo[43493]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqjembkhfrhxbmwnavlelowtrrorduju ; /usr/bin/python3
Feb 20 07:48:48 np0005625204.localdomain sudo[43493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:49 np0005625204.localdomain python3[43495]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:49 np0005625204.localdomain sudo[43493]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:49 np0005625204.localdomain sudo[43536]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulmpbwnvpemmbddkaciorzrzohcnvlvq ; /usr/bin/python3
Feb 20 07:48:49 np0005625204.localdomain sudo[43536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:49 np0005625204.localdomain python3[43538]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573728.793991-75274-74104834022578/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:49 np0005625204.localdomain sudo[43536]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:50 np0005625204.localdomain sudo[43599]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxyojxuuhnajvlwpwtapfzscgejvruwr ; /usr/bin/python3
Feb 20 07:48:50 np0005625204.localdomain sudo[43599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:50 np0005625204.localdomain python3[43601]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:48:50 np0005625204.localdomain sudo[43599]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:50 np0005625204.localdomain sudo[43642]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zryzntugehvgsdviiaeneddbcxuwrffq ; /usr/bin/python3
Feb 20 07:48:50 np0005625204.localdomain sudo[43642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:50 np0005625204.localdomain python3[43644]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573729.673532-75313-178440309902299/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:50 np0005625204.localdomain sudo[43642]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:51 np0005625204.localdomain sshd[43363]: Invalid user maria from 185.246.128.171 port 33535
Feb 20 07:48:51 np0005625204.localdomain sudo[43672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeegrelcdiknqgwcmwzecdhvsakmegoz ; /usr/bin/python3
Feb 20 07:48:51 np0005625204.localdomain sudo[43672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:51 np0005625204.localdomain sshd[43363]: Disconnecting invalid user maria 185.246.128.171 port 33535: Change of username or service not allowed: (maria,ssh-connection) -> (natalie,ssh-connection) [preauth]
Feb 20 07:48:51 np0005625204.localdomain python3[43674]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:51 np0005625204.localdomain sudo[43672]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:51 np0005625204.localdomain sudo[43737]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vobeyvzpxiqsxypwasfmanwlozvhpbpa ; /usr/bin/python3
Feb 20 07:48:51 np0005625204.localdomain sudo[43737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:51 np0005625204.localdomain python3[43739]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:48:51 np0005625204.localdomain sudo[43737]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:52 np0005625204.localdomain sshd[43743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:52 np0005625204.localdomain sudo[43755]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsxqacwgpzzdyqcbjvnamoubwutgdmgl ; /usr/bin/python3
Feb 20 07:48:52 np0005625204.localdomain sudo[43755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:52 np0005625204.localdomain python3[43758]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:52 np0005625204.localdomain sudo[43755]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:52 np0005625204.localdomain sudo[43773]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wejdnwzzahthmpcxltgovjrnnsisedsa ; /usr/bin/python3
Feb 20 07:48:52 np0005625204.localdomain sudo[43773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:52 np0005625204.localdomain python3[43775]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:48:52 np0005625204.localdomain sudo[43773]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:53 np0005625204.localdomain sudo[43792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ancmwohszhjvxythsggzknpduhbyksqr ; /usr/bin/python3
Feb 20 07:48:53 np0005625204.localdomain sudo[43792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:53 np0005625204.localdomain python3[43794]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:48:53 np0005625204.localdomain sudo[43792]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:53 np0005625204.localdomain sudo[43808]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkaariyuomhbibjnkgciccgitxoeytir ; /usr/bin/python3
Feb 20 07:48:53 np0005625204.localdomain sudo[43808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:53 np0005625204.localdomain python3[43810]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:48:53 np0005625204.localdomain sudo[43808]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:53 np0005625204.localdomain sudo[43824]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xuksdqxetozzjeuhhrbddlkmzohwiawl ; /usr/bin/python3
Feb 20 07:48:53 np0005625204.localdomain sudo[43824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:53 np0005625204.localdomain python3[43826]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:48:54 np0005625204.localdomain sudo[43824]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:54 np0005625204.localdomain sshd[43743]: Invalid user natalie from 185.246.128.171 port 54568
Feb 20 07:48:54 np0005625204.localdomain sudo[43840]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eijwfbtbqdgyyatuzmfymwcszqrftdly ; /usr/bin/python3
Feb 20 07:48:54 np0005625204.localdomain sudo[43840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:54 np0005625204.localdomain python3[43842]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 20 07:48:54 np0005625204.localdomain sshd[43743]: Disconnecting invalid user natalie 185.246.128.171 port 54568: Change of username or service not allowed: (natalie,ssh-connection) -> (scan,ssh-connection) [preauth]
Feb 20 07:48:55 np0005625204.localdomain sudo[43840]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:55 np0005625204.localdomain sudo[43861]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqgxtshbbocxxuiisiqkyinfsoykvnyb ; /usr/bin/python3
Feb 20 07:48:55 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 20 07:48:55 np0005625204.localdomain sudo[43861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:55 np0005625204.localdomain python3[43863]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 07:48:55 np0005625204.localdomain sshd[43864]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:48:56 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:48:56 np0005625204.localdomain sudo[43861]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:56 np0005625204.localdomain sudo[43884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-injtdgntgayvjbbdgnrsysqquyqabfwf ; /usr/bin/python3
Feb 20 07:48:56 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 20 07:48:56 np0005625204.localdomain sudo[43884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:56 np0005625204.localdomain python3[43886]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 07:48:57 np0005625204.localdomain sshd[43864]: Invalid user scan from 185.246.128.171 port 8987
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:48:57 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:48:57 np0005625204.localdomain sudo[43884]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:57 np0005625204.localdomain sudo[43905]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtwcnsckjnwtvtuapamcoizixhgltbul ; /usr/bin/python3
Feb 20 07:48:57 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 20 07:48:57 np0005625204.localdomain sudo[43905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:58 np0005625204.localdomain python3[43907]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 07:48:58 np0005625204.localdomain sshd[43864]: Disconnecting invalid user scan 185.246.128.171 port 8987: Change of username or service not allowed: (scan,ssh-connection) -> (joe,ssh-connection) [preauth]
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  Converting 2704 SID table entries...
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:48:58 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:48:58 np0005625204.localdomain sudo[43905]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:59 np0005625204.localdomain sudo[43926]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypcyxoypmiyurlzqsovmwchwjowkwxuq ; /usr/bin/python3
Feb 20 07:48:59 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 20 07:48:59 np0005625204.localdomain sudo[43926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:59 np0005625204.localdomain python3[43928]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:48:59 np0005625204.localdomain sudo[43926]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:59 np0005625204.localdomain sudo[43942]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgjjgpxbzfasvsuchanaslrlnlvigxhb ; /usr/bin/python3
Feb 20 07:48:59 np0005625204.localdomain sudo[43942]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:48:59 np0005625204.localdomain python3[43944]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:48:59 np0005625204.localdomain sudo[43942]: pam_unix(sudo:session): session closed for user root
Feb 20 07:48:59 np0005625204.localdomain sudo[43958]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqpzxwrwrjsddukyxjpujyuavnofuwae ; /usr/bin/python3
Feb 20 07:48:59 np0005625204.localdomain sudo[43958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:00 np0005625204.localdomain python3[43960]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:00 np0005625204.localdomain sudo[43958]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:00 np0005625204.localdomain sudo[43974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drijdtjwozzdkqyxnszsmjvrkfacatyt ; /usr/bin/python3
Feb 20 07:49:00 np0005625204.localdomain sudo[43974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:00 np0005625204.localdomain python3[43976]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:49:00 np0005625204.localdomain sudo[43974]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:00 np0005625204.localdomain sudo[43990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkqfmwjphofmzvdaeleekaldvotfhvwn ; /usr/bin/python3
Feb 20 07:49:00 np0005625204.localdomain sudo[43990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:00 np0005625204.localdomain python3[43992]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:00 np0005625204.localdomain sudo[43990]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:01 np0005625204.localdomain sudo[44007]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gstyxvwfhfsqlvxusttzueunlbcivyvz ; /usr/bin/python3
Feb 20 07:49:01 np0005625204.localdomain sudo[44007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:01 np0005625204.localdomain python3[44009]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:49:01 np0005625204.localdomain sshd[44011]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:03 np0005625204.localdomain sshd[44011]: Invalid user joe from 185.246.128.171 port 38071
Feb 20 07:49:03 np0005625204.localdomain sshd[44013]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:03 np0005625204.localdomain sshd[44011]: Disconnecting invalid user joe 185.246.128.171 port 38071: Change of username or service not allowed: (joe,ssh-connection) -> (casaos,ssh-connection) [preauth]
Feb 20 07:49:04 np0005625204.localdomain sshd[44013]: Invalid user n8n from 202.165.22.246 port 42912
Feb 20 07:49:04 np0005625204.localdomain sshd[44013]: Received disconnect from 202.165.22.246 port 42912:11: Bye Bye [preauth]
Feb 20 07:49:04 np0005625204.localdomain sshd[44013]: Disconnected from invalid user n8n 202.165.22.246 port 42912 [preauth]
Feb 20 07:49:05 np0005625204.localdomain sudo[44007]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:05 np0005625204.localdomain sudo[44028]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmkxuiqmyprznhmrtditrhuogtzxjuod ; /usr/bin/python3
Feb 20 07:49:05 np0005625204.localdomain sudo[44028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:05 np0005625204.localdomain python3[44030]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:05 np0005625204.localdomain sudo[44028]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:05 np0005625204.localdomain sudo[44076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcxhhzqdkplotxjrxbjjwikgvipemlar ; /usr/bin/python3
Feb 20 07:49:05 np0005625204.localdomain sudo[44076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:05 np0005625204.localdomain python3[44078]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:05 np0005625204.localdomain sudo[44076]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:06 np0005625204.localdomain sudo[44119]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svqssaksaaqstrdekmxgfxrfggurcpdn ; /usr/bin/python3
Feb 20 07:49:06 np0005625204.localdomain sudo[44119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:06 np0005625204.localdomain python3[44121]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573745.5980115-76140-43270010184730/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:06 np0005625204.localdomain sudo[44119]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:06 np0005625204.localdomain sudo[44149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsozxztcihtgihgvkuswoeesnoxljwlf ; /usr/bin/python3
Feb 20 07:49:06 np0005625204.localdomain sudo[44149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:06 np0005625204.localdomain python3[44151]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:49:06 np0005625204.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 07:49:06 np0005625204.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 20 07:49:06 np0005625204.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 20 07:49:06 np0005625204.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 20 07:49:06 np0005625204.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 20 07:49:06 np0005625204.localdomain kernel: Bridge firewalling registered
Feb 20 07:49:06 np0005625204.localdomain systemd-modules-load[44154]: Inserted module 'br_netfilter'
Feb 20 07:49:06 np0005625204.localdomain systemd-modules-load[44154]: Module 'msr' is built in
Feb 20 07:49:06 np0005625204.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 20 07:49:06 np0005625204.localdomain sudo[44149]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:06 np0005625204.localdomain sshd[44158]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:07 np0005625204.localdomain sudo[44204]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqlzgtcyuprtxaybfwmriayefrylrqls ; /usr/bin/python3
Feb 20 07:49:07 np0005625204.localdomain sudo[44204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:07 np0005625204.localdomain python3[44206]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:07 np0005625204.localdomain sudo[44204]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:07 np0005625204.localdomain sudo[44247]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhvdgwagcefoapqvxkkxkjmgiunmsyqy ; /usr/bin/python3
Feb 20 07:49:07 np0005625204.localdomain sudo[44247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:07 np0005625204.localdomain python3[44249]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573747.0509849-76179-9692904317391/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:07 np0005625204.localdomain sudo[44247]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:08 np0005625204.localdomain sudo[44277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxbrbvqdhkdzwirjxvnxgyvnfuorjvks ; /usr/bin/python3
Feb 20 07:49:08 np0005625204.localdomain sudo[44277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:08 np0005625204.localdomain python3[44279]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:08 np0005625204.localdomain sudo[44277]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:08 np0005625204.localdomain sudo[44294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dchroafsqueknbjuhdrpuxrhortenium ; /usr/bin/python3
Feb 20 07:49:08 np0005625204.localdomain sudo[44294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:08 np0005625204.localdomain python3[44296]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:08 np0005625204.localdomain sudo[44294]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:08 np0005625204.localdomain sudo[44312]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkakrezpwtlohjojlcajnzrswgmflilz ; /usr/bin/python3
Feb 20 07:49:08 np0005625204.localdomain sudo[44312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:08 np0005625204.localdomain python3[44314]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:08 np0005625204.localdomain sudo[44312]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:08 np0005625204.localdomain sudo[44330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgtuyjeulqmnoghencolxxlfsefxcyyr ; /usr/bin/python3
Feb 20 07:49:08 np0005625204.localdomain sudo[44330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:09 np0005625204.localdomain python3[44332]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:09 np0005625204.localdomain sudo[44330]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:09 np0005625204.localdomain sudo[44348]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwawfipvcptevmbzfawiozihtcylpghb ; /usr/bin/python3
Feb 20 07:49:09 np0005625204.localdomain sudo[44348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:09 np0005625204.localdomain python3[44350]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:09 np0005625204.localdomain sudo[44348]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:09 np0005625204.localdomain sshd[44364]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:09 np0005625204.localdomain sudo[44366]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeirlvqzaawuhxyatlkxlmlpuywondoi ; /usr/bin/python3
Feb 20 07:49:09 np0005625204.localdomain sudo[44366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:09 np0005625204.localdomain sshd[44364]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:49:09 np0005625204.localdomain python3[44369]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:09 np0005625204.localdomain sudo[44366]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:09 np0005625204.localdomain sudo[44384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnjvmeiofainmlkdhcwlqkzrsupdkidh ; /usr/bin/python3
Feb 20 07:49:09 np0005625204.localdomain sudo[44384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:09 np0005625204.localdomain python3[44386]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:09 np0005625204.localdomain sudo[44384]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:10 np0005625204.localdomain sudo[44402]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhpgusmroijlhtwkymugvfypjmcvhroo ; /usr/bin/python3
Feb 20 07:49:10 np0005625204.localdomain sudo[44402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:10 np0005625204.localdomain python3[44404]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:10 np0005625204.localdomain sudo[44402]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:10 np0005625204.localdomain sudo[44420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nihghoshbrmwjrpijnhnwtxwmwaulthx ; /usr/bin/python3
Feb 20 07:49:10 np0005625204.localdomain sudo[44420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:10 np0005625204.localdomain python3[44422]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:10 np0005625204.localdomain sudo[44420]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:10 np0005625204.localdomain sudo[44438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyybjacojjgyevbbuqugtyvrkhhrqbmv ; /usr/bin/python3
Feb 20 07:49:10 np0005625204.localdomain sudo[44438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:10 np0005625204.localdomain sshd[44158]: Invalid user casaos from 185.246.128.171 port 64572
Feb 20 07:49:10 np0005625204.localdomain python3[44440]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:10 np0005625204.localdomain sudo[44438]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:11 np0005625204.localdomain sudo[44456]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvqouinlfqdwahcqlfidzxaxpnrtnqef ; /usr/bin/python3
Feb 20 07:49:11 np0005625204.localdomain sudo[44456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:11 np0005625204.localdomain python3[44458]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:11 np0005625204.localdomain sudo[44456]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:11 np0005625204.localdomain sudo[44474]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecpolgavehzefkdlrvtxltplckcizype ; /usr/bin/python3
Feb 20 07:49:11 np0005625204.localdomain sudo[44474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:11 np0005625204.localdomain python3[44476]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:11 np0005625204.localdomain sudo[44474]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:11 np0005625204.localdomain sudo[44492]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmddljyqyqggpmpnptocqbdhnnjquzak ; /usr/bin/python3
Feb 20 07:49:11 np0005625204.localdomain sudo[44492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:11 np0005625204.localdomain python3[44494]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:11 np0005625204.localdomain sudo[44492]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:11 np0005625204.localdomain sshd[44158]: Disconnecting invalid user casaos 185.246.128.171 port 64572: Change of username or service not allowed: (casaos,ssh-connection) -> (anonymous,ssh-connection) [preauth]
Feb 20 07:49:12 np0005625204.localdomain sudo[44510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcybdssaoyuuyimsmgpgpyfcbllzsnyx ; /usr/bin/python3
Feb 20 07:49:12 np0005625204.localdomain sudo[44510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:12 np0005625204.localdomain python3[44512]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:12 np0005625204.localdomain sudo[44510]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:12 np0005625204.localdomain sudo[44527]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orvtnaklcqibfljdodfxmwsikvptbvur ; /usr/bin/python3
Feb 20 07:49:12 np0005625204.localdomain sudo[44527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:12 np0005625204.localdomain python3[44529]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:12 np0005625204.localdomain sudo[44527]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:12 np0005625204.localdomain sudo[44544]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyxztnwxfazdbmyrcvdthnjuyevjtoih ; /usr/bin/python3
Feb 20 07:49:12 np0005625204.localdomain sudo[44544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:12 np0005625204.localdomain python3[44546]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:12 np0005625204.localdomain sudo[44544]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:12 np0005625204.localdomain sudo[44561]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrrzahvmgpnipkhqdlabywkstkriqczg ; /usr/bin/python3
Feb 20 07:49:12 np0005625204.localdomain sudo[44561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:13 np0005625204.localdomain python3[44563]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:13 np0005625204.localdomain sudo[44561]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:13 np0005625204.localdomain sshd[44565]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:13 np0005625204.localdomain sudo[44579]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynqdrsvtusvgvkqzwabtuhvayjgcbtep ; /usr/bin/python3
Feb 20 07:49:13 np0005625204.localdomain sudo[44579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:13 np0005625204.localdomain python3[44581]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 20 07:49:13 np0005625204.localdomain sudo[44579]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:13 np0005625204.localdomain sudo[44598]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmgtnmulkjwwbyetuqullcagovwdlpwd ; /usr/bin/python3
Feb 20 07:49:13 np0005625204.localdomain sudo[44598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:13 np0005625204.localdomain python3[44600]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 07:49:13 np0005625204.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 20 07:49:13 np0005625204.localdomain systemd[1]: Stopped Apply Kernel Variables.
Feb 20 07:49:13 np0005625204.localdomain systemd[1]: Stopping Apply Kernel Variables...
Feb 20 07:49:13 np0005625204.localdomain systemd[1]: Starting Apply Kernel Variables...
Feb 20 07:49:13 np0005625204.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 20 07:49:13 np0005625204.localdomain systemd[1]: Finished Apply Kernel Variables.
Feb 20 07:49:13 np0005625204.localdomain sudo[44598]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:14 np0005625204.localdomain sudo[44618]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pipsppogpvploszgjnudhtbnmtdbiynh ; /usr/bin/python3
Feb 20 07:49:14 np0005625204.localdomain sudo[44618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:14 np0005625204.localdomain python3[44620]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:14 np0005625204.localdomain sudo[44618]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:14 np0005625204.localdomain sudo[44634]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zszfeernkxifsucrahyskdzcnbnqzagj ; /usr/bin/python3
Feb 20 07:49:14 np0005625204.localdomain sudo[44634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:14 np0005625204.localdomain python3[44636]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:14 np0005625204.localdomain sudo[44634]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:14 np0005625204.localdomain sudo[44650]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpldfyhfvfmqdsjnirelmjfazpmqntlg ; /usr/bin/python3
Feb 20 07:49:14 np0005625204.localdomain sudo[44650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:14 np0005625204.localdomain python3[44652]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:14 np0005625204.localdomain sudo[44650]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:15 np0005625204.localdomain sshd[44565]: Invalid user anonymous from 185.246.128.171 port 32333
Feb 20 07:49:15 np0005625204.localdomain sudo[44666]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgnyxdfmoadorpbcrtucfssbcxmjzvae ; /usr/bin/python3
Feb 20 07:49:15 np0005625204.localdomain sudo[44666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:15 np0005625204.localdomain python3[44668]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:49:15 np0005625204.localdomain sudo[44666]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:15 np0005625204.localdomain sudo[44682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qemhbimsyrsikeeoxcjsyotgxdxupbxv ; /usr/bin/python3
Feb 20 07:49:15 np0005625204.localdomain sudo[44682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:15 np0005625204.localdomain python3[44684]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:15 np0005625204.localdomain sudo[44682]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:15 np0005625204.localdomain sudo[44698]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myepcoyfnnjrarbjwjadedkliczetizo ; /usr/bin/python3
Feb 20 07:49:15 np0005625204.localdomain sudo[44698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:15 np0005625204.localdomain python3[44700]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:15 np0005625204.localdomain sudo[44698]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:16 np0005625204.localdomain sudo[44714]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knscxrbfkalrjgdtnvvgmctbxjrdvtol ; /usr/bin/python3
Feb 20 07:49:16 np0005625204.localdomain sudo[44714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:16 np0005625204.localdomain python3[44716]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:16 np0005625204.localdomain sudo[44714]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:16 np0005625204.localdomain sudo[44730]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnjywzarfzfyugmxpconogilcmwlaumj ; /usr/bin/python3
Feb 20 07:49:16 np0005625204.localdomain sudo[44730]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:16 np0005625204.localdomain python3[44732]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:16 np0005625204.localdomain sudo[44730]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:16 np0005625204.localdomain sudo[44746]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csxtldpjkbxgwzgxtncmekaypskrewta ; /usr/bin/python3
Feb 20 07:49:16 np0005625204.localdomain sudo[44746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:16 np0005625204.localdomain python3[44748]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:16 np0005625204.localdomain sudo[44746]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:17 np0005625204.localdomain sudo[44794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oekgnxxhwwhcneytmkswxasepelzbxly ; /usr/bin/python3
Feb 20 07:49:17 np0005625204.localdomain sudo[44794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:17 np0005625204.localdomain python3[44796]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:17 np0005625204.localdomain sudo[44794]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:17 np0005625204.localdomain sudo[44837]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqvjalpuxrrqvycleuwqfiqwqtayllqm ; /usr/bin/python3
Feb 20 07:49:17 np0005625204.localdomain sudo[44837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:17 np0005625204.localdomain python3[44839]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573757.0013485-76569-164355237576144/source _original_basename=tmpysrkc7db follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:17 np0005625204.localdomain sudo[44837]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:17 np0005625204.localdomain sudo[44867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usjafrpwzpffpshorlgytmopqxmvayvg ; /usr/bin/python3
Feb 20 07:49:17 np0005625204.localdomain sudo[44867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:18 np0005625204.localdomain python3[44869]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:18 np0005625204.localdomain sudo[44867]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:18 np0005625204.localdomain sshd[44565]: error: maximum authentication attempts exceeded for invalid user anonymous from 185.246.128.171 port 32333 ssh2 [preauth]
Feb 20 07:49:18 np0005625204.localdomain sshd[44565]: Disconnecting invalid user anonymous 185.246.128.171 port 32333: Too many authentication failures [preauth]
Feb 20 07:49:19 np0005625204.localdomain sudo[44884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktcfeiaokllczjeydsdqqyhbkjorqgfa ; /usr/bin/python3
Feb 20 07:49:19 np0005625204.localdomain sudo[44884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:19 np0005625204.localdomain python3[44886]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:19 np0005625204.localdomain sudo[44884]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:20 np0005625204.localdomain sudo[44932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqtyfucqzdanyduihpdwgglwhkpnacmf ; /usr/bin/python3
Feb 20 07:49:20 np0005625204.localdomain sudo[44932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:20 np0005625204.localdomain python3[44934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:20 np0005625204.localdomain sudo[44932]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:20 np0005625204.localdomain sudo[44975]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsqkxbmffkvemzcobiqwxowrcugrdevq ; /usr/bin/python3
Feb 20 07:49:20 np0005625204.localdomain sudo[44975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:20 np0005625204.localdomain python3[44977]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573760.2712302-76911-84892213949368/source _original_basename=tmphskfpoqi follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:20 np0005625204.localdomain sudo[44975]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:21 np0005625204.localdomain sudo[45005]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwwkzydavfsbdaiclauhcswfzzgnwehz ; /usr/bin/python3
Feb 20 07:49:21 np0005625204.localdomain sshd[45007]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:21 np0005625204.localdomain sudo[45005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:21 np0005625204.localdomain python3[45008]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:21 np0005625204.localdomain sudo[45005]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:21 np0005625204.localdomain sudo[45022]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkpqbmkgvczwotoxvwsyzqviqjrxjtte ; /usr/bin/python3
Feb 20 07:49:21 np0005625204.localdomain sudo[45022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:21 np0005625204.localdomain python3[45024]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:21 np0005625204.localdomain sudo[45022]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:21 np0005625204.localdomain sudo[45038]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcupqnnpliqiedoakxqnrneowsxtutkh ; /usr/bin/python3
Feb 20 07:49:21 np0005625204.localdomain sudo[45038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:22 np0005625204.localdomain python3[45040]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:22 np0005625204.localdomain sudo[45041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:49:22 np0005625204.localdomain sudo[45038]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:22 np0005625204.localdomain sudo[45041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:49:22 np0005625204.localdomain sudo[45041]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:22 np0005625204.localdomain sudo[45056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 07:49:22 np0005625204.localdomain sudo[45056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:49:22 np0005625204.localdomain sudo[45084]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akrkrvrwjuesvqjhmyydddfunqgqfpnl ; /usr/bin/python3
Feb 20 07:49:22 np0005625204.localdomain sudo[45084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:22 np0005625204.localdomain python3[45086]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:22 np0005625204.localdomain sudo[45084]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:22 np0005625204.localdomain sudo[45115]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-judjwyduigpjqasicddvhjnbhxpimien ; /usr/bin/python3
Feb 20 07:49:22 np0005625204.localdomain sudo[45115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:22 np0005625204.localdomain sudo[45056]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:22 np0005625204.localdomain python3[45122]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:22 np0005625204.localdomain sudo[45115]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:22 np0005625204.localdomain sudo[45137]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcqmnmwuelldnawpghuahhpnolpcakal ; /usr/bin/python3
Feb 20 07:49:22 np0005625204.localdomain sudo[45137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:23 np0005625204.localdomain sudo[45140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:49:23 np0005625204.localdomain sudo[45140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:49:23 np0005625204.localdomain sudo[45140]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:23 np0005625204.localdomain python3[45139]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:23 np0005625204.localdomain sudo[45137]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:23 np0005625204.localdomain sudo[45155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:49:23 np0005625204.localdomain sudo[45155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:49:23 np0005625204.localdomain sudo[45183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymsxdmdyyokervopzmsefdxrvhtuxjsd ; /usr/bin/python3
Feb 20 07:49:23 np0005625204.localdomain sudo[45183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:23 np0005625204.localdomain python3[45185]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:23 np0005625204.localdomain sudo[45183]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:23 np0005625204.localdomain sudo[45216]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llmygxggwdoqqzjmuklihcedkjautlmw ; /usr/bin/python3
Feb 20 07:49:23 np0005625204.localdomain sudo[45216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:23 np0005625204.localdomain sudo[45155]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:23 np0005625204.localdomain python3[45220]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:23 np0005625204.localdomain sudo[45216]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:23 np0005625204.localdomain sudo[45246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnvsqahnpiwlqoedcnzqboictxmsjtnc ; /usr/bin/python3
Feb 20 07:49:23 np0005625204.localdomain sudo[45246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:23 np0005625204.localdomain python3[45248]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:24 np0005625204.localdomain sudo[45246]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:24 np0005625204.localdomain sshd[45007]: Invalid user anonymous from 185.246.128.171 port 9674
Feb 20 07:49:24 np0005625204.localdomain sudo[45262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnfrvxlzdshgpzejenkasvmckfzmldut ; /usr/bin/python3
Feb 20 07:49:24 np0005625204.localdomain sudo[45262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:24 np0005625204.localdomain sudo[45264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:49:24 np0005625204.localdomain sudo[45264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:49:24 np0005625204.localdomain sudo[45264]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:24 np0005625204.localdomain python3[45272]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Feb 20 07:49:24 np0005625204.localdomain groupadd[45280]: group added to /etc/group: name=qemu, GID=107
Feb 20 07:49:24 np0005625204.localdomain groupadd[45280]: group added to /etc/gshadow: name=qemu
Feb 20 07:49:24 np0005625204.localdomain groupadd[45280]: new group: name=qemu, GID=107
Feb 20 07:49:24 np0005625204.localdomain sudo[45262]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:24 np0005625204.localdomain sudo[45299]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqcojxxytefkiuyoteuinlurujstohkq ; /usr/bin/python3
Feb 20 07:49:24 np0005625204.localdomain sudo[45299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:24 np0005625204.localdomain python3[45301]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 20 07:49:24 np0005625204.localdomain useradd[45303]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Feb 20 07:49:24 np0005625204.localdomain sudo[45299]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:25 np0005625204.localdomain sudo[45323]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klsubapcjwkzgoxrepffnhmwhtrmyvth ; /usr/bin/python3
Feb 20 07:49:25 np0005625204.localdomain sudo[45323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:25 np0005625204.localdomain python3[45325]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Feb 20 07:49:25 np0005625204.localdomain sudo[45323]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:25 np0005625204.localdomain sudo[45339]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmrygzkxaszhujawhibilocxdraaahik ; /usr/bin/python3
Feb 20 07:49:25 np0005625204.localdomain sudo[45339]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:25 np0005625204.localdomain python3[45341]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:25 np0005625204.localdomain sudo[45339]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:25 np0005625204.localdomain sudo[45388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snxcnykhhzirdgmvyawssxnezwugiygv ; /usr/bin/python3
Feb 20 07:49:25 np0005625204.localdomain sudo[45388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:26 np0005625204.localdomain python3[45390]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:26 np0005625204.localdomain sudo[45388]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:26 np0005625204.localdomain sudo[45431]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grthxlygbkdxidjiyzsyokzwcbqvavpq ; /usr/bin/python3
Feb 20 07:49:26 np0005625204.localdomain sudo[45431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:26 np0005625204.localdomain python3[45433]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573765.7865148-77198-41640684322259/source _original_basename=tmpt9istg74 follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:26 np0005625204.localdomain sudo[45431]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:26 np0005625204.localdomain sudo[45461]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evuyudjfohadcbzwzxqkdblwjjpqtwmm ; /usr/bin/python3
Feb 20 07:49:26 np0005625204.localdomain sudo[45461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:26 np0005625204.localdomain python3[45463]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 20 07:49:27 np0005625204.localdomain sshd[45007]: Disconnecting invalid user anonymous 185.246.128.171 port 9674: Change of username or service not allowed: (anonymous,ssh-connection) -> (2,ssh-connection) [preauth]
Feb 20 07:49:27 np0005625204.localdomain sudo[45461]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:27 np0005625204.localdomain sudo[45481]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hicfkwmysxilzpqqggavfuyjvcqbncwp ; /usr/bin/python3
Feb 20 07:49:27 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 20 07:49:27 np0005625204.localdomain sudo[45481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:27 np0005625204.localdomain python3[45483]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:27 np0005625204.localdomain sudo[45481]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:28 np0005625204.localdomain sudo[45497]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfgvhdxadtmatimbqntniskxzbhkbvcy ; /usr/bin/python3
Feb 20 07:49:28 np0005625204.localdomain sudo[45497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:28 np0005625204.localdomain python3[45499]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:28 np0005625204.localdomain sudo[45497]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:28 np0005625204.localdomain sudo[45513]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzrjrtbmibjmjgqcppepovibnqsbobio ; /usr/bin/python3
Feb 20 07:49:28 np0005625204.localdomain sudo[45513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:28 np0005625204.localdomain python3[45515]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Feb 20 07:49:29 np0005625204.localdomain sudo[45513]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:29 np0005625204.localdomain sshd[45535]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:29 np0005625204.localdomain sudo[45534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmlkvffqvdcxxtpwerqijnvzoafpcabh ; /usr/bin/python3
Feb 20 07:49:29 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 20 07:49:29 np0005625204.localdomain sudo[45534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:30 np0005625204.localdomain python3[45537]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:49:32 np0005625204.localdomain sudo[45534]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:33 np0005625204.localdomain sudo[45553]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-geevuihhkbfnhxahuxojbnzcdxkqlsaf ; /usr/bin/python3
Feb 20 07:49:33 np0005625204.localdomain sudo[45553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:33 np0005625204.localdomain python3[45555]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 07:49:33 np0005625204.localdomain sudo[45553]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:33 np0005625204.localdomain sudo[45614]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grhovzehhoepmdokgrudvisosesmdcbr ; /usr/bin/python3
Feb 20 07:49:33 np0005625204.localdomain sudo[45614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:33 np0005625204.localdomain sshd[45535]: Invalid user 2 from 185.246.128.171 port 53400
Feb 20 07:49:34 np0005625204.localdomain python3[45616]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:34 np0005625204.localdomain sudo[45614]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:34 np0005625204.localdomain sudo[45630]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prfinzlpjnxurvqoidbfzsakwlwcsxla ; /usr/bin/python3
Feb 20 07:49:34 np0005625204.localdomain sudo[45630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:34 np0005625204.localdomain python3[45632]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:34 np0005625204.localdomain sudo[45630]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:34 np0005625204.localdomain sshd[45535]: Disconnecting invalid user 2 185.246.128.171 port 53400: Change of username or service not allowed: (2,ssh-connection) -> (tmp,ssh-connection) [preauth]
Feb 20 07:49:34 np0005625204.localdomain sudo[45689]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryuypzutrwegrtvbkwfemhckglkqhgnc ; /usr/bin/python3
Feb 20 07:49:34 np0005625204.localdomain sudo[45689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:35 np0005625204.localdomain python3[45691]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:35 np0005625204.localdomain sudo[45689]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:35 np0005625204.localdomain sudo[45732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhvpjwhtrjxoamkizlzzbzawztskmjpd ; /usr/bin/python3
Feb 20 07:49:35 np0005625204.localdomain sudo[45732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:35 np0005625204.localdomain python3[45734]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573774.6713738-77580-66630746460930/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=0c6402bd7c36c2824760eb4c5e728ced7f603318 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:35 np0005625204.localdomain sudo[45732]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:35 np0005625204.localdomain sudo[45794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bljkaeyxwpjevjhjakbsmxzptafvwzkh ; /usr/bin/python3
Feb 20 07:49:35 np0005625204.localdomain sudo[45794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:36 np0005625204.localdomain python3[45796]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:36 np0005625204.localdomain sudo[45794]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:36 np0005625204.localdomain sudo[45839]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-watwceapnrzunfpsftlundksypbhhguu ; /usr/bin/python3
Feb 20 07:49:36 np0005625204.localdomain sudo[45839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:36 np0005625204.localdomain python3[45841]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573775.7108946-77635-83902544214929/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:36 np0005625204.localdomain sudo[45839]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:36 np0005625204.localdomain sudo[45869]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqwqzmyqeorkkyessnnzxqqzikoloncw ; /usr/bin/python3
Feb 20 07:49:36 np0005625204.localdomain sudo[45869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:36 np0005625204.localdomain python3[45871]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:36 np0005625204.localdomain sudo[45869]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:37 np0005625204.localdomain sudo[45885]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bylwmedmnwppwbrvrjjdvkpdwxyzbmqi ; /usr/bin/python3
Feb 20 07:49:37 np0005625204.localdomain sudo[45885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:37 np0005625204.localdomain python3[45887]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:37 np0005625204.localdomain sudo[45885]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:37 np0005625204.localdomain sudo[45901]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdyxnuxfurxsgtunetbmfmdqdudncawb ; /usr/bin/python3
Feb 20 07:49:37 np0005625204.localdomain sudo[45901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:37 np0005625204.localdomain python3[45903]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:37 np0005625204.localdomain sudo[45901]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:37 np0005625204.localdomain sudo[45917]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkptpwkslwgguuvsvglgmcseuwjqxjew ; /usr/bin/python3
Feb 20 07:49:37 np0005625204.localdomain sudo[45917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:37 np0005625204.localdomain python3[45919]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:37 np0005625204.localdomain sudo[45917]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:38 np0005625204.localdomain sudo[45965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmskauwgiusrwokurhlafuhzbdkbzpft ; /usr/bin/python3
Feb 20 07:49:38 np0005625204.localdomain sudo[45965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:38 np0005625204.localdomain sshd[45968]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:38 np0005625204.localdomain python3[45967]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:38 np0005625204.localdomain sudo[45965]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:38 np0005625204.localdomain sudo[46009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pijlauinxycnrvgliycdgjsuwnodkenv ; /usr/bin/python3
Feb 20 07:49:38 np0005625204.localdomain sudo[46009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:38 np0005625204.localdomain python3[46011]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573778.3062062-77751-104110974457188/source _original_basename=tmp41pyhktf follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:38 np0005625204.localdomain sudo[46009]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:39 np0005625204.localdomain sudo[46039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxpepjqeplxnjzbtoszciiyltctiwdjz ; /usr/bin/python3
Feb 20 07:49:39 np0005625204.localdomain sudo[46039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:39 np0005625204.localdomain python3[46041]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:39 np0005625204.localdomain sudo[46039]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:39 np0005625204.localdomain sudo[46055]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thjexywcxtywzlppqrdelpachczrcnmb ; /usr/bin/python3
Feb 20 07:49:39 np0005625204.localdomain sudo[46055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:39 np0005625204.localdomain python3[46057]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:39 np0005625204.localdomain sudo[46055]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:40 np0005625204.localdomain sudo[46072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erukcqrlsdxueewqjixjyjodpwooojtr ; /usr/bin/python3
Feb 20 07:49:40 np0005625204.localdomain sudo[46072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:40 np0005625204.localdomain python3[46074]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:49:41 np0005625204.localdomain sshd[45968]: Invalid user tmp from 185.246.128.171 port 32635
Feb 20 07:49:41 np0005625204.localdomain sshd[45968]: Disconnecting invalid user tmp 185.246.128.171 port 32635: Change of username or service not allowed: (tmp,ssh-connection) -> (factorio,ssh-connection) [preauth]
Feb 20 07:49:43 np0005625204.localdomain sudo[46072]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:44 np0005625204.localdomain sshd[46108]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:44 np0005625204.localdomain sudo[46122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moctnkzdfcotgcueqvhxfodwnctmvwuf ; /usr/bin/python3
Feb 20 07:49:44 np0005625204.localdomain sudo[46122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:44 np0005625204.localdomain python3[46124]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:44 np0005625204.localdomain sudo[46122]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:44 np0005625204.localdomain sudo[46167]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcapzihgbiyuweekwkupqdypimjdmozb ; /usr/bin/python3
Feb 20 07:49:44 np0005625204.localdomain sudo[46167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:44 np0005625204.localdomain python3[46169]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573783.9418566-77982-198255911223850/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:44 np0005625204.localdomain sudo[46167]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:44 np0005625204.localdomain sudo[46199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pstepzqvhbgtabftocmyipzsygyvrlrs ; /usr/bin/python3
Feb 20 07:49:44 np0005625204.localdomain sudo[46199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:45 np0005625204.localdomain python3[46201]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:49:45 np0005625204.localdomain sshd[1132]: Received signal 15; terminating.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: sshd.service: Unit process 46108 (sshd) remains running after unit stopped.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: sshd.service: Unit process 46171 (sshd) remains running after unit stopped.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: sshd.service: Consumed 13.674s CPU time, read 1.9M from disk, written 1.1M to disk.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 20 07:49:45 np0005625204.localdomain sshd[46205]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:45 np0005625204.localdomain sshd[46205]: Server listening on 0.0.0.0 port 22.
Feb 20 07:49:45 np0005625204.localdomain sshd[46205]: Server listening on :: port 22.
Feb 20 07:49:45 np0005625204.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 20 07:49:45 np0005625204.localdomain sudo[46199]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:45 np0005625204.localdomain sudo[46219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhnfrluqlkhgcuajumzsviaunmiqfurq ; /usr/bin/python3
Feb 20 07:49:45 np0005625204.localdomain sudo[46219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:45 np0005625204.localdomain python3[46221]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:45 np0005625204.localdomain sudo[46219]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:46 np0005625204.localdomain sudo[46237]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtzwehgcyyivhbuqwoasodwrpqxfyaqs ; /usr/bin/python3
Feb 20 07:49:46 np0005625204.localdomain sudo[46237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:46 np0005625204.localdomain python3[46239]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:46 np0005625204.localdomain sudo[46237]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:46 np0005625204.localdomain sudo[46255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usdibemwbvnplcwdogltrsdyfoojacjf ; /usr/bin/python3
Feb 20 07:49:46 np0005625204.localdomain sudo[46255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:46 np0005625204.localdomain sshd[46108]: Invalid user factorio from 185.246.128.171 port 62266
Feb 20 07:49:46 np0005625204.localdomain sshd[46258]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:46 np0005625204.localdomain python3[46257]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:49:47 np0005625204.localdomain sshd[46108]: Disconnecting invalid user factorio 185.246.128.171 port 62266: Change of username or service not allowed: (factorio,ssh-connection) -> (off,ssh-connection) [preauth]
Feb 20 07:49:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 07:49:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3258 writes, 16K keys, 3258 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3258 writes, 145 syncs, 22.47 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3258 writes, 16K keys, 3258 commit groups, 1.0 writes per commit group, ingest: 14.65 MB, 0.02 MB/s
                                                          Interval WAL: 3258 writes, 145 syncs, 22.47 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 07:49:47 np0005625204.localdomain sshd[46261]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:48 np0005625204.localdomain sshd[46258]: Received disconnect from 101.36.109.176 port 50170:11: Bye Bye [preauth]
Feb 20 07:49:48 np0005625204.localdomain sshd[46258]: Disconnected from authenticating user root 101.36.109.176 port 50170 [preauth]
Feb 20 07:49:49 np0005625204.localdomain sudo[46255]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:49 np0005625204.localdomain sshd[46261]: Invalid user off from 185.246.128.171 port 16069
Feb 20 07:49:50 np0005625204.localdomain sudo[46308]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xcvydnuekgcrqennqlmejmscqmrbyzot ; /usr/bin/python3
Feb 20 07:49:50 np0005625204.localdomain sudo[46308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:50 np0005625204.localdomain python3[46310]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:50 np0005625204.localdomain sudo[46308]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:50 np0005625204.localdomain sudo[46326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owfqyhddubksrdbgqlouqtqlpgnsmgay ; /usr/bin/python3
Feb 20 07:49:50 np0005625204.localdomain sudo[46326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:50 np0005625204.localdomain python3[46328]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:50 np0005625204.localdomain sudo[46326]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:50 np0005625204.localdomain sshd[46261]: Disconnecting invalid user off 185.246.128.171 port 16069: Change of username or service not allowed: (off,ssh-connection) -> (admin123,ssh-connection) [preauth]
Feb 20 07:49:51 np0005625204.localdomain sudo[46356]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ribmhruxvdqazvfefpjqnpsbgcemkmwt ; /usr/bin/python3
Feb 20 07:49:51 np0005625204.localdomain sudo[46356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 07:49:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3387 writes, 16K keys, 3387 commit groups, 1.0 writes per commit group, ingest: 15.26 MB, 0.03 MB/s
                                                          Interval WAL: 3387 writes, 198 syncs, 17.11 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 07:49:51 np0005625204.localdomain python3[46358]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:49:51 np0005625204.localdomain sudo[46356]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:52 np0005625204.localdomain sudo[46406]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyztfucrhmkomwxxddlcfboliirpavgl ; /usr/bin/python3
Feb 20 07:49:52 np0005625204.localdomain sudo[46406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:52 np0005625204.localdomain python3[46408]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:52 np0005625204.localdomain sudo[46406]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:52 np0005625204.localdomain sudo[46424]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnhzbjvhddqvghpgrpzcqtoqfmlxmtee ; /usr/bin/python3
Feb 20 07:49:52 np0005625204.localdomain sudo[46424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:52 np0005625204.localdomain python3[46426]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:52 np0005625204.localdomain sudo[46424]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:52 np0005625204.localdomain sudo[46454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzlopaxkklmwhehbfpxdrtvojlfesurf ; /usr/bin/python3
Feb 20 07:49:52 np0005625204.localdomain sudo[46454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:53 np0005625204.localdomain python3[46456]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:49:53 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:49:53 np0005625204.localdomain sshd[46458]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:53 np0005625204.localdomain systemd-sysv-generator[46488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:49:53 np0005625204.localdomain systemd-rc-local-generator[46483]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:49:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:49:53 np0005625204.localdomain systemd[1]: Starting chronyd online sources service...
Feb 20 07:49:53 np0005625204.localdomain chronyc[46498]: 200 OK
Feb 20 07:49:53 np0005625204.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Feb 20 07:49:53 np0005625204.localdomain systemd[1]: Finished chronyd online sources service.
Feb 20 07:49:53 np0005625204.localdomain sudo[46454]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:53 np0005625204.localdomain sudo[46512]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpidosjfxwgokghtwpbohwgliezhfocx ; /usr/bin/python3
Feb 20 07:49:53 np0005625204.localdomain sudo[46512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:53 np0005625204.localdomain python3[46514]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:53 np0005625204.localdomain chronyd[26351]: System clock was stepped by -0.000137 seconds
Feb 20 07:49:53 np0005625204.localdomain sudo[46512]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:54 np0005625204.localdomain sudo[46530]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnzgzjdygymhspfgtvjzlxfnbzhulmuh ; /usr/bin/python3
Feb 20 07:49:54 np0005625204.localdomain sudo[46530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:54 np0005625204.localdomain python3[46532]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:54 np0005625204.localdomain sudo[46530]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:54 np0005625204.localdomain sudo[46547]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdbyebaxzceryzkzvcqijhmkngnpclbk ; /usr/bin/python3
Feb 20 07:49:54 np0005625204.localdomain sudo[46547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:54 np0005625204.localdomain python3[46549]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:54 np0005625204.localdomain chronyd[26351]: System clock was stepped by 0.000000 seconds
Feb 20 07:49:54 np0005625204.localdomain sudo[46547]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:54 np0005625204.localdomain sudo[46564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxjkmbhjgqbkaylcorqasaqsjkbzbsbw ; /usr/bin/python3
Feb 20 07:49:54 np0005625204.localdomain sudo[46564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:54 np0005625204.localdomain python3[46566]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:54 np0005625204.localdomain sudo[46564]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:55 np0005625204.localdomain sudo[46581]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrycipjgokebjmvbuzdmzuttrqewvpaa ; /usr/bin/python3
Feb 20 07:49:55 np0005625204.localdomain sudo[46581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:55 np0005625204.localdomain python3[46583]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 20 07:49:55 np0005625204.localdomain systemd[1]: Starting Time & Date Service...
Feb 20 07:49:55 np0005625204.localdomain systemd[1]: Started Time & Date Service.
Feb 20 07:49:55 np0005625204.localdomain sudo[46581]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:55 np0005625204.localdomain sshd[46588]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:55 np0005625204.localdomain sshd[46458]: Invalid user admin123 from 185.246.128.171 port 43544
Feb 20 07:49:55 np0005625204.localdomain sshd[46588]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:49:56 np0005625204.localdomain sshd[46458]: Disconnecting invalid user admin123 185.246.128.171 port 43544: Change of username or service not allowed: (admin123,ssh-connection) -> (ss,ssh-connection) [preauth]
Feb 20 07:49:56 np0005625204.localdomain sudo[46603]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avtgfsdbscjhhgdxlwtenlwsahklbfwn ; /usr/bin/python3
Feb 20 07:49:56 np0005625204.localdomain sudo[46603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:56 np0005625204.localdomain python3[46605]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:56 np0005625204.localdomain sudo[46603]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:56 np0005625204.localdomain sudo[46620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nssinnavvucclrxkifhnydpncdnpedyu ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 20 07:49:56 np0005625204.localdomain sudo[46620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:57 np0005625204.localdomain python3[46622]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:49:57 np0005625204.localdomain sudo[46620]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:57 np0005625204.localdomain sshd[46624]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:49:57 np0005625204.localdomain sudo[46638]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjpntckkpzkznhmfmekzyhuxcrzullsa ; /usr/bin/python3
Feb 20 07:49:57 np0005625204.localdomain sudo[46638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:57 np0005625204.localdomain python3[46640]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 20 07:49:57 np0005625204.localdomain sudo[46638]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:57 np0005625204.localdomain sudo[46654]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvecejhkvlcdbdmdwgozhiesnoxxvaut ; /usr/bin/python3
Feb 20 07:49:57 np0005625204.localdomain sudo[46654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:58 np0005625204.localdomain python3[46656]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:49:58 np0005625204.localdomain sudo[46654]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:58 np0005625204.localdomain sudo[46670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbzabbfvqtxsskttxhbslfreuuaxxdqc ; /usr/bin/python3
Feb 20 07:49:58 np0005625204.localdomain sudo[46670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:58 np0005625204.localdomain python3[46672]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:58 np0005625204.localdomain sudo[46670]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:58 np0005625204.localdomain sudo[46687]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpznwwaxrvpjzxebclqtppcadtnfdmaj ; /usr/bin/python3
Feb 20 07:49:58 np0005625204.localdomain sudo[46687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:59 np0005625204.localdomain python3[46689]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:49:59 np0005625204.localdomain sudo[46687]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:59 np0005625204.localdomain sudo[46735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dookfjvpbqulctcoodjxzmqmknthfsee ; /usr/bin/python3
Feb 20 07:49:59 np0005625204.localdomain sudo[46735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:59 np0005625204.localdomain python3[46737]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:49:59 np0005625204.localdomain sudo[46735]: pam_unix(sudo:session): session closed for user root
Feb 20 07:49:59 np0005625204.localdomain sudo[46778]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpieqbqmwklchqfqivlkzuaaaignreay ; /usr/bin/python3
Feb 20 07:49:59 np0005625204.localdomain sudo[46778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:49:59 np0005625204.localdomain python3[46780]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573799.1928504-79045-211679343598931/source _original_basename=tmpsefv5d1g follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:49:59 np0005625204.localdomain sudo[46778]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:00 np0005625204.localdomain sudo[46840]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuvdmanyssdearbehlnaewjydeexrohm ; /usr/bin/python3
Feb 20 07:50:00 np0005625204.localdomain sudo[46840]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:00 np0005625204.localdomain sshd[46624]: Invalid user ss from 185.246.128.171 port 2616
Feb 20 07:50:00 np0005625204.localdomain python3[46842]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:50:00 np0005625204.localdomain sudo[46840]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:00 np0005625204.localdomain sudo[46883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bagwvjxwnqyrkxqkkugbfeztqquwrwyq ; /usr/bin/python3
Feb 20 07:50:00 np0005625204.localdomain sudo[46883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:00 np0005625204.localdomain python3[46885]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573800.085099-79098-107142424176482/source _original_basename=tmpnuzabp4h follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:50:00 np0005625204.localdomain sudo[46883]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:01 np0005625204.localdomain sudo[46913]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zstlsofblgiagxigsslszchpwzpmlzjq ; /usr/bin/python3
Feb 20 07:50:01 np0005625204.localdomain sudo[46913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:01 np0005625204.localdomain sshd[46624]: Disconnecting invalid user ss 185.246.128.171 port 2616: Change of username or service not allowed: (ss,ssh-connection) -> (cx,ssh-connection) [preauth]
Feb 20 07:50:01 np0005625204.localdomain python3[46915]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 07:50:01 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:50:01 np0005625204.localdomain systemd-sysv-generator[46942]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:50:01 np0005625204.localdomain systemd-rc-local-generator[46938]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:50:01 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:50:01 np0005625204.localdomain sudo[46913]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:02 np0005625204.localdomain sudo[46966]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tekuowaxzyhntmmrsbldpzvkmmpdyrey ; /usr/bin/python3
Feb 20 07:50:02 np0005625204.localdomain sudo[46966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:02 np0005625204.localdomain python3[46968]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:50:02 np0005625204.localdomain sudo[46966]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:02 np0005625204.localdomain sudo[46982]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yarljshesyhxjviuftuvzhscvuntzqjj ; /usr/bin/python3
Feb 20 07:50:02 np0005625204.localdomain sudo[46982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:02 np0005625204.localdomain python3[46984]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:50:02 np0005625204.localdomain sudo[46982]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:02 np0005625204.localdomain sudo[46999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alxeqibwkphoeyopsbmubgpzmgixhpnj ; /usr/bin/python3
Feb 20 07:50:02 np0005625204.localdomain sudo[46999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:02 np0005625204.localdomain python3[47001]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:50:02 np0005625204.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Feb 20 07:50:02 np0005625204.localdomain sudo[46999]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:03 np0005625204.localdomain sudo[47016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djiwjeniaovzuqmnnwihvilkrbyizvwv ; /usr/bin/python3
Feb 20 07:50:03 np0005625204.localdomain sudo[47016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:03 np0005625204.localdomain python3[47018]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:50:03 np0005625204.localdomain sudo[47016]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:03 np0005625204.localdomain sudo[47032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfwseltcgsdamrykgblxgzlyqnewtiyl ; /usr/bin/python3
Feb 20 07:50:03 np0005625204.localdomain sudo[47032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:03 np0005625204.localdomain python3[47034]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:50:03 np0005625204.localdomain sudo[47032]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:03 np0005625204.localdomain sudo[47080]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsoxckvqnmihlrwsjjgmdtblmqhsvuxw ; /usr/bin/python3
Feb 20 07:50:03 np0005625204.localdomain sudo[47080]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:04 np0005625204.localdomain python3[47082]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:50:04 np0005625204.localdomain sudo[47080]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:04 np0005625204.localdomain sudo[47123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdtcbqiejosjtzyeixalkkoengxsequy ; /usr/bin/python3
Feb 20 07:50:04 np0005625204.localdomain sudo[47123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:04 np0005625204.localdomain python3[47125]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573803.7573938-79278-60573188786964/source _original_basename=tmpcwpp_zig follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:50:04 np0005625204.localdomain sudo[47123]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:04 np0005625204.localdomain sshd[47140]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:07 np0005625204.localdomain sshd[47140]: Invalid user cx from 185.246.128.171 port 38981
Feb 20 07:50:08 np0005625204.localdomain sshd[47140]: Disconnecting invalid user cx 185.246.128.171 port 38981: Change of username or service not allowed: (cx,ssh-connection) -> (odoo18,ssh-connection) [preauth]
Feb 20 07:50:10 np0005625204.localdomain sshd[47142]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:13 np0005625204.localdomain sshd[47142]: Invalid user odoo18 from 185.246.128.171 port 6742
Feb 20 07:50:13 np0005625204.localdomain sshd[47142]: Disconnecting invalid user odoo18 185.246.128.171 port 6742: Change of username or service not allowed: (odoo18,ssh-connection) -> (admin1,ssh-connection) [preauth]
Feb 20 07:50:16 np0005625204.localdomain sshd[47144]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:20 np0005625204.localdomain sshd[47144]: Invalid user admin1 from 185.246.128.171 port 34343
Feb 20 07:50:24 np0005625204.localdomain sudo[47146]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:50:24 np0005625204.localdomain sudo[47146]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:50:24 np0005625204.localdomain sudo[47146]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:24 np0005625204.localdomain sudo[47161]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:50:24 np0005625204.localdomain sudo[47161]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:50:25 np0005625204.localdomain sudo[47161]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:25 np0005625204.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 20 07:50:25 np0005625204.localdomain sshd[47144]: error: maximum authentication attempts exceeded for invalid user admin1 from 185.246.128.171 port 34343 ssh2 [preauth]
Feb 20 07:50:25 np0005625204.localdomain sshd[47144]: Disconnecting invalid user admin1 185.246.128.171 port 34343: Too many authentication failures [preauth]
Feb 20 07:50:26 np0005625204.localdomain sudo[47210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:50:26 np0005625204.localdomain sudo[47210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:50:26 np0005625204.localdomain sudo[47210]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:27 np0005625204.localdomain sshd[47225]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:27 np0005625204.localdomain sudo[47240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-launvekzjdimehhfcixzptatvtrtpjdf ; /usr/bin/python3
Feb 20 07:50:27 np0005625204.localdomain sudo[47240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:28 np0005625204.localdomain python3[47242]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:50:28 np0005625204.localdomain sudo[47240]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:28 np0005625204.localdomain sudo[47256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekkuujpliayejkuonhqslilprvetrgve ; /usr/bin/python3
Feb 20 07:50:28 np0005625204.localdomain sudo[47256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:28 np0005625204.localdomain python3[47258]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Feb 20 07:50:28 np0005625204.localdomain sudo[47256]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:28 np0005625204.localdomain sudo[47272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsuslytzvatguipawvwyhnevhjzqkauh ; /usr/bin/python3
Feb 20 07:50:28 np0005625204.localdomain sudo[47272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:28 np0005625204.localdomain sshd[47225]: Invalid user admin1 from 185.246.128.171 port 28921
Feb 20 07:50:28 np0005625204.localdomain python3[47274]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:50:28 np0005625204.localdomain sudo[47272]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:28 np0005625204.localdomain sudo[47288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfiltrcuhphwiqxvglelwbpibcboyibe ; /usr/bin/python3
Feb 20 07:50:28 np0005625204.localdomain sudo[47288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:29 np0005625204.localdomain python3[47290]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:50:29 np0005625204.localdomain sudo[47288]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:29 np0005625204.localdomain sudo[47304]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsupbxwpcsssktsfqncucxhksrlheazk ; /usr/bin/python3
Feb 20 07:50:29 np0005625204.localdomain sudo[47304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:29 np0005625204.localdomain python3[47306]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:50:29 np0005625204.localdomain sudo[47304]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:29 np0005625204.localdomain sudo[47320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwrexckngvyjfxoazzsdfvbiaunzuzuz ; /usr/bin/python3
Feb 20 07:50:29 np0005625204.localdomain sudo[47320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:30 np0005625204.localdomain python3[47322]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 20 07:50:30 np0005625204.localdomain sshd[47225]: Disconnecting invalid user admin1 185.246.128.171 port 28921: Change of username or service not allowed: (admin1,ssh-connection) -> (user7,ssh-connection) [preauth]
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:50:30 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:50:30 np0005625204.localdomain sudo[47320]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:31 np0005625204.localdomain sudo[47341]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eouzblukbwqvbzvddvfcltahumyaiikk ; /usr/bin/python3
Feb 20 07:50:31 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 20 07:50:31 np0005625204.localdomain sudo[47341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:31 np0005625204.localdomain python3[47343]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:50:31 np0005625204.localdomain sudo[47341]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:31 np0005625204.localdomain sudo[47357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxjlwnifewugkaarjhjzfqttaqkgybqt ; /usr/bin/python3
Feb 20 07:50:31 np0005625204.localdomain sudo[47357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:31 np0005625204.localdomain sudo[47357]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:31 np0005625204.localdomain sudo[47405]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxpkeoehiubuhhzeswltzxlxtwfyweqa ; /usr/bin/python3
Feb 20 07:50:31 np0005625204.localdomain sudo[47405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:31 np0005625204.localdomain sudo[47405]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:32 np0005625204.localdomain sudo[47448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmxgkwigyjvhalkehwvlimvawyaagzsg ; /usr/bin/python3
Feb 20 07:50:32 np0005625204.localdomain sudo[47448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:32 np0005625204.localdomain sshd[47451]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:32 np0005625204.localdomain sudo[47448]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:32 np0005625204.localdomain sudo[47479]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svmafjcwzmnpwsnklssoljgdlofexfrr ; /usr/bin/python3
Feb 20 07:50:32 np0005625204.localdomain sudo[47479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:32 np0005625204.localdomain python3[47481]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Feb 20 07:50:32 np0005625204.localdomain sudo[47479]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:33 np0005625204.localdomain rsyslogd[758]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Feb 20 07:50:33 np0005625204.localdomain sudo[47496]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqomsucrsicnonwohzjfbyqdnmchakjv ; /usr/bin/python3
Feb 20 07:50:33 np0005625204.localdomain sudo[47496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:34 np0005625204.localdomain python3[47498]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:50:34 np0005625204.localdomain sudo[47496]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:34 np0005625204.localdomain sudo[47512]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfqrmifkwlrucrkaweutjjgkvffipkid ; /usr/bin/python3
Feb 20 07:50:34 np0005625204.localdomain sudo[47512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:34 np0005625204.localdomain python3[47514]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:50:34 np0005625204.localdomain sudo[47512]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:34 np0005625204.localdomain sudo[47528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gggtpisqspinbohbvqxrwxkeccxnpcrv ; /usr/bin/python3
Feb 20 07:50:34 np0005625204.localdomain sudo[47528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:34 np0005625204.localdomain python3[47530]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n -iNONE', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Feb 20 07:50:34 np0005625204.localdomain sudo[47528]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:36 np0005625204.localdomain sshd[47451]: Invalid user user7 from 185.246.128.171 port 52008
Feb 20 07:50:37 np0005625204.localdomain sshd[47451]: Disconnecting invalid user user7 185.246.128.171 port 52008: Change of username or service not allowed: (user7,ssh-connection) -> (andy,ssh-connection) [preauth]
Feb 20 07:50:38 np0005625204.localdomain sshd[47531]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:39 np0005625204.localdomain sudo[47578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eaqqfhipweiwtrnbxmxoekxtzxogudhp ; /usr/bin/python3
Feb 20 07:50:39 np0005625204.localdomain sudo[47578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:39 np0005625204.localdomain python3[47580]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:50:39 np0005625204.localdomain sudo[47578]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:39 np0005625204.localdomain sudo[47621]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbejqsjuhurfiyxbqpscqxtvopbrrxrl ; /usr/bin/python3
Feb 20 07:50:39 np0005625204.localdomain sudo[47621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:40 np0005625204.localdomain python3[47623]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573839.4949474-80839-190790480198677/source _original_basename=tmphya55sug follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:50:40 np0005625204.localdomain sudo[47621]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:40 np0005625204.localdomain sudo[47651]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzlnarwpxfgmgmhuqpzpgjtsaatmsuwb ; /usr/bin/python3
Feb 20 07:50:40 np0005625204.localdomain sudo[47651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:40 np0005625204.localdomain python3[47653]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:50:40 np0005625204.localdomain sudo[47651]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:41 np0005625204.localdomain sudo[47701]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgipfypxykyvsovgsjysdoslpfizuuol ; /usr/bin/python3
Feb 20 07:50:41 np0005625204.localdomain sudo[47701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:41 np0005625204.localdomain sudo[47701]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:41 np0005625204.localdomain sudo[47744]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xucgnalrohhgurjkbddrscxmqvwgpuoa ; /usr/bin/python3
Feb 20 07:50:41 np0005625204.localdomain sudo[47744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:41 np0005625204.localdomain sshd[47531]: Invalid user andy from 185.246.128.171 port 21782
Feb 20 07:50:41 np0005625204.localdomain sudo[47744]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:42 np0005625204.localdomain sudo[47774]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxxgicnqqsffkqqxyyqepotfjtoggdie ; /usr/bin/python3
Feb 20 07:50:42 np0005625204.localdomain sudo[47774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:42 np0005625204.localdomain python3[47776]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:50:42 np0005625204.localdomain sudo[47774]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:42 np0005625204.localdomain sshd[47777]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:42 np0005625204.localdomain sshd[47531]: Disconnecting invalid user andy 185.246.128.171 port 21782: Change of username or service not allowed: (andy,ssh-connection) -> (emby,ssh-connection) [preauth]
Feb 20 07:50:42 np0005625204.localdomain sudo[47824]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbvzfikjsohxcswjnpmfpsvzfgqpahmm ; /usr/bin/python3
Feb 20 07:50:42 np0005625204.localdomain sudo[47824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:42 np0005625204.localdomain sshd[47777]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:50:43 np0005625204.localdomain sudo[47824]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:43 np0005625204.localdomain sudo[47867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyfxrerxcautfszzzjaxmtkvzwqxhxsg ; /usr/bin/python3
Feb 20 07:50:43 np0005625204.localdomain sudo[47867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:43 np0005625204.localdomain sudo[47867]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:43 np0005625204.localdomain sudo[47897]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-miaquuohstrkidhkqzzbdjynflqgsaum ; /usr/bin/python3
Feb 20 07:50:43 np0005625204.localdomain sudo[47897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:44 np0005625204.localdomain python3[47899]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 20 07:50:44 np0005625204.localdomain sudo[47897]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:46 np0005625204.localdomain sudo[47913]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvfgwncuqovsycoqrecpoqdfefvtcvfg ; /usr/bin/python3
Feb 20 07:50:46 np0005625204.localdomain sudo[47913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:46 np0005625204.localdomain sshd[47916]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:46 np0005625204.localdomain python3[47915]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:50:46 np0005625204.localdomain sudo[47913]: pam_unix(sudo:session): session closed for user root
Feb 20 07:50:47 np0005625204.localdomain sudo[47931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeipuecrqbbbsinafwpnjextpxqixpmv ; /usr/bin/python3
Feb 20 07:50:47 np0005625204.localdomain sudo[47931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:50:47 np0005625204.localdomain python3[47933]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 07:50:51 np0005625204.localdomain sshd[47916]: Invalid user emby from 185.246.128.171 port 61436
Feb 20 07:50:51 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 07:50:51 np0005625204.localdomain dbus-broker-launch[17399]: Noticed file-system modification, trigger reload.
Feb 20 07:50:51 np0005625204.localdomain dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 20 07:50:51 np0005625204.localdomain dbus-broker-launch[17399]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 20 07:50:51 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 07:50:52 np0005625204.localdomain systemd[1]: Reexecuting.
Feb 20 07:50:52 np0005625204.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 20 07:50:52 np0005625204.localdomain systemd[1]: Detected virtualization kvm.
Feb 20 07:50:52 np0005625204.localdomain systemd[1]: Detected architecture x86-64.
Feb 20 07:50:52 np0005625204.localdomain systemd-rc-local-generator[47986]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:50:52 np0005625204.localdomain systemd-sysv-generator[47993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:50:52 np0005625204.localdomain sshd[47916]: Disconnecting invalid user emby 185.246.128.171 port 61436: Change of username or service not allowed: (emby,ssh-connection) -> (juan,ssh-connection) [preauth]
Feb 20 07:50:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:50:55 np0005625204.localdomain sshd[48009]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:50:56 np0005625204.localdomain systemd[36724]: Created slice User Background Tasks Slice.
Feb 20 07:50:56 np0005625204.localdomain systemd[36724]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 07:50:56 np0005625204.localdomain systemd[36724]: Finished Cleanup of User's Temporary Files and Directories.
Feb 20 07:50:59 np0005625204.localdomain sshd[48009]: Invalid user juan from 185.246.128.171 port 43665
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  Converting 2707 SID table entries...
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 07:51:00 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 07:51:00 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 07:51:00 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Feb 20 07:51:00 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 07:51:01 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:51:01 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 07:51:01 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:51:02 np0005625204.localdomain sshd[48009]: Disconnecting invalid user juan 185.246.128.171 port 43665: Change of username or service not allowed: (juan,ssh-connection) -> (lab,ssh-connection) [preauth]
Feb 20 07:51:02 np0005625204.localdomain systemd-sysv-generator[48070]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:51:02 np0005625204.localdomain systemd-rc-local-generator[48066]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 07:51:02 np0005625204.localdomain systemd-journald[618]: Journal stopped
Feb 20 07:51:02 np0005625204.localdomain systemd-journald[618]: Received SIGTERM from PID 1 (systemd).
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Stopping Journal Service...
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Stopped Journal Service.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: systemd-journald.service: Consumed 2.438s CPU time.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Starting Journal Service...
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: systemd-udevd.service: Consumed 2.953s CPU time.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 20 07:51:02 np0005625204.localdomain systemd-journald[48359]: Journal started
Feb 20 07:51:02 np0005625204.localdomain systemd-journald[48359]: Runtime Journal (/run/log/journal/01f46965e72fd8a157841feaa66c8d52) is 12.7M, max 314.7M, 302.0M free.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Started Journal Service.
Feb 20 07:51:02 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 20 07:51:02 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 07:51:02 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 07:51:02 np0005625204.localdomain systemd-udevd[48360]: Using default interface naming scheme 'rhel-9.0'.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 20 07:51:02 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:51:02 np0005625204.localdomain systemd-sysv-generator[49122]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:51:02 np0005625204.localdomain systemd-rc-local-generator[49119]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:51:02 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 07:51:03 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 07:51:03 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 07:51:03 np0005625204.localdomain systemd[1]: run-rc0cdca9920d74441b0a37776021354c5.service: Deactivated successfully.
Feb 20 07:51:03 np0005625204.localdomain systemd[1]: run-rc9b08c4be8c0410ea931fd2b756f1180.service: Deactivated successfully.
Feb 20 07:51:05 np0005625204.localdomain sshd[49414]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:06 np0005625204.localdomain sudo[47931]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:07 np0005625204.localdomain sudo[49431]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydqljswqfiolcnayytyjeyisrnkzguoi ; /usr/bin/python3
Feb 20 07:51:07 np0005625204.localdomain sudo[49431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:51:07 np0005625204.localdomain python3[49433]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Feb 20 07:51:07 np0005625204.localdomain sudo[49431]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:07 np0005625204.localdomain sudo[49450]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqqsbywfurefhdtdqfxxsgarveyjqnqh ; /usr/bin/python3
Feb 20 07:51:07 np0005625204.localdomain sudo[49450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:51:07 np0005625204.localdomain python3[49452]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 07:51:07 np0005625204.localdomain sudo[49450]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:08 np0005625204.localdomain sshd[49414]: Invalid user lab from 185.246.128.171 port 23993
Feb 20 07:51:08 np0005625204.localdomain sudo[49468]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpighgllgiixywubliatebbrnnlbsitg ; /usr/bin/python3
Feb 20 07:51:08 np0005625204.localdomain sudo[49468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:51:08 np0005625204.localdomain python3[49470]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:51:08 np0005625204.localdomain python3[49470]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Feb 20 07:51:08 np0005625204.localdomain python3[49470]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Feb 20 07:51:12 np0005625204.localdomain sshd[49414]: Disconnecting invalid user lab 185.246.128.171 port 23993: Change of username or service not allowed: (lab,ssh-connection) -> (oper,ssh-connection) [preauth]
Feb 20 07:51:16 np0005625204.localdomain podman[49484]: 2026-02-20 07:51:08.559441929 +0000 UTC m=+0.039625571 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 20 07:51:16 np0005625204.localdomain python3[49470]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 591bb9fb46a70e9f840f28502388406078442df6b6701a3c17990ee75e333673 --format json
Feb 20 07:51:16 np0005625204.localdomain sudo[49468]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:16 np0005625204.localdomain sshd[49573]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:19 np0005625204.localdomain sudo[49589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ataamovgtmhaaufeigufggyopjsdiops ; /usr/bin/python3
Feb 20 07:51:19 np0005625204.localdomain sudo[49589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:51:19 np0005625204.localdomain python3[49591]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:51:19 np0005625204.localdomain python3[49591]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Feb 20 07:51:19 np0005625204.localdomain python3[49591]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Feb 20 07:51:21 np0005625204.localdomain sshd[49573]: Invalid user oper from 185.246.128.171 port 21626
Feb 20 07:51:22 np0005625204.localdomain sshd[49573]: Disconnecting invalid user oper 185.246.128.171 port 21626: Change of username or service not allowed: (oper,ssh-connection) -> (thomas,ssh-connection) [preauth]
Feb 20 07:51:23 np0005625204.localdomain sshd[49641]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:23 np0005625204.localdomain sshd[49641]: Invalid user thomas from 185.246.128.171 port 53876
Feb 20 07:51:24 np0005625204.localdomain sshd[49641]: Disconnecting invalid user thomas 185.246.128.171 port 53876: Change of username or service not allowed: (thomas,ssh-connection) -> (linaro,ssh-connection) [preauth]
Feb 20 07:51:25 np0005625204.localdomain sshd[49656]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:27 np0005625204.localdomain sudo[49670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:51:27 np0005625204.localdomain sudo[49670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:51:27 np0005625204.localdomain sudo[49670]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:27 np0005625204.localdomain sudo[49685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 07:51:27 np0005625204.localdomain sudo[49685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:51:28 np0005625204.localdomain sshd[49656]: Invalid user linaro from 185.246.128.171 port 62619
Feb 20 07:51:28 np0005625204.localdomain podman[49603]: 2026-02-20 07:51:20.029869189 +0000 UTC m=+0.023600713 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 07:51:28 np0005625204.localdomain python3[49591]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d59b33e7fb841c47a47a12b18fb68b11debd968b4596c63f3177ecc7400fb1bc --format json
Feb 20 07:51:28 np0005625204.localdomain sshd[49743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:28 np0005625204.localdomain sudo[49589]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:28 np0005625204.localdomain sudo[49792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqrioceicrjebovsksfabfxvisrecnes ; /usr/bin/python3
Feb 20 07:51:28 np0005625204.localdomain sudo[49792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:51:28 np0005625204.localdomain sshd[49743]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:51:28 np0005625204.localdomain podman[49809]: 2026-02-20 07:51:28.986256823 +0000 UTC m=+0.078347269 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, release=1770267347, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 07:51:29 np0005625204.localdomain podman[49809]: 2026-02-20 07:51:29.056455279 +0000 UTC m=+0.148545685 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 07:51:29 np0005625204.localdomain sshd[49656]: Disconnecting invalid user linaro 185.246.128.171 port 62619: Change of username or service not allowed: (linaro,ssh-connection) -> (root,ssh-connection) [preauth]
Feb 20 07:51:29 np0005625204.localdomain python3[49806]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:51:29 np0005625204.localdomain python3[49806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Feb 20 07:51:29 np0005625204.localdomain sudo[49685]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:29 np0005625204.localdomain python3[49806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Feb 20 07:51:29 np0005625204.localdomain sudo[49894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:51:29 np0005625204.localdomain sudo[49894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:51:29 np0005625204.localdomain sudo[49894]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:29 np0005625204.localdomain sudo[49916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:51:29 np0005625204.localdomain sudo[49916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:51:30 np0005625204.localdomain sudo[49916]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:32 np0005625204.localdomain sshd[49988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:41 np0005625204.localdomain sudo[50289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:51:41 np0005625204.localdomain sudo[50289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:51:41 np0005625204.localdomain sudo[50289]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:45 np0005625204.localdomain sshd[49988]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 38125 ssh2 [preauth]
Feb 20 07:51:45 np0005625204.localdomain sshd[49988]: Disconnecting authenticating user root 185.246.128.171 port 38125: Too many authentication failures [preauth]
Feb 20 07:51:48 np0005625204.localdomain podman[49888]: 2026-02-20 07:51:29.369983526 +0000 UTC m=+0.035851213 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 07:51:48 np0005625204.localdomain python3[49806]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 6eddd23e1e6adfbfa713a747123707c02f92ffdbf1913da92f171aba1d6d7856 --format json
Feb 20 07:51:48 np0005625204.localdomain sudo[49792]: pam_unix(sudo:session): session closed for user root
Feb 20 07:51:48 np0005625204.localdomain sudo[50330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqpcqvylvaxjdzvbojjidkomobjlwscg ; /usr/bin/python3
Feb 20 07:51:48 np0005625204.localdomain sudo[50330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:51:48 np0005625204.localdomain sshd[50333]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:51:48 np0005625204.localdomain python3[50332]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:51:48 np0005625204.localdomain python3[50332]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Feb 20 07:51:48 np0005625204.localdomain python3[50332]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Feb 20 07:51:56 np0005625204.localdomain sshd[50333]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 52189 ssh2 [preauth]
Feb 20 07:51:56 np0005625204.localdomain sshd[50333]: Disconnecting authenticating user root 185.246.128.171 port 52189: Too many authentication failures [preauth]
Feb 20 07:52:00 np0005625204.localdomain sshd[50397]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:01 np0005625204.localdomain podman[50346]: 2026-02-20 07:51:48.535033434 +0000 UTC m=+0.028081231 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 07:52:01 np0005625204.localdomain python3[50332]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 2c8610235afe953aa46efb141a5a988799548b22280d65a7e7ab21889422df37 --format json
Feb 20 07:52:01 np0005625204.localdomain sudo[50330]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:01 np0005625204.localdomain sudo[50429]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teetjhfljwywoepggthamuzqywzkldmp ; /usr/bin/python3
Feb 20 07:52:01 np0005625204.localdomain sudo[50429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:01 np0005625204.localdomain python3[50431]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:01 np0005625204.localdomain python3[50431]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Feb 20 07:52:01 np0005625204.localdomain python3[50431]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Feb 20 07:52:08 np0005625204.localdomain sshd[50737]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:09 np0005625204.localdomain sshd[50737]: Received disconnect from 189.143.72.189 port 52268:11: Bye Bye [preauth]
Feb 20 07:52:09 np0005625204.localdomain sshd[50737]: Disconnected from authenticating user root 189.143.72.189 port 52268 [preauth]
Feb 20 07:52:09 np0005625204.localdomain sshd[50397]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 46659 ssh2 [preauth]
Feb 20 07:52:09 np0005625204.localdomain sshd[50397]: Disconnecting authenticating user root 185.246.128.171 port 46659: Too many authentication failures [preauth]
Feb 20 07:52:10 np0005625204.localdomain podman[50444]: 2026-02-20 07:52:01.771158909 +0000 UTC m=+0.046988597 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 20 07:52:10 np0005625204.localdomain python3[50431]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ab5aab6d0c3ec80926032b7acf4cec1d4710f1c2daccd17ae4daa64399ec237 --format json
Feb 20 07:52:10 np0005625204.localdomain sudo[50429]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:10 np0005625204.localdomain sudo[50790]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbtmvthkbubvmpfrafgojvaexjfezqqn ; /usr/bin/python3
Feb 20 07:52:10 np0005625204.localdomain sudo[50790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:11 np0005625204.localdomain python3[50792]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:11 np0005625204.localdomain python3[50792]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Feb 20 07:52:11 np0005625204.localdomain python3[50792]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Feb 20 07:52:11 np0005625204.localdomain sshd[50817]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:14 np0005625204.localdomain sshd[50856]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:14 np0005625204.localdomain sshd[50856]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:52:15 np0005625204.localdomain podman[50804]: 2026-02-20 07:52:11.235088033 +0000 UTC m=+0.045336466 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 20 07:52:15 np0005625204.localdomain python3[50792]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 4853142d85dba3766b28d28ae195b26f7242230fe3646e9590a7aee2dc2e0dfa --format json
Feb 20 07:52:15 np0005625204.localdomain sudo[50790]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:15 np0005625204.localdomain sudo[50884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlncoxbdjvnzsxdttiaarughjnmkyzfq ; /usr/bin/python3
Feb 20 07:52:15 np0005625204.localdomain sudo[50884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:15 np0005625204.localdomain python3[50886]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:15 np0005625204.localdomain python3[50886]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Feb 20 07:52:15 np0005625204.localdomain python3[50886]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Feb 20 07:52:17 np0005625204.localdomain sshd[50817]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 42044 ssh2 [preauth]
Feb 20 07:52:17 np0005625204.localdomain sshd[50817]: Disconnecting authenticating user root 185.246.128.171 port 42044: Too many authentication failures [preauth]
Feb 20 07:52:17 np0005625204.localdomain podman[50899]: 2026-02-20 07:52:15.73509789 +0000 UTC m=+0.042531429 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 20 07:52:17 np0005625204.localdomain python3[50886]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ac6ea63c0fb4851145e847f9ced2f20804afc8472907b63a82d5866f5cf608a --format json
Feb 20 07:52:17 np0005625204.localdomain sudo[50884]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:17 np0005625204.localdomain sudo[50974]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdexrjquvnsfxmjymzoysfqosckohodg ; /usr/bin/python3
Feb 20 07:52:17 np0005625204.localdomain sudo[50974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:18 np0005625204.localdomain python3[50976]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:18 np0005625204.localdomain python3[50976]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Feb 20 07:52:18 np0005625204.localdomain python3[50976]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Feb 20 07:52:18 np0005625204.localdomain sshd[51002]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:20 np0005625204.localdomain podman[50989]: 2026-02-20 07:52:18.229254902 +0000 UTC m=+0.043812659 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 20 07:52:20 np0005625204.localdomain python3[50976]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ba1a08ea1c1207b471b1f02cee16ff456b8a812662cce16906d16de330a66d63 --format json
Feb 20 07:52:20 np0005625204.localdomain sudo[50974]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:20 np0005625204.localdomain sudo[51067]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcazcpkqehuitkfnimnfgzfncinvhqxn ; /usr/bin/python3
Feb 20 07:52:20 np0005625204.localdomain sudo[51067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:20 np0005625204.localdomain python3[51069]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:20 np0005625204.localdomain python3[51069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Feb 20 07:52:20 np0005625204.localdomain python3[51069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Feb 20 07:52:22 np0005625204.localdomain podman[51082]: 2026-02-20 07:52:20.700589607 +0000 UTC m=+0.034146876 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 20 07:52:22 np0005625204.localdomain python3[51069]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 8576d3a17e57ea28f29435f132f583320941b5aa7bf0aa02e998b09a094d1fe8 --format json
Feb 20 07:52:22 np0005625204.localdomain sudo[51067]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:22 np0005625204.localdomain sudo[51158]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-detmrehddludsfwonpvvwdfwvekvstxf ; /usr/bin/python3
Feb 20 07:52:22 np0005625204.localdomain sudo[51158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:22 np0005625204.localdomain python3[51160]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:22 np0005625204.localdomain python3[51160]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Feb 20 07:52:23 np0005625204.localdomain python3[51160]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Feb 20 07:52:24 np0005625204.localdomain sshd[51002]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 15525 ssh2 [preauth]
Feb 20 07:52:24 np0005625204.localdomain sshd[51002]: Disconnecting authenticating user root 185.246.128.171 port 15525: Too many authentication failures [preauth]
Feb 20 07:52:25 np0005625204.localdomain sshd[51210]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:26 np0005625204.localdomain podman[51172]: 2026-02-20 07:52:23.070355153 +0000 UTC m=+0.034608610 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 07:52:26 np0005625204.localdomain python3[51160]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7fcbf63c0504494c8fcaa07583f909a06486472a0982aeac9554c6fdbeb04c9a --format json
Feb 20 07:52:26 np0005625204.localdomain sudo[51158]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:26 np0005625204.localdomain sudo[51260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chpucuiqoateqsdlfgtjfphundilbtpz ; /usr/bin/python3
Feb 20 07:52:26 np0005625204.localdomain sudo[51260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:27 np0005625204.localdomain python3[51262]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 20 07:52:27 np0005625204.localdomain python3[51262]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Feb 20 07:52:27 np0005625204.localdomain python3[51262]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Feb 20 07:52:28 np0005625204.localdomain podman[51275]: 2026-02-20 07:52:27.180270735 +0000 UTC m=+0.041764263 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 20 07:52:28 np0005625204.localdomain python3[51262]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 72ddf109f135b64d3116af7b84caaa358dc72e2e60f4c8753fa54fa65b76ba35 --format json
Feb 20 07:52:28 np0005625204.localdomain sudo[51260]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:29 np0005625204.localdomain sudo[51350]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaibgrcwpfaghomiurlirmqnrdipgnre ; /usr/bin/python3
Feb 20 07:52:29 np0005625204.localdomain sudo[51350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:29 np0005625204.localdomain python3[51352]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:52:29 np0005625204.localdomain sudo[51350]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:30 np0005625204.localdomain sudo[51400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdgdxuuysllopzcosniaoqxnwwmwqjup ; /usr/bin/python3
Feb 20 07:52:30 np0005625204.localdomain sudo[51400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:30 np0005625204.localdomain sudo[51400]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:30 np0005625204.localdomain sudo[51418]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbcsornjxwltqnokczycxjkheulsahnl ; /usr/bin/python3
Feb 20 07:52:30 np0005625204.localdomain sudo[51418]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:30 np0005625204.localdomain sudo[51418]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:30 np0005625204.localdomain sudo[51522]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfhrkxxdlqxdojtcexgmjfjycafwltwg ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573950.6461928-83846-246300545821431/async_wrapper.py 700999029294 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573950.6461928-83846-246300545821431/AnsiballZ_command.py _
Feb 20 07:52:30 np0005625204.localdomain sudo[51522]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 07:52:31 np0005625204.localdomain ansible-async_wrapper.py[51524]: Invoked with 700999029294 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573950.6461928-83846-246300545821431/AnsiballZ_command.py _
Feb 20 07:52:31 np0005625204.localdomain ansible-async_wrapper.py[51527]: Starting module and watcher
Feb 20 07:52:31 np0005625204.localdomain ansible-async_wrapper.py[51527]: Start watching 51528 (3600)
Feb 20 07:52:31 np0005625204.localdomain ansible-async_wrapper.py[51528]: Start module (51528)
Feb 20 07:52:31 np0005625204.localdomain ansible-async_wrapper.py[51524]: Return async_wrapper task started.
Feb 20 07:52:31 np0005625204.localdomain sudo[51522]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:31 np0005625204.localdomain sudo[51543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wapfqgjkczkvfppsivraaqsmvylmfxfw ; /usr/bin/python3
Feb 20 07:52:31 np0005625204.localdomain sudo[51543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:31 np0005625204.localdomain python3[51545]: ansible-ansible.legacy.async_status Invoked with jid=700999029294.51524 mode=status _async_dir=/tmp/.ansible_async
Feb 20 07:52:31 np0005625204.localdomain sudo[51543]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:33 np0005625204.localdomain sshd[51210]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 47606 ssh2 [preauth]
Feb 20 07:52:33 np0005625204.localdomain sshd[51210]: Disconnecting authenticating user root 185.246.128.171 port 47606: Too many authentication failures [preauth]
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    (file & line not available)
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    (file & line not available)
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.16 seconds
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Notice: Applied catalog in 0.05 seconds
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Application:
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    Initial environment: production
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    Converged environment: production
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:          Run mode: user
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Changes:
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:             Total: 3
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Events:
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:           Success: 3
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:             Total: 3
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Resources:
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:           Changed: 3
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:       Out of sync: 3
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:             Total: 10
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Time:
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:          Schedule: 0.00
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:              File: 0.00
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:              Exec: 0.02
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:            Augeas: 0.02
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    Transaction evaluation: 0.05
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    Catalog application: 0.05
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:    Config retrieval: 0.20
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:          Last run: 1771573955
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:        Filebucket: 0.00
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:             Total: 0.06
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]: Version:
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:            Config: 1771573955
Feb 20 07:52:35 np0005625204.localdomain puppet-user[51548]:            Puppet: 7.10.0
Feb 20 07:52:35 np0005625204.localdomain ansible-async_wrapper.py[51528]: Module complete (51528)
Feb 20 07:52:35 np0005625204.localdomain sshd[51660]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:36 np0005625204.localdomain ansible-async_wrapper.py[51527]: Done in kid B.
Feb 20 07:52:41 np0005625204.localdomain sudo[51662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:52:41 np0005625204.localdomain sudo[51662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:52:41 np0005625204.localdomain sudo[51662]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:41 np0005625204.localdomain sudo[51688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgocsefhilgfbuwyzhendghlkgwaxlae ; /usr/bin/python3
Feb 20 07:52:41 np0005625204.localdomain sudo[51688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:42 np0005625204.localdomain sudo[51693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:52:42 np0005625204.localdomain sudo[51693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:52:42 np0005625204.localdomain python3[51692]: ansible-ansible.legacy.async_status Invoked with jid=700999029294.51524 mode=status _async_dir=/tmp/.ansible_async
Feb 20 07:52:42 np0005625204.localdomain sudo[51688]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:42 np0005625204.localdomain sudo[51693]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:42 np0005625204.localdomain sshd[51660]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 36863 ssh2 [preauth]
Feb 20 07:52:42 np0005625204.localdomain sshd[51660]: Disconnecting authenticating user root 185.246.128.171 port 36863: Too many authentication failures [preauth]
Feb 20 07:52:47 np0005625204.localdomain sshd[51739]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:47 np0005625204.localdomain sudo[51766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeniolumraueuozwmusjcshwemaxgdhv ; /usr/bin/python3
Feb 20 07:52:47 np0005625204.localdomain sudo[51766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:47 np0005625204.localdomain sudo[51743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:52:47 np0005625204.localdomain sudo[51743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:52:47 np0005625204.localdomain sudo[51743]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:51 np0005625204.localdomain python3[51771]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:52:51 np0005625204.localdomain sudo[51766]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:51 np0005625204.localdomain sudo[51786]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avbiefrqitbqbeyehchxmaydjfulbcyj ; /usr/bin/python3
Feb 20 07:52:51 np0005625204.localdomain sudo[51786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:51 np0005625204.localdomain python3[51788]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:52:51 np0005625204.localdomain sudo[51786]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:52 np0005625204.localdomain sudo[51834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwdlpdybgzpageyerbdmwuyegfaanoop ; /usr/bin/python3
Feb 20 07:52:52 np0005625204.localdomain sudo[51834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:52 np0005625204.localdomain python3[51836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:52:52 np0005625204.localdomain sudo[51834]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:52 np0005625204.localdomain sudo[51877]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfprlqgfgpuoosfciwltphzcfuqxfcxb ; /usr/bin/python3
Feb 20 07:52:52 np0005625204.localdomain sudo[51877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:52 np0005625204.localdomain python3[51879]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573972.1343498-84213-47671038942375/source _original_basename=tmp0uv_bnxo follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:52:52 np0005625204.localdomain sudo[51877]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:54 np0005625204.localdomain sudo[51907]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wskrmkcqnoaeowvnrfujeittvzczyibj ; /usr/bin/python3
Feb 20 07:52:54 np0005625204.localdomain sudo[51907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:55 np0005625204.localdomain python3[51909]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:52:55 np0005625204.localdomain sudo[51907]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:55 np0005625204.localdomain sudo[51923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjlfhzrlcpuesxmdtkxxpgffbhyqeoum ; /usr/bin/python3
Feb 20 07:52:55 np0005625204.localdomain sudo[51923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:55 np0005625204.localdomain sudo[51923]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:55 np0005625204.localdomain sudo[52010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvdmrqmghhjvcfniiioorflkzmexfhye ; /usr/bin/python3
Feb 20 07:52:55 np0005625204.localdomain sudo[52010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:56 np0005625204.localdomain python3[52012]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 20 07:52:56 np0005625204.localdomain sudo[52010]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:56 np0005625204.localdomain sudo[52029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kghtudcgyzhsxzgcxxwavscrcnpsrehh ; /usr/bin/python3
Feb 20 07:52:56 np0005625204.localdomain sudo[52029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:56 np0005625204.localdomain python3[52031]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 07:52:56 np0005625204.localdomain sudo[52029]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:56 np0005625204.localdomain sudo[52045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjbonkrowachgsfkujgpcvjdemzlvojg ; /usr/bin/python3
Feb 20 07:52:56 np0005625204.localdomain sudo[52045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:57 np0005625204.localdomain sshd[51739]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 18590 ssh2 [preauth]
Feb 20 07:52:57 np0005625204.localdomain sshd[51739]: Disconnecting authenticating user root 185.246.128.171 port 18590: Too many authentication failures [preauth]
Feb 20 07:52:57 np0005625204.localdomain python3[52047]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005625204 step=1 update_config_hash_only=False
Feb 20 07:52:57 np0005625204.localdomain sudo[52045]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:57 np0005625204.localdomain sudo[52193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sswtungsydpiialcyllqiaqbljkvyyuq ; /usr/bin/python3
Feb 20 07:52:57 np0005625204.localdomain sudo[52193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:57 np0005625204.localdomain python3[52195]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:52:57 np0005625204.localdomain sudo[52193]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:58 np0005625204.localdomain sudo[52209]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxyxjbnohovhapwmqgochyltnmkgykxj ; /usr/bin/python3
Feb 20 07:52:58 np0005625204.localdomain sudo[52209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:58 np0005625204.localdomain python3[52211]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 20 07:52:58 np0005625204.localdomain sshd[52212]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:52:58 np0005625204.localdomain sudo[52209]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:58 np0005625204.localdomain sudo[52226]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvrjgcjuawixxjryakvqmawnrizxoaly ; /usr/bin/python3
Feb 20 07:52:58 np0005625204.localdomain sudo[52226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:58 np0005625204.localdomain python3[52228]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Feb 20 07:52:58 np0005625204.localdomain sudo[52226]: pam_unix(sudo:session): session closed for user root
Feb 20 07:52:59 np0005625204.localdomain sudo[52269]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecnhqqjaucqritpefzdpsbokxkqlovaw ; /usr/bin/python3
Feb 20 07:52:59 np0005625204.localdomain sudo[52269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:52:59 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Feb 20 07:53:00 np0005625204.localdomain podman[52439]: 2026-02-20 07:53:00.111720753 +0000 UTC m=+0.102959229 container create 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, container_name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5)
Feb 20 07:53:00 np0005625204.localdomain podman[52450]: 2026-02-20 07:53:00.130562541 +0000 UTC m=+0.110981029 container create aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, container_name=container-puppet-collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true)
Feb 20 07:53:00 np0005625204.localdomain podman[52439]: 2026-02-20 07:53:00.045427992 +0000 UTC m=+0.036666488 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 07:53:00 np0005625204.localdomain podman[52450]: 2026-02-20 07:53:00.055300223 +0000 UTC m=+0.035718721 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 20 07:53:00 np0005625204.localdomain podman[52451]: 2026-02-20 07:53:00.166229259 +0000 UTC m=+0.144557997 container create 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:34:43Z, tcib_managed=true, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope.
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope.
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f138d9d6c461962e8cf2ee8539c9294af2f13aab0c8b266d53219a78c733e21/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:00 np0005625204.localdomain podman[52443]: 2026-02-20 07:53:00.100357039 +0000 UTC m=+0.083926507 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope.
Feb 20 07:53:00 np0005625204.localdomain podman[52451]: 2026-02-20 07:53:00.10075017 +0000 UTC m=+0.079078898 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 20 07:53:00 np0005625204.localdomain podman[52470]: 2026-02-20 07:53:00.102606803 +0000 UTC m=+0.068439034 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 20 07:53:00 np0005625204.localdomain podman[52439]: 2026-02-20 07:53:00.203601366 +0000 UTC m=+0.194839842 container init 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-nova_libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z)
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:00 np0005625204.localdomain podman[52439]: 2026-02-20 07:53:00.21146629 +0000 UTC m=+0.202704776 container start 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, container_name=container-puppet-nova_libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 20 07:53:00 np0005625204.localdomain podman[52439]: 2026-02-20 07:53:00.211785279 +0000 UTC m=+0.203023775 container attach 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude 
tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=container-puppet-nova_libvirt)
Feb 20 07:53:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:00 np0005625204.localdomain podman[52451]: 2026-02-20 07:53:00.22161548 +0000 UTC m=+0.199944188 container init 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=container-puppet-iscsid, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-iscsid-container)
Feb 20 07:53:00 np0005625204.localdomain podman[52443]: 2026-02-20 07:53:00.231345578 +0000 UTC m=+0.214915026 container create 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public)
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope.
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b4a27664720ce930aee8034c0e3a2e981bce86564061fc7e3c5cc60116ab629/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:00 np0005625204.localdomain podman[52450]: 2026-02-20 07:53:00.346462363 +0000 UTC m=+0.326880861 container init aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, url=https://www.redhat.com, container_name=container-puppet-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Feb 20 07:53:00 np0005625204.localdomain podman[52450]: 2026-02-20 07:53:00.352800374 +0000 UTC m=+0.333218862 container start aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 20 07:53:00 np0005625204.localdomain podman[52450]: 2026-02-20 07:53:00.353002689 +0000 UTC m=+0.333421197 container attach aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, container_name=container-puppet-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 07:53:00 np0005625204.localdomain podman[52443]: 2026-02-20 07:53:00.377589911 +0000 UTC m=+0.361159339 container init 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 07:53:00 np0005625204.localdomain podman[52470]: 2026-02-20 07:53:00.384037936 +0000 UTC m=+0.349870117 container create 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=container-puppet-metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope.
Feb 20 07:53:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17d67d7c6c3046ba2041c4048263641e426665d92e1e8fa18e3c871ca9222f66/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:00 np0005625204.localdomain podman[52451]: 2026-02-20 07:53:00.437173853 +0000 UTC m=+0.415502551 container start 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 07:53:00 np0005625204.localdomain podman[52451]: 2026-02-20 07:53:00.437416359 +0000 UTC m=+0.415745167 container attach 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
release=1766032510, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_puppet_step1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z)
Feb 20 07:53:00 np0005625204.localdomain podman[52443]: 2026-02-20 07:53:00.440385064 +0000 UTC m=+0.423954492 container start 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-crond, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 20 07:53:00 np0005625204.localdomain podman[52443]: 2026-02-20 07:53:00.440627261 +0000 UTC m=+0.424196709 container attach 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=container-puppet-crond, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 07:53:00 np0005625204.localdomain podman[52470]: 2026-02-20 07:53:00.443558505 +0000 UTC m=+0.409390706 container init 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:14Z, 
name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, release=1766032510)
Feb 20 07:53:00 np0005625204.localdomain podman[52470]: 2026-02-20 07:53:00.458697816 +0000 UTC m=+0.424529997 container start 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, container_name=container-puppet-metrics_qdr, build-date=2026-01-12T22:10:14Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 07:53:00 np0005625204.localdomain podman[52470]: 2026-02-20 07:53:00.459160259 +0000 UTC m=+0.424992480 container attach 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:14Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=container-puppet-metrics_qdr, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, architecture=x86_64)
Feb 20 07:53:00 np0005625204.localdomain sshd[52602]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]:    (file & line not available)
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]:    (file & line not available)
Feb 20 07:53:02 np0005625204.localdomain puppet-user[52556]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.11 seconds
Feb 20 07:53:04 np0005625204.localdomain sshd[52602]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:53:04 np0005625204.localdomain puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Feb 20 07:53:04 np0005625204.localdomain puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Feb 20 07:53:04 np0005625204.localdomain puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Feb 20 07:53:05 np0005625204.localdomain ovs-vsctl[52713]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Notice: Applied catalog in 2.69 seconds
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Application:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:    Initial environment: production
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:    Converged environment: production
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:          Run mode: user
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Changes:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:             Total: 4
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Events:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:           Success: 4
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:             Total: 4
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Resources:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:           Changed: 4
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:       Out of sync: 4
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:           Skipped: 8
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:             Total: 13
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Time:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:              File: 0.00
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:    Config retrieval: 0.15
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:            Augeas: 0.79
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:              Exec: 1.88
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:          Last run: 1771573985
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:    Transaction evaluation: 2.68
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:    Catalog application: 2.69
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:             Total: 2.69
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]: Version:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:            Config: 1771573982
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52556]:            Puppet: 7.10.0
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52569]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52569]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52569]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52569]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52569]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52569]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.07 seconds
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain crontab[52933]: (root) LIST (root)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Feb 20 07:53:05 np0005625204.localdomain crontab[52934]: (root) REPLACE (root)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Notice: Applied catalog in 0.11 seconds
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Application:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    Initial environment: production
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    Converged environment: production
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:          Run mode: user
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Changes:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:             Total: 2
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Events:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:           Success: 2
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:             Total: 2
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Resources:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:           Changed: 2
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:       Out of sync: 2
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:           Skipped: 7
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:             Total: 9
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Time:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:              Cron: 0.01
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:              File: 0.08
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    Config retrieval: 0.10
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    Transaction evaluation: 0.10
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:    Catalog application: 0.11
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:          Last run: 1771573985
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:             Total: 0.11
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]: Version:
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:            Config: 1771573985
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52584]:            Puppet: 7.10.0
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]:    (file & line not available)
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: in a future release. Use nova::cinder::os_region_name instead
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52553]: in a future release. Use nova::cinder::catalog_info instead
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: Accepting previously invalid value for target type 'Integer'
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.27 seconds
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}e814c37697a48f1c054f6d0fe463bdd460287439d0bb244a932a4a914847eb0b'
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Feb 20 07:53:05 np0005625204.localdomain puppet-user[52601]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Notice: Applied catalog in 0.03 seconds
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Application:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:    Initial environment: production
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:    Converged environment: production
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:          Run mode: user
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Changes:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:             Total: 7
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Events:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:           Success: 7
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:             Total: 7
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Resources:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:           Skipped: 13
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:           Changed: 5
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:       Out of sync: 5
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:             Total: 20
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Time:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:              File: 0.01
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:    Transaction evaluation: 0.03
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:    Catalog application: 0.03
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:    Config retrieval: 0.30
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:          Last run: 1771573986
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:             Total: 0.03
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]: Version:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:            Config: 1771573985
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52601]:            Puppet: 7.10.0
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.38 seconds
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope: Consumed 4.279s CPU time.
Feb 20 07:53:06 np0005625204.localdomain podman[52451]: 2026-02-20 07:53:06.063387683 +0000 UTC m=+6.041716391 container died 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=container-puppet-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64)
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope: Consumed 2.152s CPU time.
Feb 20 07:53:06 np0005625204.localdomain podman[52443]: 2026-02-20 07:53:06.092158493 +0000 UTC m=+6.075727951 container died 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_puppet_step1, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO 
Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a-merged.mount: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Feb 20 07:53:06 np0005625204.localdomain podman[53019]: 2026-02-20 07:53:06.185087555 +0000 UTC m=+0.082505135 container cleanup 48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, container_name=container-puppet-crond, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-conmon-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain podman[53002]: 2026-02-20 07:53:06.225406157 +0000 UTC m=+0.154316135 container cleanup 3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_puppet_step1, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-conmon-3c5a5c564cb8506536c3d54632b21f3d31946b59b84e13dcdcf04672af817c05.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}8dd3769945b86c38433504b97f7851a931eb3c94b667298d10a9796a3d020595'
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope: Consumed 2.194s CPU time.
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Feb 20 07:53:06 np0005625204.localdomain podman[52470]: 2026-02-20 07:53:06.324794173 +0000 UTC m=+6.290626374 container died 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_puppet_step1, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public)
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Notice: Applied catalog in 0.28 seconds
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Application:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:    Initial environment: production
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:    Converged environment: production
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:          Run mode: user
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Changes:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:             Total: 43
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Events:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:           Success: 43
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:             Total: 43
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Resources:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:           Skipped: 14
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:           Changed: 38
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:       Out of sync: 38
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:             Total: 82
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Time:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:       Concat file: 0.00
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:              File: 0.17
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:    Transaction evaluation: 0.27
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:    Catalog application: 0.28
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:    Config retrieval: 0.49
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:          Last run: 1771573986
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:    Concat fragment: 0.00
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:             Total: 0.28
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]: Version:
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:            Config: 1771573985
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52569]:            Puppet: 7.10.0
Feb 20 07:53:06 np0005625204.localdomain podman[53105]: 2026-02-20 07:53:06.388081169 +0000 UTC m=+0.056157623 container cleanup 705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-conmon-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Feb 20 07:53:06 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope: Deactivated successfully.
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: libpod-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope: Consumed 2.631s CPU time.
Feb 20 07:53:06 np0005625204.localdomain podman[52450]: 2026-02-20 07:53:06.881216824 +0000 UTC m=+6.861635312 container died aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=container-puppet-collectd, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 07:53:06 np0005625204.localdomain podman[53200]: 2026-02-20 07:53:06.95257074 +0000 UTC m=+0.071470040 container create 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, container_name=container-puppet-ovn_controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team)
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: Started libpod-conmon-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope.
Feb 20 07:53:06 np0005625204.localdomain puppet-user[52553]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 1.35 seconds
Feb 20 07:53:06 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:06 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:06 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:06 np0005625204.localdomain podman[53246]: 2026-02-20 07:53:06.996594377 +0000 UTC m=+0.109110495 container create 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, architecture=x86_64, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.13, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=container-puppet-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog)
Feb 20 07:53:07 np0005625204.localdomain podman[53200]: 2026-02-20 07:53:07.003094412 +0000 UTC m=+0.121993712 container init 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, release=1766032510, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, 
config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 07:53:07 np0005625204.localdomain podman[53200]: 2026-02-20 07:53:06.914665389 +0000 UTC m=+0.033564729 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 07:53:07 np0005625204.localdomain podman[53246]: 2026-02-20 07:53:06.914806673 +0000 UTC m=+0.027322761 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 20 07:53:07 np0005625204.localdomain podman[53257]: 2026-02-20 07:53:07.033281634 +0000 UTC m=+0.144196936 container cleanup aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, release=1766032510, config_id=tripleo_puppet_step1, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=container-puppet-collectd, distribution-scope=public)
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: libpod-conmon-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508.scope: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 20 07:53:07 np0005625204.localdomain podman[53200]: 2026-02-20 07:53:07.060420079 +0000 UTC m=+0.179319369 container start 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, architecture=x86_64, release=1766032510, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 
'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Feb 20 07:53:07 np0005625204.localdomain podman[53200]: 2026-02-20 07:53:07.061709685 +0000 UTC m=+0.180609005 container attach 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-ovn_controller, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: Started libpod-conmon-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:07 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4bb1f8a81ebf31c6df88a84cd13b1c78ab0b7c78b4f247f0212f5208091a25c0/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:07 np0005625204.localdomain podman[53246]: 2026-02-20 07:53:07.108697476 +0000 UTC m=+0.221213634 container init 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=container-puppet-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 20 07:53:07 np0005625204.localdomain podman[53246]: 2026-02-20 07:53:07.121513943 +0000 UTC m=+0.234030031 container start 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-rsyslog-container, container_name=container-puppet-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, config_id=tripleo_puppet_step1, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog)
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: tmp-crun.YX1cgo.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-17d67d7c6c3046ba2041c4048263641e426665d92e1e8fa18e3c871ca9222f66-merged.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-705da74d7a439f0470a111c10cd2a0404401272387cae7e6f8ec905eb26941b2-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0b4a27664720ce930aee8034c0e3a2e981bce86564061fc7e3c5cc60116ab629-merged.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48fa1592325cf07d567d9e0d7d75577e4157c6ede38db2b483dc299cf77c8bab-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2f138d9d6c461962e8cf2ee8539c9294af2f13aab0c8b266d53219a78c733e21-merged.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa3a11dae04c369d9330a2c7374d4c84ad3281530184d77c0a749c626b5cd508-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:07 np0005625204.localdomain podman[53246]: 2026-02-20 07:53:07.123780927 +0000 UTC m=+0.236297095 container attach 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:09Z, distribution-scope=public, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, container_name=container-puppet-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1)
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}37542e92f883a9129d79835364a7293bd4c337025ae650a647285cb3357f99b9'
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Warning: Empty environment setting 'TLS_PASSWORD'
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}5bbbbc79dd1f184aec3b40a4e5d830cb87a3dca9076a18726a5379ee062cd087'
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Feb 20 07:53:08 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain podman[52357]: 2026-02-20 07:52:59.975591018 +0000 UTC m=+0.043864793 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain podman[53465]: 2026-02-20 07:53:09.864595361 +0000 UTC m=+0.071134338 container create b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, name=rhosp-rhel9/openstack-ceilometer-central, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:07:24Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain systemd[1]: Started libpod-conmon-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope.
Feb 20 07:53:09 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Feb 20 07:53:09 np0005625204.localdomain podman[53465]: 2026-02-20 07:53:09.917528429 +0000 UTC m=+0.124067426 container init b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, name=rhosp-rhel9/openstack-ceilometer-central, release=1766032510, build-date=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central)
Feb 20 07:53:09 np0005625204.localdomain podman[53465]: 2026-02-20 07:53:09.823806318 +0000 UTC m=+0.030345335 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 20 07:53:09 np0005625204.localdomain podman[53465]: 2026-02-20 07:53:09.924078825 +0000 UTC m=+0.130617812 container start b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-central, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:24Z, tcib_managed=true, config_id=tripleo_puppet_step1, vcs-type=git, url=https://www.redhat.com, container_name=container-puppet-ceilometer, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T23:07:24Z, name=rhosp-rhel9/openstack-ceilometer-central, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13)
Feb 20 07:53:09 np0005625204.localdomain podman[53465]: 2026-02-20 07:53:09.924464416 +0000 UTC m=+0.131003463 container attach b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:24Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, release=1766032510, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:07:24Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 20 07:53:09 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    (file & line not available)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    (file & line not available)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    (file & line not available)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.24 seconds
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    (file & line not available)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}64c5f9c37bfdcd550f09aea32895662c8b3e80da678034168cc6138d9da68080'
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}9a4304e8b8ecdc53ca05b5d959decd5d06e8240e50d42ed6e1a54c334031e88c'
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Notice: Applied catalog in 0.12 seconds
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Application:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    Initial environment: production
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    Converged environment: production
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:          Run mode: user
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Changes:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:             Total: 3
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Events:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:           Success: 3
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:             Total: 3
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Resources:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:           Skipped: 11
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:           Changed: 3
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:       Out of sync: 3
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:             Total: 25
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Time:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:       Concat file: 0.00
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    Concat fragment: 0.00
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:              File: 0.02
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    Transaction evaluation: 0.12
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    Catalog application: 0.12
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:    Config retrieval: 0.29
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:          Last run: 1771573990
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:             Total: 0.12
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]: Version:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:            Config: 1771573990
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53366]:            Puppet: 7.10.0
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.26 seconds
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53629]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53631]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53633]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.108
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53650]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005625204.localdomain
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005625204.novalocal' to 'np0005625204.localdomain'
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53657]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Feb 20 07:53:10 np0005625204.localdomain systemd[1]: libpod-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope: Deactivated successfully.
Feb 20 07:53:10 np0005625204.localdomain systemd[1]: libpod-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope: Consumed 3.474s CPU time.
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain podman[53246]: 2026-02-20 07:53:10.725741391 +0000 UTC m=+3.838257509 container died 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, container_name=container-puppet-rsyslog, release=1766032510, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z)
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53674]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53676]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53678]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53680]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53682]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53684]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:4f:a1:a1
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53686]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53688]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain ovs-vsctl[53690]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Notice: Applied catalog in 0.40 seconds
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Application:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    Initial environment: production
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    Converged environment: production
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:          Run mode: user
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Changes:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:             Total: 14
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Events:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:           Success: 14
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:             Total: 14
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Resources:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:           Skipped: 12
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:           Changed: 14
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:       Out of sync: 14
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:             Total: 29
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Time:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:              Exec: 0.01
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    Config retrieval: 0.29
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:         Vs config: 0.35
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    Transaction evaluation: 0.39
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:    Catalog application: 0.40
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:          Last run: 1771573990
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:             Total: 0.40
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]: Version:
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:            Config: 1771573990
Feb 20 07:53:10 np0005625204.localdomain puppet-user[53324]:            Puppet: 7.10.0
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Feb 20 07:53:10 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4bb1f8a81ebf31c6df88a84cd13b1c78ab0b7c78b4f247f0212f5208091a25c0-merged.mount: Deactivated successfully.
Feb 20 07:53:11 np0005625204.localdomain sshd[52212]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 24189 ssh2 [preauth]
Feb 20 07:53:11 np0005625204.localdomain sshd[52212]: Disconnecting authenticating user root 185.246.128.171 port 24189: Too many authentication failures [preauth]
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: libpod-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope: Deactivated successfully.
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: libpod-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope: Consumed 2.832s CPU time.
Feb 20 07:53:11 np0005625204.localdomain podman[53200]: 2026-02-20 07:53:11.40330573 +0000 UTC m=+4.522205060 container died 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-ovn_controller, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 07:53:11 np0005625204.localdomain podman[53664]: 2026-02-20 07:53:11.415841868 +0000 UTC m=+0.685105095 container cleanup 7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, 
release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-rsyslog, version=17.1.13)
Feb 20 07:53:11 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: libpod-conmon-7cb4184d856db308eb8de75cc7204cd932e37dedcb2ec3f484b8f4e056c1b92e.scope: Deactivated successfully.
Feb 20 07:53:11 np0005625204.localdomain podman[53287]: 2026-02-20 07:53:06.971367487 +0000 UTC m=+0.037037918 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: tmp-crun.2y8Apn.mount: Deactivated successfully.
Feb 20 07:53:11 np0005625204.localdomain podman[53733]: 2026-02-20 07:53:11.591133441 +0000 UTC m=+0.178266989 container cleanup 0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vcs-type=git, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: libpod-conmon-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985.scope: Deactivated successfully.
Feb 20 07:53:11 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 07:53:11 np0005625204.localdomain podman[53788]: 2026-02-20 07:53:11.614759746 +0000 UTC m=+0.071485072 container create 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, architecture=x86_64, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat 
OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-server, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_puppet_step1, tcib_managed=true, container_name=container-puppet-neutron, org.opencontainers.image.created=2026-01-12T22:57:35Z)
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: Started libpod-conmon-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope.
Feb 20 07:53:11 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:11 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:11 np0005625204.localdomain podman[53788]: 2026-02-20 07:53:11.674559822 +0000 UTC m=+0.131285148 container init 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=container-puppet-neutron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:57:35Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T22:57:35Z, com.redhat.component=openstack-neutron-server-container, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 07:53:11 np0005625204.localdomain podman[53788]: 2026-02-20 07:53:11.575502605 +0000 UTC m=+0.032227941 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 20 07:53:11 np0005625204.localdomain podman[53788]: 2026-02-20 07:53:11.682357145 +0000 UTC m=+0.139082441 container start 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, name=rhosp-rhel9/openstack-neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-server, release=1766032510, 
org.opencontainers.image.created=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, build-date=2026-01-12T22:57:35Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, container_name=container-puppet-neutron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=)
Feb 20 07:53:11 np0005625204.localdomain podman[53788]: 2026-02-20 07:53:11.68287947 +0000 UTC m=+0.139604846 container attach 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-server, org.opencontainers.image.created=2026-01-12T22:57:35Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-neutron-server-container, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-server, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:57:35Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server)
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Feb 20 07:53:11 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c-merged.mount: Deactivated successfully.
Feb 20 07:53:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0baaeeabb270b91c75685d617f9f24d0484b2d7ff378497b44169fb4c756f985-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]:    (file & line not available)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]:    (file & line not available)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.38 seconds
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[53550]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Feb 20 07:53:12 np0005625204.localdomain puppet-user[52553]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}66a7ab6cc1a19ea5002a5aaa2cfb2f196778c89c859d0afac926fe3fac9c75a4'
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Notice: Applied catalog in 0.44 seconds
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Application:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:    Initial environment: production
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:    Converged environment: production
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:          Run mode: user
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Changes:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:             Total: 31
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Events:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:           Success: 31
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:             Total: 31
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Resources:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:           Skipped: 22
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:           Changed: 31
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:       Out of sync: 31
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:             Total: 151
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Time:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:           Package: 0.01
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:    Ceilometer config: 0.35
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:    Transaction evaluation: 0.43
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:    Catalog application: 0.44
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:    Config retrieval: 0.46
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:          Last run: 1771573993
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:         Resources: 0.00
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:             Total: 0.44
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]: Version:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Notice: Applied catalog in 5.77 seconds
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:            Config: 1771573992
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53550]:            Puppet: 7.10.0
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Application:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Initial environment: production
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Converged environment: production
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:          Run mode: user
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Changes:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:             Total: 183
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Events:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:           Success: 183
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:             Total: 183
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Resources:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:           Changed: 183
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:       Out of sync: 183
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:           Skipped: 57
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:             Total: 487
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Time:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Concat fragment: 0.00
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:            Anchor: 0.00
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:         File line: 0.00
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Virtlogd config: 0.01
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Virtstoraged config: 0.01
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:              Exec: 0.01
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Virtnodedevd config: 0.02
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Virtqemud config: 0.02
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:              File: 0.02
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Virtsecretd config: 0.02
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:           Package: 0.02
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Virtproxyd config: 0.03
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:            Augeas: 0.92
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Config retrieval: 1.60
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:          Last run: 1771573993
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:       Nova config: 3.12
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:         Resources: 0.00
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Transaction evaluation: 5.76
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:    Catalog application: 5.77
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:       Concat file: 0.00
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:             Total: 5.77
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]: Version:
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:            Config: 1771573985
Feb 20 07:53:13 np0005625204.localdomain puppet-user[52553]:            Puppet: 7.10.0
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: libpod-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope: Deactivated successfully.
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: libpod-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope: Consumed 3.174s CPU time.
Feb 20 07:53:13 np0005625204.localdomain podman[53465]: 2026-02-20 07:53:13.406970957 +0000 UTC m=+3.613509964 container died b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:07:24Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-ceilometer, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central)
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340-merged.mount: Deactivated successfully.
Feb 20 07:53:13 np0005625204.localdomain podman[53954]: 2026-02-20 07:53:13.498526261 +0000 UTC m=+0.082971400 container cleanup b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, vcs-type=git, config_id=tripleo_puppet_step1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, build-date=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, architecture=x86_64, com.redhat.component=openstack-ceilometer-central-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=container-puppet-ceilometer)
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: libpod-conmon-b3cbf75578bccd93e3ae87c1ce44331145004582e2bd6e2cdf7ccb6b2eeb1c5d.scope: Deactivated successfully.
Feb 20 07:53:13 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]:    (file & line not available)
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]:    (file & line not available)
Feb 20 07:53:13 np0005625204.localdomain puppet-user[53843]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: libpod-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope: Deactivated successfully.
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: libpod-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope: Consumed 9.844s CPU time.
Feb 20 07:53:13 np0005625204.localdomain podman[52439]: 2026-02-20 07:53:13.817164595 +0000 UTC m=+13.808403101 container died 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_puppet_step1, io.openshift.expose-services=, container_name=container-puppet-nova_libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 07:53:13 np0005625204.localdomain podman[54092]: 2026-02-20 07:53:13.924884239 +0000 UTC m=+0.102663970 container cleanup 7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude 
tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=container-puppet-nova_libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt)
Feb 20 07:53:13 np0005625204.localdomain systemd[1]: libpod-conmon-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed.scope: Deactivated successfully.
Feb 20 07:53:13 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 07:53:14 np0005625204.localdomain sshd[54133]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.63 seconds
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain systemd[1]: tmp-crun.jiSs8e.mount: Deactivated successfully.
Feb 20 07:53:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3-merged.mount: Deactivated successfully.
Feb 20 07:53:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7a94b37439b24e5934a5cf554cd538c80b75ca2bff6141920c9351812a4480ed-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Notice: Applied catalog in 0.43 seconds
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Application:
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Initial environment: production
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Converged environment: production
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:          Run mode: user
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Changes:
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:             Total: 33
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Events:
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:           Success: 33
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:             Total: 33
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Resources:
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:           Skipped: 21
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:           Changed: 33
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:       Out of sync: 33
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:             Total: 155
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Time:
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:         Resources: 0.00
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Ovn metadata agent config: 0.02
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Neutron config: 0.35
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Transaction evaluation: 0.43
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Catalog application: 0.43
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:    Config retrieval: 0.70
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:          Last run: 1771573994
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:             Total: 0.43
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]: Version:
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:            Config: 1771573993
Feb 20 07:53:14 np0005625204.localdomain puppet-user[53843]:            Puppet: 7.10.0
Feb 20 07:53:15 np0005625204.localdomain systemd[1]: libpod-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope: Deactivated successfully.
Feb 20 07:53:15 np0005625204.localdomain systemd[1]: libpod-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope: Consumed 3.589s CPU time.
Feb 20 07:53:15 np0005625204.localdomain podman[53788]: 2026-02-20 07:53:15.314631284 +0000 UTC m=+3.771356650 container died 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, com.redhat.component=openstack-neutron-server-container, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T22:57:35Z, summary=Red Hat OpenStack Platform 17.1 neutron-server, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:57:35Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-neutron-server)
Feb 20 07:53:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d-merged.mount: Deactivated successfully.
Feb 20 07:53:15 np0005625204.localdomain podman[54169]: 2026-02-20 07:53:15.45146744 +0000 UTC m=+0.125726479 container cleanup 051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-server-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, name=rhosp-rhel9/openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-neutron, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-server, distribution-scope=public, build-date=2026-01-12T22:57:35Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:57:35Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1)
Feb 20 07:53:15 np0005625204.localdomain systemd[1]: libpod-conmon-051cf29ab1454bbfae7b673128e11f41bb6910a83e4eb595c27abf614005e23b.scope: Deactivated successfully.
Feb 20 07:53:15 np0005625204.localdomain python3[52271]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005625204 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005625204', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 20 07:53:15 np0005625204.localdomain sudo[52269]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:16 np0005625204.localdomain sudo[54222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxweiompxdzcudytqfxbhoezhfsbkafv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:16 np0005625204.localdomain sudo[54222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:16 np0005625204.localdomain python3[54224]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:16 np0005625204.localdomain sudo[54222]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:16 np0005625204.localdomain sudo[54238]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qprhrvqydkycdazdtwewhdukfjvqytfh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:16 np0005625204.localdomain sudo[54238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:16 np0005625204.localdomain sudo[54238]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:16 np0005625204.localdomain sudo[54254]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzlyowpyoxdfiylrqgbvqulehmqrecll ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:17 np0005625204.localdomain sudo[54254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:17 np0005625204.localdomain python3[54256]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:53:18 np0005625204.localdomain sudo[54254]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:18 np0005625204.localdomain sudo[54304]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-silyjuxskjvfheuwainjigjfxpcoemzj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:18 np0005625204.localdomain sudo[54304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:18 np0005625204.localdomain python3[54306]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:53:18 np0005625204.localdomain sudo[54304]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:18 np0005625204.localdomain sudo[54347]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mimemrjmllhfrrsjxrqojppvkyuaznue ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:18 np0005625204.localdomain sudo[54347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:19 np0005625204.localdomain python3[54349]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573998.3857064-85051-93623064883299/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:19 np0005625204.localdomain sudo[54347]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:19 np0005625204.localdomain sudo[54409]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayvxkhtrganjsgwvvmajwqkjlfqhkfqx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:19 np0005625204.localdomain sudo[54409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:19 np0005625204.localdomain python3[54411]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:53:19 np0005625204.localdomain sudo[54409]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:19 np0005625204.localdomain sudo[54452]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-semtlkzfngkmwetggkkbycrqbkimonul ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:19 np0005625204.localdomain sudo[54452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:19 np0005625204.localdomain python3[54454]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771573999.2260113-85051-25605579334971/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:19 np0005625204.localdomain sudo[54452]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:20 np0005625204.localdomain sudo[54514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgrsivvyuafrhgyrjqzumxyzndlahpnk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:20 np0005625204.localdomain sudo[54514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:20 np0005625204.localdomain python3[54516]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:53:20 np0005625204.localdomain sudo[54514]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:20 np0005625204.localdomain sudo[54557]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azshwhjuematurpsnkwzfdewgbuslicr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:20 np0005625204.localdomain sudo[54557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:20 np0005625204.localdomain python3[54559]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574000.1863053-85071-7139879288483/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:20 np0005625204.localdomain sudo[54557]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:21 np0005625204.localdomain sshd[54133]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 43889 ssh2 [preauth]
Feb 20 07:53:21 np0005625204.localdomain sshd[54133]: Disconnecting authenticating user root 185.246.128.171 port 43889: Too many authentication failures [preauth]
Feb 20 07:53:21 np0005625204.localdomain sudo[54619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jajidfqpikwepxfshagzqgghjwjaphno ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:21 np0005625204.localdomain sudo[54619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:21 np0005625204.localdomain python3[54621]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:53:21 np0005625204.localdomain sudo[54619]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:21 np0005625204.localdomain sudo[54662]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waiernzwtkxikuunzhdfzuyqtuzlmmoo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:21 np0005625204.localdomain sudo[54662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:21 np0005625204.localdomain python3[54664]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574001.1139882-85083-15401580707635/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:21 np0005625204.localdomain sudo[54662]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:21 np0005625204.localdomain sudo[54692]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpkzxeebilkubrtbsnvzjqxfglpwhwwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:21 np0005625204.localdomain sudo[54692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:22 np0005625204.localdomain python3[54694]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:53:22 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:53:22 np0005625204.localdomain systemd-sysv-generator[54721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:53:22 np0005625204.localdomain systemd-rc-local-generator[54718]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:53:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:53:22 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:53:22 np0005625204.localdomain systemd-sysv-generator[54761]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:53:22 np0005625204.localdomain systemd-rc-local-generator[54755]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:53:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:53:22 np0005625204.localdomain systemd[1]: Starting TripleO Container Shutdown...
Feb 20 07:53:22 np0005625204.localdomain systemd[1]: Finished TripleO Container Shutdown.
Feb 20 07:53:22 np0005625204.localdomain sudo[54692]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:23 np0005625204.localdomain sudo[54816]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kotzigjmieouqobxmomwgplebmtrzqlt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:23 np0005625204.localdomain sudo[54816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:23 np0005625204.localdomain python3[54818]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:53:23 np0005625204.localdomain sudo[54816]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:23 np0005625204.localdomain sshd[54861]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:23 np0005625204.localdomain sudo[54859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-arjnxyfzlbouimlaqljdsojyjfsfdkhg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:23 np0005625204.localdomain sudo[54859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:23 np0005625204.localdomain python3[54862]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574002.9107478-85163-182647580955573/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:23 np0005625204.localdomain sudo[54859]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:23 np0005625204.localdomain sudo[54922]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erewnjqgzghuocghowlpogpivdjljxye ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:24 np0005625204.localdomain sudo[54922]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:24 np0005625204.localdomain python3[54924]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:53:24 np0005625204.localdomain sudo[54922]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:24 np0005625204.localdomain sudo[54965]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olapofmhelptgbvcscmfkfrdnwdyapxv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:24 np0005625204.localdomain sudo[54965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:24 np0005625204.localdomain python3[54967]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574003.841421-85174-231287861662889/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:24 np0005625204.localdomain sudo[54965]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:24 np0005625204.localdomain sudo[54995]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gctwlwyleojbepsvntpostmarjbmtyqs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:24 np0005625204.localdomain sudo[54995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:25 np0005625204.localdomain python3[54997]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:53:25 np0005625204.localdomain systemd-rc-local-generator[55023]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:53:25 np0005625204.localdomain systemd-sysv-generator[55028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:53:25 np0005625204.localdomain systemd-rc-local-generator[55064]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:53:25 np0005625204.localdomain systemd-sysv-generator[55067]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 07:53:25 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 07:53:25 np0005625204.localdomain sudo[54995]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:25 np0005625204.localdomain sudo[55089]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gansjtfzvmidrnvskopmmfcawpgttgom ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:25 np0005625204.localdomain sudo[55089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: c3cf83e3d6b9a6a9323d670f77d9e810
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: d31718fcd17fdeee6489534105191c7a
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 24eefedeb2e4ab8bab62979b617bbba7
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: eb8c5e608f55bc52c95871f92a543185
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: ed809cd151e1fa8da7409fe229c809b7
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: ed809cd151e1fa8da7409fe229c809b7
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 684ebb6e94768a0a31a4d8592f0686b3
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain python3[55091]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 6f2a8ada21c5a8beb0844e05e372be87
Feb 20 07:53:26 np0005625204.localdomain sudo[55089]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:26 np0005625204.localdomain sudo[55105]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzcvbfbzhihpdffidsreluxgxjtwpkqd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:26 np0005625204.localdomain sudo[55105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:26 np0005625204.localdomain sudo[55105]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:27 np0005625204.localdomain sudo[55145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fidtthithqptfahewicdthvtrnxzdfyt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:27 np0005625204.localdomain sudo[55145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:27 np0005625204.localdomain python3[55147]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 20 07:53:27 np0005625204.localdomain podman[55184]: 2026-02-20 07:53:27.968342119 +0000 UTC m=+0.087385277 container create 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, container_name=metrics_qdr_init_logs, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: Started libpod-conmon-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c.scope.
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:28 np0005625204.localdomain podman[55184]: 2026-02-20 07:53:27.924738659 +0000 UTC m=+0.043781797 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 20 07:53:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66b4607051ec4b678b98370429ea66c5b0f53009a9a85441acbc9ac68d517903/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:28 np0005625204.localdomain podman[55184]: 2026-02-20 07:53:28.036840105 +0000 UTC m=+0.155883233 container init 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, url=https://www.redhat.com, tcib_managed=true, container_name=metrics_qdr_init_logs, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 07:53:28 np0005625204.localdomain podman[55184]: 2026-02-20 07:53:28.051126234 +0000 UTC m=+0.170169382 container start 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr_init_logs, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 07:53:28 np0005625204.localdomain podman[55184]: 2026-02-20 07:53:28.051862167 +0000 UTC m=+0.170905375 container attach 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, container_name=metrics_qdr_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, distribution-scope=public)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: libpod-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c.scope: Deactivated successfully.
Feb 20 07:53:28 np0005625204.localdomain podman[55184]: 2026-02-20 07:53:28.05618728 +0000 UTC m=+0.175230438 container died 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, container_name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 07:53:28 np0005625204.localdomain podman[55204]: 2026-02-20 07:53:28.14697525 +0000 UTC m=+0.076043458 container cleanup 25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: libpod-conmon-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c.scope: Deactivated successfully.
Feb 20 07:53:28 np0005625204.localdomain python3[55147]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Feb 20 07:53:28 np0005625204.localdomain podman[55280]: 2026-02-20 07:53:28.517600143 +0000 UTC m=+0.071719815 container create f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: Started libpod-conmon-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope.
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:53:28 np0005625204.localdomain podman[55280]: 2026-02-20 07:53:28.478197282 +0000 UTC m=+0.032317004 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 20 07:53:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:53:28 np0005625204.localdomain podman[55280]: 2026-02-20 07:53:28.617995819 +0000 UTC m=+0.172115501 container init f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Feb 20 07:53:28 np0005625204.localdomain sudo[55301]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 07:53:28 np0005625204.localdomain sudo[55301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:53:28 np0005625204.localdomain podman[55280]: 2026-02-20 07:53:28.653578243 +0000 UTC m=+0.207697875 container start f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 07:53:28 np0005625204.localdomain python3[55147]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=c3cf83e3d6b9a6a9323d670f77d9e810 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 20 07:53:28 np0005625204.localdomain sudo[55301]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:28 np0005625204.localdomain podman[55303]: 2026-02-20 07:53:28.746885691 +0000 UTC m=+0.087172130 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z)
Feb 20 07:53:28 np0005625204.localdomain sudo[55145]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:28 np0005625204.localdomain podman[55303]: 2026-02-20 07:53:28.966349227 +0000 UTC m=+0.306635706 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container)
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: tmp-crun.3SZiRu.mount: Deactivated successfully.
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-66b4607051ec4b678b98370429ea66c5b0f53009a9a85441acbc9ac68d517903-merged.mount: Deactivated successfully.
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25b71c3774408fdc61f6219370fa0fd01137d9c7c682a08be8734b4601ce4d8c-userdata-shm.mount: Deactivated successfully.
Feb 20 07:53:28 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:53:29 np0005625204.localdomain sudo[55372]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dexjmlhjzexwmyzcwuddnkadurwexogx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:29 np0005625204.localdomain sudo[55372]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:29 np0005625204.localdomain python3[55374]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:29 np0005625204.localdomain sudo[55372]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:29 np0005625204.localdomain sudo[55388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdlecwvwmurycbatcpbqeezqzsirizmz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:29 np0005625204.localdomain sudo[55388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:29 np0005625204.localdomain python3[55390]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:53:29 np0005625204.localdomain sudo[55388]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:29 np0005625204.localdomain sudo[55449]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jggdmcwudvimwjozkyfidgimhvgganum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:29 np0005625204.localdomain sudo[55449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:30 np0005625204.localdomain python3[55451]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574009.5332015-85365-187906120463055/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:30 np0005625204.localdomain sudo[55449]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:30 np0005625204.localdomain sudo[55465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlqofvthtuwkykmfyjgelkgivivuiisx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:30 np0005625204.localdomain sudo[55465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:30 np0005625204.localdomain python3[55467]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 07:53:30 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:53:30 np0005625204.localdomain systemd-sysv-generator[55495]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:53:30 np0005625204.localdomain systemd-rc-local-generator[55492]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:53:30 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:53:30 np0005625204.localdomain sudo[55465]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:31 np0005625204.localdomain sudo[55517]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahayxonvhtakzrjklxweojvnsnuqqqbv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:53:31 np0005625204.localdomain sudo[55517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:31 np0005625204.localdomain python3[55519]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:53:31 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:53:31 np0005625204.localdomain systemd-rc-local-generator[55551]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:53:31 np0005625204.localdomain systemd-sysv-generator[55554]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:53:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:53:31 np0005625204.localdomain systemd[1]: Starting metrics_qdr container...
Feb 20 07:53:31 np0005625204.localdomain systemd[1]: Started metrics_qdr container.
Feb 20 07:53:31 np0005625204.localdomain sudo[55517]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:31 np0005625204.localdomain sudo[55597]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgdoapfarhfsqulxfaxpsiyvgrrzutkb ; /usr/bin/python3
Feb 20 07:53:31 np0005625204.localdomain sudo[55597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:32 np0005625204.localdomain python3[55599]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:32 np0005625204.localdomain sudo[55597]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:32 np0005625204.localdomain sudo[55645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsibaleelvfdtobweehrspqzeoheplol ; /usr/bin/python3
Feb 20 07:53:32 np0005625204.localdomain sudo[55645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:32 np0005625204.localdomain sudo[55645]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:32 np0005625204.localdomain sudo[55688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxqbudimmjjvtseocokcwzkwixuudgkc ; /usr/bin/python3
Feb 20 07:53:32 np0005625204.localdomain sudo[55688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:32 np0005625204.localdomain sudo[55688]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:33 np0005625204.localdomain sudo[55718]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgbgnfhrgfsgwzvxfrwhmdgvncwfcxak ; /usr/bin/python3
Feb 20 07:53:33 np0005625204.localdomain sudo[55718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:33 np0005625204.localdomain python3[55720]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005625204 step=1 update_config_hash_only=False
Feb 20 07:53:33 np0005625204.localdomain sudo[55718]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:33 np0005625204.localdomain sshd[54861]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 28914 ssh2 [preauth]
Feb 20 07:53:33 np0005625204.localdomain sshd[54861]: Disconnecting authenticating user root 185.246.128.171 port 28914: Too many authentication failures [preauth]
Feb 20 07:53:33 np0005625204.localdomain sudo[55734]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmyargwhwbjvpqatbclrijdfwuaggmft ; /usr/bin/python3
Feb 20 07:53:33 np0005625204.localdomain sudo[55734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:33 np0005625204.localdomain python3[55736]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:53:33 np0005625204.localdomain sudo[55734]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:34 np0005625204.localdomain sudo[55750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptgitaeursvaamlbhczkrcypjupeqgnc ; /usr/bin/python3
Feb 20 07:53:34 np0005625204.localdomain sudo[55750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:53:34 np0005625204.localdomain python3[55752]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 20 07:53:34 np0005625204.localdomain sudo[55750]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:35 np0005625204.localdomain sshd[55753]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:43 np0005625204.localdomain sshd[55753]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 27142 ssh2 [preauth]
Feb 20 07:53:43 np0005625204.localdomain sshd[55753]: Disconnecting authenticating user root 185.246.128.171 port 27142: Too many authentication failures [preauth]
Feb 20 07:53:43 np0005625204.localdomain sshd[55755]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:47 np0005625204.localdomain sshd[55757]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:47 np0005625204.localdomain sshd[55757]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:53:51 np0005625204.localdomain sudo[55759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:53:51 np0005625204.localdomain sudo[55759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:53:51 np0005625204.localdomain sudo[55759]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:51 np0005625204.localdomain sudo[55774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:53:51 np0005625204.localdomain sudo[55774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:53:51 np0005625204.localdomain sudo[55774]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:52 np0005625204.localdomain sudo[55822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:53:52 np0005625204.localdomain sudo[55822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:53:52 np0005625204.localdomain sudo[55822]: pam_unix(sudo:session): session closed for user root
Feb 20 07:53:54 np0005625204.localdomain sshd[55755]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 6713 ssh2 [preauth]
Feb 20 07:53:54 np0005625204.localdomain sshd[55755]: Disconnecting authenticating user root 185.246.128.171 port 6713: Too many authentication failures [preauth]
Feb 20 07:53:58 np0005625204.localdomain sshd[55837]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:53:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:53:59 np0005625204.localdomain podman[55838]: 2026-02-20 07:53:59.099194544 +0000 UTC m=+0.044767538 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 07:53:59 np0005625204.localdomain podman[55838]: 2026-02-20 07:53:59.272985796 +0000 UTC m=+0.218558780 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 07:53:59 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:54:07 np0005625204.localdomain sshd[55837]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 20040 ssh2 [preauth]
Feb 20 07:54:07 np0005625204.localdomain sshd[55837]: Disconnecting authenticating user root 185.246.128.171 port 20040: Too many authentication failures [preauth]
Feb 20 07:54:09 np0005625204.localdomain sshd[55867]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:14 np0005625204.localdomain sshd[55867]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 16220 ssh2 [preauth]
Feb 20 07:54:14 np0005625204.localdomain sshd[55867]: Disconnecting authenticating user root 185.246.128.171 port 16220: Too many authentication failures [preauth]
Feb 20 07:54:16 np0005625204.localdomain sshd[55869]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:26 np0005625204.localdomain sshd[55869]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 57557 ssh2 [preauth]
Feb 20 07:54:26 np0005625204.localdomain sshd[55869]: Disconnecting authenticating user root 185.246.128.171 port 57557: Too many authentication failures [preauth]
Feb 20 07:54:28 np0005625204.localdomain sshd[55871]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:54:30 np0005625204.localdomain podman[55873]: 2026-02-20 07:54:30.200995592 +0000 UTC m=+0.119853072 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, distribution-scope=public, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64)
Feb 20 07:54:30 np0005625204.localdomain podman[55873]: 2026-02-20 07:54:30.420405359 +0000 UTC m=+0.339262859 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=metrics_qdr, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 07:54:30 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:54:31 np0005625204.localdomain sshd[55871]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 59280 ssh2 [preauth]
Feb 20 07:54:31 np0005625204.localdomain sshd[55871]: Disconnecting authenticating user root 185.246.128.171 port 59280: Too many authentication failures [preauth]
Feb 20 07:54:33 np0005625204.localdomain sshd[55902]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:35 np0005625204.localdomain sshd[55904]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:35 np0005625204.localdomain sshd[55904]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:54:40 np0005625204.localdomain sshd[55902]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 22715 ssh2 [preauth]
Feb 20 07:54:40 np0005625204.localdomain sshd[55902]: Disconnecting authenticating user root 185.246.128.171 port 22715: Too many authentication failures [preauth]
Feb 20 07:54:42 np0005625204.localdomain sshd[55906]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:49 np0005625204.localdomain sshd[55906]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 5093 ssh2 [preauth]
Feb 20 07:54:49 np0005625204.localdomain sshd[55906]: Disconnecting authenticating user root 185.246.128.171 port 5093: Too many authentication failures [preauth]
Feb 20 07:54:51 np0005625204.localdomain sshd[55908]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:54:52 np0005625204.localdomain sudo[55910]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:54:52 np0005625204.localdomain sudo[55910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:54:52 np0005625204.localdomain sudo[55910]: pam_unix(sudo:session): session closed for user root
Feb 20 07:54:52 np0005625204.localdomain sudo[55925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:54:52 np0005625204.localdomain sudo[55925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:54:53 np0005625204.localdomain sudo[55925]: pam_unix(sudo:session): session closed for user root
Feb 20 07:54:53 np0005625204.localdomain sudo[55973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:54:53 np0005625204.localdomain sudo[55973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:54:53 np0005625204.localdomain sudo[55973]: pam_unix(sudo:session): session closed for user root
Feb 20 07:55:00 np0005625204.localdomain sshd[55908]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 56752 ssh2 [preauth]
Feb 20 07:55:00 np0005625204.localdomain sshd[55908]: Disconnecting authenticating user root 185.246.128.171 port 56752: Too many authentication failures [preauth]
Feb 20 07:55:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:55:00 np0005625204.localdomain systemd[1]: tmp-crun.T5TZ5L.mount: Deactivated successfully.
Feb 20 07:55:00 np0005625204.localdomain podman[55988]: 2026-02-20 07:55:00.875829532 +0000 UTC m=+0.079355376 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64)
Feb 20 07:55:01 np0005625204.localdomain podman[55988]: 2026-02-20 07:55:01.102280089 +0000 UTC m=+0.305805893 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, version=17.1.13, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 07:55:01 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:55:01 np0005625204.localdomain anacron[19093]: Job `cron.monthly' started
Feb 20 07:55:01 np0005625204.localdomain anacron[19093]: Job `cron.monthly' terminated
Feb 20 07:55:01 np0005625204.localdomain anacron[19093]: Normal exit (3 jobs run)
Feb 20 07:55:01 np0005625204.localdomain sshd[56019]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:03 np0005625204.localdomain sshd[56021]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:04 np0005625204.localdomain sshd[56021]: Invalid user deamon_root from 103.157.25.4 port 60776
Feb 20 07:55:05 np0005625204.localdomain sshd[56021]: Received disconnect from 103.157.25.4 port 60776:11: Bye Bye [preauth]
Feb 20 07:55:05 np0005625204.localdomain sshd[56021]: Disconnected from invalid user deamon_root 103.157.25.4 port 60776 [preauth]
Feb 20 07:55:06 np0005625204.localdomain sshd[56019]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 56502 ssh2 [preauth]
Feb 20 07:55:06 np0005625204.localdomain sshd[56019]: Disconnecting authenticating user root 185.246.128.171 port 56502: Too many authentication failures [preauth]
Feb 20 07:55:08 np0005625204.localdomain sshd[56023]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:12 np0005625204.localdomain sshd[56023]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 29722 ssh2 [preauth]
Feb 20 07:55:12 np0005625204.localdomain sshd[56023]: Disconnecting authenticating user root 185.246.128.171 port 29722: Too many authentication failures [preauth]
Feb 20 07:55:13 np0005625204.localdomain sshd[56025]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:21 np0005625204.localdomain sshd[56025]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 58311 ssh2 [preauth]
Feb 20 07:55:21 np0005625204.localdomain sshd[56025]: Disconnecting authenticating user root 185.246.128.171 port 58311: Too many authentication failures [preauth]
Feb 20 07:55:23 np0005625204.localdomain sshd[56027]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:23 np0005625204.localdomain sshd[56027]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:55:24 np0005625204.localdomain sshd[56029]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:30 np0005625204.localdomain sshd[56029]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 61380 ssh2 [preauth]
Feb 20 07:55:30 np0005625204.localdomain sshd[56029]: Disconnecting authenticating user root 185.246.128.171 port 61380: Too many authentication failures [preauth]
Feb 20 07:55:30 np0005625204.localdomain sshd[56031]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:55:32 np0005625204.localdomain podman[56033]: 2026-02-20 07:55:32.135846948 +0000 UTC m=+0.077499400 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, container_name=metrics_qdr, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, vcs-type=git, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 07:55:32 np0005625204.localdomain podman[56033]: 2026-02-20 07:55:32.339238444 +0000 UTC m=+0.280890866 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, release=1766032510, url=https://www.redhat.com, container_name=metrics_qdr, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5)
Feb 20 07:55:32 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:55:34 np0005625204.localdomain sshd[56031]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 30759 ssh2 [preauth]
Feb 20 07:55:34 np0005625204.localdomain sshd[56031]: Disconnecting authenticating user root 185.246.128.171 port 30759: Too many authentication failures [preauth]
Feb 20 07:55:37 np0005625204.localdomain sshd[56062]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:44 np0005625204.localdomain sshd[56062]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 6888 ssh2 [preauth]
Feb 20 07:55:44 np0005625204.localdomain sshd[56062]: Disconnecting authenticating user root 185.246.128.171 port 6888: Too many authentication failures [preauth]
Feb 20 07:55:46 np0005625204.localdomain sshd[56064]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:51 np0005625204.localdomain sshd[56064]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 59972 ssh2 [preauth]
Feb 20 07:55:51 np0005625204.localdomain sshd[56064]: Disconnecting authenticating user root 185.246.128.171 port 59972: Too many authentication failures [preauth]
Feb 20 07:55:52 np0005625204.localdomain sshd[56066]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:55:53 np0005625204.localdomain sudo[56068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:55:53 np0005625204.localdomain sudo[56068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:55:53 np0005625204.localdomain sudo[56068]: pam_unix(sudo:session): session closed for user root
Feb 20 07:55:54 np0005625204.localdomain sudo[56083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:55:54 np0005625204.localdomain sudo[56083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:55:54 np0005625204.localdomain sudo[56083]: pam_unix(sudo:session): session closed for user root
Feb 20 07:55:55 np0005625204.localdomain sudo[56131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:55:55 np0005625204.localdomain sudo[56131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:55:55 np0005625204.localdomain sudo[56131]: pam_unix(sudo:session): session closed for user root
Feb 20 07:55:58 np0005625204.localdomain sshd[56066]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 34611 ssh2 [preauth]
Feb 20 07:55:58 np0005625204.localdomain sshd[56066]: Disconnecting authenticating user root 185.246.128.171 port 34611: Too many authentication failures [preauth]
Feb 20 07:56:00 np0005625204.localdomain sshd[56146]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:56:03 np0005625204.localdomain systemd[1]: tmp-crun.8tedRg.mount: Deactivated successfully.
Feb 20 07:56:03 np0005625204.localdomain podman[56148]: 2026-02-20 07:56:03.150197423 +0000 UTC m=+0.085544573 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z)
Feb 20 07:56:03 np0005625204.localdomain podman[56148]: 2026-02-20 07:56:03.31988743 +0000 UTC m=+0.255234540 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.5)
Feb 20 07:56:03 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:56:09 np0005625204.localdomain sshd[56146]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 21153 ssh2 [preauth]
Feb 20 07:56:09 np0005625204.localdomain sshd[56146]: Disconnecting authenticating user root 185.246.128.171 port 21153: Too many authentication failures [preauth]
Feb 20 07:56:11 np0005625204.localdomain sshd[56177]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:12 np0005625204.localdomain sshd[56178]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:12 np0005625204.localdomain sshd[56178]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:56:17 np0005625204.localdomain sshd[56177]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 22673 ssh2 [preauth]
Feb 20 07:56:17 np0005625204.localdomain sshd[56177]: Disconnecting authenticating user root 185.246.128.171 port 22673: Too many authentication failures [preauth]
Feb 20 07:56:18 np0005625204.localdomain sshd[56181]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:26 np0005625204.localdomain sshd[56181]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 4204 ssh2 [preauth]
Feb 20 07:56:26 np0005625204.localdomain sshd[56181]: Disconnecting authenticating user root 185.246.128.171 port 4204: Too many authentication failures [preauth]
Feb 20 07:56:28 np0005625204.localdomain sshd[56183]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:33 np0005625204.localdomain sshd[56183]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 61684 ssh2 [preauth]
Feb 20 07:56:33 np0005625204.localdomain sshd[56183]: Disconnecting authenticating user root 185.246.128.171 port 61684: Too many authentication failures [preauth]
Feb 20 07:56:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:56:33 np0005625204.localdomain podman[56185]: 2026-02-20 07:56:33.982239827 +0000 UTC m=+0.073065904 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc.)
Feb 20 07:56:34 np0005625204.localdomain podman[56185]: 2026-02-20 07:56:34.20415753 +0000 UTC m=+0.294983557 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, 
vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 20 07:56:34 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:56:35 np0005625204.localdomain sshd[56214]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:44 np0005625204.localdomain sshd[56214]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 43514 ssh2 [preauth]
Feb 20 07:56:44 np0005625204.localdomain sshd[56214]: Disconnecting authenticating user root 185.246.128.171 port 43514: Too many authentication failures [preauth]
Feb 20 07:56:45 np0005625204.localdomain sshd[56216]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:51 np0005625204.localdomain sshd[56216]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 37264 ssh2 [preauth]
Feb 20 07:56:51 np0005625204.localdomain sshd[56216]: Disconnecting authenticating user root 185.246.128.171 port 37264: Too many authentication failures [preauth]
Feb 20 07:56:54 np0005625204.localdomain sshd[56218]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:55 np0005625204.localdomain sudo[56219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:56:55 np0005625204.localdomain sudo[56219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:56:55 np0005625204.localdomain sudo[56219]: pam_unix(sudo:session): session closed for user root
Feb 20 07:56:55 np0005625204.localdomain sudo[56234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:56:55 np0005625204.localdomain sudo[56234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:56:55 np0005625204.localdomain sudo[56234]: pam_unix(sudo:session): session closed for user root
Feb 20 07:56:56 np0005625204.localdomain sudo[56282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:56:56 np0005625204.localdomain sudo[56282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:56:56 np0005625204.localdomain sudo[56282]: pam_unix(sudo:session): session closed for user root
Feb 20 07:56:56 np0005625204.localdomain sshd[56297]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:58 np0005625204.localdomain sshd[56297]: Invalid user ftp-test from 178.217.173.50 port 60236
Feb 20 07:56:58 np0005625204.localdomain sshd[56297]: Received disconnect from 178.217.173.50 port 60236:11: Bye Bye [preauth]
Feb 20 07:56:58 np0005625204.localdomain sshd[56297]: Disconnected from invalid user ftp-test 178.217.173.50 port 60236 [preauth]
Feb 20 07:56:58 np0005625204.localdomain sshd[56218]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 27547 ssh2 [preauth]
Feb 20 07:56:58 np0005625204.localdomain sshd[56218]: Disconnecting authenticating user root 185.246.128.171 port 27547: Too many authentication failures [preauth]
Feb 20 07:56:59 np0005625204.localdomain sshd[56299]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:56:59 np0005625204.localdomain sshd[56299]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:56:59 np0005625204.localdomain sshd[56301]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:57:05 np0005625204.localdomain systemd[1]: tmp-crun.CpA3zb.mount: Deactivated successfully.
Feb 20 07:57:05 np0005625204.localdomain podman[56303]: 2026-02-20 07:57:05.201162173 +0000 UTC m=+0.136396925 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr, distribution-scope=public, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z)
Feb 20 07:57:05 np0005625204.localdomain podman[56303]: 2026-02-20 07:57:05.422225734 +0000 UTC m=+0.357460526 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1)
Feb 20 07:57:05 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:57:08 np0005625204.localdomain sshd[56301]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 55856 ssh2 [preauth]
Feb 20 07:57:08 np0005625204.localdomain sshd[56301]: Disconnecting authenticating user root 185.246.128.171 port 55856: Too many authentication failures [preauth]
Feb 20 07:57:10 np0005625204.localdomain sshd[56332]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:19 np0005625204.localdomain sshd[56332]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 56097 ssh2 [preauth]
Feb 20 07:57:19 np0005625204.localdomain sshd[56332]: Disconnecting authenticating user root 185.246.128.171 port 56097: Too many authentication failures [preauth]
Feb 20 07:57:20 np0005625204.localdomain sshd[56334]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:31 np0005625204.localdomain sshd[56334]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 53603 ssh2 [preauth]
Feb 20 07:57:31 np0005625204.localdomain sshd[56334]: Disconnecting authenticating user root 185.246.128.171 port 53603: Too many authentication failures [preauth]
Feb 20 07:57:33 np0005625204.localdomain sshd[56336]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:57:36 np0005625204.localdomain systemd[1]: tmp-crun.qMVpx0.mount: Deactivated successfully.
Feb 20 07:57:36 np0005625204.localdomain podman[56338]: 2026-02-20 07:57:36.413461245 +0000 UTC m=+0.352575139 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, 
io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 07:57:36 np0005625204.localdomain podman[56338]: 2026-02-20 07:57:36.623765243 +0000 UTC m=+0.562879167 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.buildah.version=1.41.5)
Feb 20 07:57:36 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:57:37 np0005625204.localdomain sshd[56367]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:37 np0005625204.localdomain sshd[56336]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 64010 ssh2 [preauth]
Feb 20 07:57:37 np0005625204.localdomain sshd[56336]: Disconnecting authenticating user root 185.246.128.171 port 64010: Too many authentication failures [preauth]
Feb 20 07:57:38 np0005625204.localdomain sshd[56367]: Invalid user oracle from 101.36.109.176 port 43782
Feb 20 07:57:38 np0005625204.localdomain sshd[56367]: Received disconnect from 101.36.109.176 port 43782:11: Bye Bye [preauth]
Feb 20 07:57:38 np0005625204.localdomain sshd[56367]: Disconnected from invalid user oracle 101.36.109.176 port 43782 [preauth]
Feb 20 07:57:39 np0005625204.localdomain sshd[56369]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:43 np0005625204.localdomain sshd[56371]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:43 np0005625204.localdomain sshd[56371]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:57:47 np0005625204.localdomain sshd[56369]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 35117 ssh2 [preauth]
Feb 20 07:57:47 np0005625204.localdomain sshd[56369]: Disconnecting authenticating user root 185.246.128.171 port 35117: Too many authentication failures [preauth]
Feb 20 07:57:48 np0005625204.localdomain sshd[56373]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:57:56 np0005625204.localdomain sshd[56373]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 20290 ssh2 [preauth]
Feb 20 07:57:56 np0005625204.localdomain sshd[56373]: Disconnecting authenticating user root 185.246.128.171 port 20290: Too many authentication failures [preauth]
Feb 20 07:57:56 np0005625204.localdomain sudo[56375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:57:56 np0005625204.localdomain sudo[56375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:57:56 np0005625204.localdomain sudo[56375]: pam_unix(sudo:session): session closed for user root
Feb 20 07:57:56 np0005625204.localdomain sudo[56390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:57:56 np0005625204.localdomain sudo[56390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:57:57 np0005625204.localdomain sudo[56390]: pam_unix(sudo:session): session closed for user root
Feb 20 07:57:58 np0005625204.localdomain sudo[56437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:57:58 np0005625204.localdomain sudo[56437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:57:58 np0005625204.localdomain sudo[56437]: pam_unix(sudo:session): session closed for user root
Feb 20 07:57:58 np0005625204.localdomain sshd[56452]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:04 np0005625204.localdomain sshd[56452]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 17810 ssh2 [preauth]
Feb 20 07:58:04 np0005625204.localdomain sshd[56452]: Disconnecting authenticating user root 185.246.128.171 port 17810: Too many authentication failures [preauth]
Feb 20 07:58:04 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 18 pg[2.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [4,5,3] r=2 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:05 np0005625204.localdomain sshd[56454]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:06 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 20 pg[3.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [5,4,0] r=2 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:06 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 21 pg[4.0( empty local-lis/les=0/0 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [3,4,5] r=0 lpr=21 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:58:07 np0005625204.localdomain podman[56456]: 2026-02-20 07:58:07.145337158 +0000 UTC m=+0.089321586 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, config_id=tripleo_step1, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 07:58:07 np0005625204.localdomain podman[56456]: 2026-02-20 07:58:07.36051411 +0000 UTC m=+0.304498528 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510)
Feb 20 07:58:07 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:58:07 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 22 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=0/0 les/c/f=0/0/0 sis=21) [3,4,5] r=0 lpr=21 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:09 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 23 pg[5.0( empty local-lis/les=0/0 n=0 ec=23/23 lis/c=0/0 les/c/f=0/0/0 sis=23) [2,3,4] r=1 lpr=23 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:16 np0005625204.localdomain sshd[56454]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 59278 ssh2 [preauth]
Feb 20 07:58:16 np0005625204.localdomain sshd[56454]: Disconnecting authenticating user root 185.246.128.171 port 59278: Too many authentication failures [preauth]
Feb 20 07:58:18 np0005625204.localdomain sshd[56486]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:19 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 31 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=31 pruub=8.638875961s) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 active pruub 1116.948120117s@ mbc={}] start_peering_interval up [4,5,3] -> [4,5,3], acting [4,5,3] -> [4,5,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:19 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 31 pg[2.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=31 pruub=8.636317253s) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1116.948120117s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:19 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=31 pruub=10.822261810s) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 active pruub 1123.471557617s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,0], acting [5,4,0] -> [5,4,0], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:19 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 31 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=31 pruub=10.820794106s) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.471557617s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.19( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.17( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1f( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1e( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1d( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1c( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1a( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1b( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.9( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.8( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.5( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.2( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.1( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.6( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.4( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.7( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.3( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.c( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.b( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.e( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.d( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.f( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.10( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.a( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.11( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.12( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.16( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.13( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.18( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.15( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 32 pg[2.14( empty local-lis/les=18/19 n=0 ec=31/18 lis/c=18/18 les/c/f=19/19/0 sis=31) [4,5,3] r=2 lpr=31 pi=[18,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.17( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.19( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1a( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1b( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.18( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.16( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.14( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.15( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.13( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.11( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.10( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.12( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.f( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.e( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.d( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1c( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.c( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.2( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.3( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.4( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.5( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.6( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.8( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.9( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.7( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.a( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.b( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1d( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1e( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:20 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 32 pg[3.1f( empty local-lis/les=20/21 n=0 ec=31/20 lis/c=20/20 les/c/f=21/21/0 sis=31) [5,4,0] r=2 lpr=31 pi=[20,31)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:21 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 33 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33 pruub=9.736856461s) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active pruub 1120.090332031s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,5], acting [3,4,5] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:21 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 33 pg[5.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=33 pruub=12.287407875s) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 active pruub 1122.640991211s@ mbc={}] start_peering_interval up [2,3,4] -> [2,3,4], acting [2,3,4] -> [2,3,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:21 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 33 pg[4.0( empty local-lis/les=21/22 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33 pruub=9.736856461s) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown pruub 1120.090332031s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:21 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 33 pg[5.0( empty local-lis/les=23/24 n=0 ec=23/23 lis/c=23/23 les/c/f=24/24/0 sis=33 pruub=12.283823013s) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1122.640991211s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1e( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1f( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1e( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.10( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.11( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.10( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.13( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.12( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.12( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.14( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.13( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.15( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.11( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.15( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.14( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.16( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.17( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.17( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.16( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.9( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.9( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.8( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.8( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.b( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.a( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.b( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.a( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.c( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.d( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.d( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.c( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.4( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.5( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.19( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.7( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.18( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1b( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1a( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.6( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.6( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.7( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.5( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.3( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.2( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.2( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.4( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.3( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.f( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.e( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.e( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1d( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1c( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1d( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1c( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1b( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.f( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.1a( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.19( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[5.18( empty local-lis/les=23/24 n=0 ec=33/23 lis/c=23/23 les/c/f=24/24/0 sis=33) [2,3,4] r=1 lpr=33 pi=[23,33)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1f( empty local-lis/les=21/22 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.0( empty local-lis/les=33/34 n=0 ec=21/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 34 pg[4.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=21/21 les/c/f=22/22/0 sis=33) [3,4,5] r=0 lpr=33 pi=[21,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:25 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 35 pg[6.0( empty local-lis/les=0/0 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [0,4,2] r=0 lpr=35 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:25 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.0 scrub starts
Feb 20 07:58:26 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.0 scrub ok
Feb 20 07:58:26 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 36 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=0/0 les/c/f=0/0/0 sis=35) [0,4,2] r=0 lpr=35 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:27 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 36 pg[7.0( empty local-lis/les=0/0 n=0 ec=36/36 lis/c=0/0 les/c/f=0/0/0 sis=36) [1,5,3] r=2 lpr=36 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:27 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Feb 20 07:58:28 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Feb 20 07:58:28 np0005625204.localdomain sshd[56486]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 9132 ssh2 [preauth]
Feb 20 07:58:28 np0005625204.localdomain sshd[56486]: Disconnecting authenticating user root 185.246.128.171 port 9132: Too many authentication failures [preauth]
Feb 20 07:58:29 np0005625204.localdomain sshd[56489]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:29 np0005625204.localdomain sshd[56491]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:29 np0005625204.localdomain sshd[56489]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:58:30 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.18 deep-scrub starts
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.18 deep-scrub ok
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.4( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,1] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.183403969s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.5( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.183325768s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658691406s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.3( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182825089s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.6( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182825089s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.658691406s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.1f( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182967186s) [0,1,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659423828s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,5], acting [5,4,0] -> [0,1,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182967186s) [0,1,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.659423828s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.19( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.1e( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1e( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182214737s) [3,2,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.660034180s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,4], acting [5,4,0] -> [3,2,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630276680s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636314392s) [2,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807495117s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,1], acting [3,4,5] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.18( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630190849s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.19( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636209488s) [2,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807495117s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.182172775s) [3,2,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.660034180s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.180464745s) [4,0,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658325195s@ mbc={}] start_peering_interval up [5,4,0] -> [4,0,5], acting [5,4,0] -> [4,0,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.3( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.180406570s) [4,0,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658325195s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636122704s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807739258s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.632182121s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803833008s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178568840s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350341797s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.636039734s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807739258s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.181464195s) [3,4,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659423828s@ mbc={}] start_peering_interval up [5,4,0] -> [3,4,5], acting [5,4,0] -> [3,4,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.181440353s) [3,4,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.659423828s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.18( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631990433s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.803833008s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.5( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631323814s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803466797s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,3], acting [2,3,4] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631259918s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.803466797s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178009033s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350341797s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.7( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177729607s) [2,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350219727s@ mbc={}] start_peering_interval up [4,5,3] -> [2,1,0], acting [4,5,3] -> [2,1,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177687645s) [2,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350219727s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628846169s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,5], acting [3,4,5] -> [4,3,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.634988785s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807739258s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628745079s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.634940147s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807739258s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628884315s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801879883s@ mbc={}] start_peering_interval up [2,3,4] -> [1,0,2], acting [2,3,4] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631159782s) [4,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.804199219s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,0], acting [2,3,4] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.631127357s) [4,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.804199219s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628637314s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801879883s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630361557s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803710938s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,5], acting [2,3,4] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630361557s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.803710938s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178629875s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351440430s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177393913s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350952148s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178629875s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.351440430s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627976418s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801635742s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.177356720s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350952148s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627976418s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.801635742s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627916336s) [2,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801635742s@ mbc={}] start_peering_interval up [3,4,5] -> [2,3,4], acting [3,4,5] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.9( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176346779s) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350219727s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,5], acting [4,5,3] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.629531860s) [2,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.803466797s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,4], acting [2,3,4] -> [2,0,4], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175090790s) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654663086s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,2], acting [5,4,0] -> [0,4,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175090790s) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.654663086s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179287910s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658935547s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179253578s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658935547s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174358368s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654174805s@ mbc={}] start_peering_interval up [5,4,0] -> [4,3,5], acting [5,4,0] -> [4,3,5], acting_primary 5 -> 4, up_primary 5 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174321175s) [4,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.654174805s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178557396s) [2,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [2,0,4], acting [5,4,0] -> [2,0,4], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179244995s) [5,1,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659423828s@ mbc={}] start_peering_interval up [5,4,0] -> [5,1,3], acting [5,4,0] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.8( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178516388s) [2,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658691406s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.633274078s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807373047s@ mbc={}] start_peering_interval up [3,4,5] -> [4,5,0], acting [3,4,5] -> [4,5,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179925919s) [3,5,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.660156250s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,4], acting [5,4,0] -> [3,5,4], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.7( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179894447s) [3,5,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.660156250s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.9( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176346779s) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.350219727s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.9( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.179183006s) [5,1,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.659423828s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.629234314s) [2,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.803466797s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627415657s) [2,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801635742s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627312660s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801757812s@ mbc={}] start_peering_interval up [2,3,4] -> [4,2,3], acting [2,3,4] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.8( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175848007s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350341797s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.b( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627437592s) [4,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.802001953s@ mbc={}] start_peering_interval up [2,3,4] -> [4,0,2], acting [2,3,4] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178317070s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.659301758s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.8( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.175795555s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350341797s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.178133965s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.659301758s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.633084297s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807373047s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.2( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627393723s) [4,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.802001953s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627050400s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801757812s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.4( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.2( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176150322s) [1,0,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351684570s@ mbc={}] start_peering_interval up [4,5,3] -> [1,0,2], acting [4,5,3] -> [1,0,2], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.2( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176084518s) [1,0,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351684570s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176688194s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658691406s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.4( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176637650s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658691406s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.4( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174972534s) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350708008s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,1], acting [4,5,3] -> [3,2,1], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176395416s) [3,5,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.658569336s@ mbc={}] start_peering_interval up [5,4,0] -> [3,5,1], acting [5,4,0] -> [3,5,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.2( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176329613s) [3,5,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.658569336s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171910286s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654174805s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1c( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171862602s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.654174805s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171505928s) [1,2,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653930664s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,3], acting [5,4,0] -> [1,2,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.d( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171465874s) [1,2,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653930664s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.a( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.4( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174972534s) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.350708008s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176141739s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352172852s@ mbc={}] start_peering_interval up [4,5,3] -> [3,5,4], acting [4,5,3] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624894142s) [3,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801025391s@ mbc={}] start_peering_interval up [2,3,4] -> [3,1,2], acting [2,3,4] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.176141739s) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.352172852s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.6( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624894142s) [3,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.801025391s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.6( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174986839s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351562500s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.7( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174946785s) [4,2,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351562500s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,3], acting [4,5,3] -> [4,2,3], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.6( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174986839s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.351562500s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627578735s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.804443359s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170887947s) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654052734s@ mbc={}] start_peering_interval up [5,4,0] -> [2,4,0], acting [5,4,0] -> [2,4,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.e( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170839310s) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.654052734s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170619965s) [1,5,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653930664s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,0], acting [5,4,0] -> [1,5,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.7( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174694061s) [4,2,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351562500s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170265198s) [1,5,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653808594s@ mbc={}] start_peering_interval up [5,4,0] -> [1,5,3], acting [5,4,0] -> [1,5,3], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.7( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.10( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170221329s) [1,5,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653808594s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.f( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170529366s) [1,5,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653930664s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170422554s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.654174805s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.12( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170422554s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.654174805s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168743134s) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652832031s@ mbc={}] start_peering_interval up [5,4,0] -> [1,2,0], acting [5,4,0] -> [1,2,0], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.14( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168713570s) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652832031s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169594765s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653686523s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,2], acting [5,4,0] -> [1,3,2], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.13( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169507027s) [1,3,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653686523s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168907166s) [2,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653198242s@ mbc={}] start_peering_interval up [5,4,0] -> [2,1,0], acting [5,4,0] -> [2,1,0], acting_primary 5 -> 2, up_primary 5 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.15( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168875694s) [2,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653198242s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623968124s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630991936s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808593750s@ mbc={}] start_peering_interval up [3,4,5] -> [1,5,0], acting [3,4,5] -> [1,5,0], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.3( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174083710s) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351806641s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,5], acting [4,5,3] -> [4,3,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.5( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.630887032s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808593750s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.3( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.174035072s) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351806641s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.7( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623549461s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168410301s) [1,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652954102s@ mbc={}] start_peering_interval up [5,4,0] -> [1,3,5], acting [5,4,0] -> [1,3,5], acting_primary 5 -> 1, up_primary 5 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168010712s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [3,2,1], acting [5,4,0] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167948723s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [0,4,5], acting [5,4,0] -> [0,4,5], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.16( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168293953s) [1,3,5] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652954102s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.18( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167943001s) [3,2,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652709961s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.17( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167948723s) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.652709961s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.16( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167690277s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.653198242s@ mbc={}] start_peering_interval up [5,4,0] -> [5,4,3], acting [5,4,0] -> [5,4,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167144775s) [0,1,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [0,1,2], acting [5,4,0] -> [0,1,2], acting_primary 5 -> 0, up_primary 5 -> 0, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.2( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628471375s) [4,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807617188s@ mbc={}] start_peering_interval up [3,4,5] -> [4,3,2], acting [3,4,5] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.c( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628397942s) [4,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807617188s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.19( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167144775s) [0,1,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1137.652709961s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1b( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.167296410s) [5,4,3] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.653198242s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621114731s) [2,4,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800659180s@ mbc={}] start_peering_interval up [2,3,4] -> [2,4,0], acting [2,3,4] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627498627s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.804443359s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.b( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.d( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621026039s) [2,4,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800659180s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165511131s) [5,3,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1137.652709961s@ mbc={}] start_peering_interval up [5,4,0] -> [5,3,4], acting [5,4,0] -> [5,3,4], acting_primary 5 -> 5, up_primary 5 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[3.1a( empty local-lis/les=31/32 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165446281s) [5,3,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1137.652709961s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628199577s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808227539s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620414734s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800415039s@ mbc={}] start_peering_interval up [2,3,4] -> [3,4,2], acting [2,3,4] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.c( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620414734s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.800415039s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172441483s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352539062s@ mbc={}] start_peering_interval up [4,5,3] -> [3,2,4], acting [4,5,3] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621196747s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801635742s@ mbc={}] start_peering_interval up [2,3,4] -> [1,5,0], acting [2,3,4] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.12( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.9( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621147156s) [1,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801635742s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.d( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.628086090s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808227539s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.1e( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619400024s) [2,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800292969s@ mbc={}] start_peering_interval up [2,3,4] -> [2,0,1], acting [2,3,4] -> [2,0,1], acting_primary 2 -> 2, up_primary 2 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618499756s) [3,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799438477s@ mbc={}] start_peering_interval up [2,3,4] -> [3,5,4], acting [2,3,4] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.17( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618499756s) [3,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.799438477s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.8( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619309425s) [2,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800292969s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.e( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172441483s) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.352539062s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618299484s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799682617s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.11( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171576500s) [4,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352905273s@ mbc={}] start_peering_interval up [4,5,3] -> [4,3,2], acting [4,5,3] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627405167s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.19( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172412872s) [3,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353881836s@ mbc={}] start_peering_interval up [4,5,3] -> [3,4,2], acting [4,5,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.11( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171514511s) [4,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352905273s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1f( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.627290726s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808715820s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.19( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.172412872s) [3,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1133.353881836s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1e( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.618128777s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.799682617s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.617027283s) [4,5,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798706055s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,3], acting [2,3,4] -> [4,5,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.1f( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616981506s) [4,5,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798706055s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626946449s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808837891s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1e( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626896858s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808837891s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.18( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171956062s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353759766s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616610527s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798583984s@ mbc={}] start_peering_interval up [2,3,4] -> [4,5,0], acting [2,3,4] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.18( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.171730042s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353759766s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.10( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616552353s) [4,5,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798583984s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.17( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170897484s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353393555s@ mbc={}] start_peering_interval up [4,5,3] -> [1,5,3], acting [4,5,3] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626388550s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808837891s@ mbc={}] start_peering_interval up [3,4,5] -> [3,4,2], acting [3,4,5] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.11( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.626388550s) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.808837891s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.17( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170782089s) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353393555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616309166s) [1,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798950195s@ mbc={}] start_peering_interval up [2,3,4] -> [1,2,0], acting [2,3,4] -> [1,2,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.11( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.616275787s) [1,2,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798950195s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.16( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170540810s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353393555s@ mbc={}] start_peering_interval up [4,5,3] -> [1,2,0], acting [4,5,3] -> [1,2,0], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625931740s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808837891s@ mbc={}] start_peering_interval up [3,4,5] -> [3,2,4], acting [3,4,5] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.16( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170490265s) [1,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353393555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615800858s) [5,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.798828125s@ mbc={}] start_peering_interval up [2,3,4] -> [5,1,3], acting [2,3,4] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.10( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625931740s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.808837891s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.12( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615706444s) [5,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.798828125s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.15( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170089722s) [5,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353271484s@ mbc={}] start_peering_interval up [4,5,3] -> [5,0,4], acting [4,5,3] -> [5,0,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625356674s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [3,1,5], acting [3,4,5] -> [3,1,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.15( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.170002937s) [5,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353271484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.17( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.625356674s) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.808715820s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624859810s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [4,2,3], acting [3,4,5] -> [4,2,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.13( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615407944s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799316406s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,1], acting [2,3,4] -> [5,0,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624911308s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808715820s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.13( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624801636s) [4,2,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808715820s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.12( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624871254s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808715820s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615819931s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799804688s@ mbc={}] start_peering_interval up [2,3,4] -> [3,2,4], acting [2,3,4] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.14( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615819931s) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown pruub 1128.799804688s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.14( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169187546s) [4,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353271484s@ mbc={}] start_peering_interval up [4,5,3] -> [4,2,0], acting [4,5,3] -> [4,2,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.13( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169013023s) [2,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353149414s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,3], acting [4,5,3] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.14( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.169098854s) [4,2,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353271484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.13( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168964386s) [2,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353149414s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624094963s) [5,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808471680s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,1], acting [3,4,5] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615510941s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.799926758s@ mbc={}] start_peering_interval up [2,3,4] -> [4,3,5], acting [2,3,4] -> [4,3,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.15( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.624039650s) [5,3,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808471680s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.15( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615470886s) [4,3,5] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.799926758s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.12( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168498993s) [5,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353149414s@ mbc={}] start_peering_interval up [4,5,3] -> [5,3,1], acting [4,5,3] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.12( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168471336s) [5,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353149414s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.13( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614545822s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.799316406s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615341187s) [1,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800048828s@ mbc={}] start_peering_interval up [2,3,4] -> [1,3,2], acting [2,3,4] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623558998s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808349609s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.16( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.615289688s) [1,3,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800048828s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.10( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168370247s) [2,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353271484s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,4], acting [4,5,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.14( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623507500s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808349609s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.10( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.168342590s) [2,0,4] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353271484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623075485s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808227539s@ mbc={}] start_peering_interval up [3,4,5] -> [5,0,1], acting [3,4,5] -> [5,0,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.9( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623042107s) [5,0,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808227539s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623076439s) [5,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808349609s@ mbc={}] start_peering_interval up [3,4,5] -> [5,4,3], acting [3,4,5] -> [5,4,3], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623312950s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808593750s@ mbc={}] start_peering_interval up [3,4,5] -> [0,4,5], acting [3,4,5] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.8( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.623011589s) [5,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808349609s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614774704s) [0,2,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800170898s@ mbc={}] start_peering_interval up [2,3,4] -> [0,2,4], acting [2,3,4] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166985512s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352416992s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,3], acting [4,5,3] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.a( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614736557s) [0,2,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800170898s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.622882843s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808471680s@ mbc={}] start_peering_interval up [3,4,5] -> [0,1,5], acting [3,4,5] -> [0,1,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.b( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.622827530s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808471680s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.16( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.622802734s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808593750s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614774704s) [5,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800415039s@ mbc={}] start_peering_interval up [2,3,4] -> [5,0,4], acting [2,3,4] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.d( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166861534s) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352416992s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166196823s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352294922s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.b( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614451408s) [5,0,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800415039s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.c( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.166126251s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352294922s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621610641s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807983398s@ mbc={}] start_peering_interval up [3,4,5] -> [1,0,2], acting [3,4,5] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165349007s) [5,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.351684570s@ mbc={}] start_peering_interval up [4,5,3] -> [5,1,0], acting [4,5,3] -> [5,1,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165320396s) [5,1,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.351684570s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165679932s) [2,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.352172852s@ mbc={}] start_peering_interval up [4,5,3] -> [2,3,1], acting [4,5,3] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.a( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.165613174s) [2,3,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.352172852s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614287376s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.800903320s@ mbc={}] start_peering_interval up [2,3,4] -> [5,3,4], acting [2,3,4] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621454239s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808227539s@ mbc={}] start_peering_interval up [3,4,5] -> [5,3,4], acting [3,4,5] -> [5,3,4], acting_primary 3 -> 5, up_primary 3 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.4( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614025116s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.800903320s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.6( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621428490s) [5,3,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808227539s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621025085s) [2,1,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808105469s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,0], acting [3,4,5] -> [2,1,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.a( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.621115685s) [1,0,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807983398s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620933533s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.808105469s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,4], acting [3,4,5] -> [0,5,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.1( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620966911s) [2,1,0] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808105469s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614357948s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801513672s@ mbc={}] start_peering_interval up [2,3,4] -> [0,4,5], acting [2,3,4] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.7( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.620886803s) [0,5,4] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.808105469s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.5( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614321709s) [0,4,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801513672s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619892120s) [0,5,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807495117s@ mbc={}] start_peering_interval up [3,4,5] -> [0,5,1], acting [3,4,5] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.4( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619792938s) [0,5,1] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807495117s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.613944054s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.801757812s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,2], acting [2,3,4] -> [0,1,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619462013s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.807373047s@ mbc={}] start_peering_interval up [3,4,5] -> [2,1,3], acting [3,4,5] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.3( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.613870621s) [0,1,2] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.801757812s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.5( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162936211s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350952148s@ mbc={}] start_peering_interval up [4,5,3] -> [2,0,1], acting [4,5,3] -> [2,0,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.2( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.619343758s) [2,1,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.807373047s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.5( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162478447s) [2,0,1] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350952148s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162004471s) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.350708008s@ mbc={}] start_peering_interval up [4,5,3] -> [5,4,3], acting [4,5,3] -> [5,4,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1b( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.161961555s) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.350708008s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.152458191s) [0,4,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.341430664s@ mbc={}] start_peering_interval up [4,5,3] -> [0,4,2], acting [4,5,3] -> [0,4,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.1f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.152385712s) [0,4,2] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.341430664s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.614631653s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.804199219s@ mbc={}] start_peering_interval up [2,3,4] -> [0,1,5], acting [2,3,4] -> [0,1,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.617092133s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active pruub 1128.806762695s@ mbc={}] start_peering_interval up [3,4,5] -> [2,4,3], acting [3,4,5] -> [2,4,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[4.3( empty local-lis/les=33/34 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.617017746s) [2,4,3] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.806762695s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162965775s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active pruub 1133.353027344s@ mbc={}] start_peering_interval up [4,5,3] -> [2,4,0], acting [4,5,3] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[5.19( empty local-lis/les=33/34 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39 pruub=8.613686562s) [0,1,5] r=-1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1128.804199219s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.18( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:58:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[2.f( empty local-lis/les=31/32 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39 pruub=13.162833214s) [2,4,0] r=-1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1133.353027344s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.10( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.c( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.a( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.5( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.16( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,3,5] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.13( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.16( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.11( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,2,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.a( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,0,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.5( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.1b( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,0,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.9( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [1,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.8( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.2( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,0,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.10( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,5,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.14( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [4,2,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.2( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,0,2] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1c( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,3,2] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.e( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,5,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.1c( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [4,2,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.d( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [1,2,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1d( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1a( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,3,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.1b( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,4,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 39 pg[3.9( empty local-lis/les=0/0 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,1,3] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.b( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.1d( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.f( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,4,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.8( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.b( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.9( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.14( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.13( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [5,0,1] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.15( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [5,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.c( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,0,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.1c( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,1,0] r=2 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.5( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,0,1] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[4.1( empty local-lis/les=0/0 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,1,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.d( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,4,0] r=2 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[5.e( empty local-lis/les=0/0 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [2,0,4] r=1 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 39 pg[2.10( empty local-lis/les=0/0 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [2,0,4] r=1 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.1e( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.19( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,1,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.6( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.19( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.1f( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,1,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.4( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,1] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.5( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.7( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.b( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.16( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.12( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.12( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.17( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[4.1e( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,4,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.1d( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.1( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.2( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.7( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,5,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.9( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.b( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,5] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.17( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,5,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.17( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,1,5] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.1e( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.a( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[3.1( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[2.1f( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [0,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 40 pg[5.3( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [0,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.18( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.19( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,4,2] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.e( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.c( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.6( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,1,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.1e( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[3.4( empty local-lis/les=39/40 n=0 ec=31/20 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.6( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,4] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.11( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,4,2] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.f( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[4.10( empty local-lis/les=39/40 n=0 ec=33/21 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[5.14( empty local-lis/les=39/40 n=0 ec=33/23 lis/c=33/33 les/c/f=34/34/0 sis=39) [3,2,4] r=0 lpr=39 pi=[33,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:32 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 40 pg[2.4( empty local-lis/les=39/40 n=0 ec=31/18 lis/c=31/31 les/c/f=32/32/0 sis=39) [3,2,1] r=0 lpr=39 pi=[31,39)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:58:33 np0005625204.localdomain sudo[56494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:58:33 np0005625204.localdomain sudo[56494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:58:33 np0005625204.localdomain sudo[56494]: pam_unix(sudo:session): session closed for user root
Feb 20 07:58:34 np0005625204.localdomain sshd[56491]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 8432 ssh2 [preauth]
Feb 20 07:58:34 np0005625204.localdomain sshd[56491]: Disconnecting authenticating user root 185.246.128.171 port 8432: Too many authentication failures [preauth]
Feb 20 07:58:35 np0005625204.localdomain sudo[56509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:58:35 np0005625204.localdomain sudo[56509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:58:35 np0005625204.localdomain sudo[56509]: pam_unix(sudo:session): session closed for user root
Feb 20 07:58:35 np0005625204.localdomain sshd[56524]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:35 np0005625204.localdomain sudo[56525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:58:35 np0005625204.localdomain sudo[56525]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:58:35 np0005625204.localdomain sudo[56525]: pam_unix(sudo:session): session closed for user root
Feb 20 07:58:36 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Feb 20 07:58:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:58:38 np0005625204.localdomain podman[56541]: 2026-02-20 07:58:38.129465745 +0000 UTC m=+0.070461675 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, config_id=tripleo_step1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public)
Feb 20 07:58:38 np0005625204.localdomain podman[56541]: 2026-02-20 07:58:38.321933095 +0000 UTC m=+0.262928945 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 07:58:38 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:58:41 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.17 scrub starts
Feb 20 07:58:41 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.17 scrub ok
Feb 20 07:58:44 np0005625204.localdomain sshd[56524]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 42827 ssh2 [preauth]
Feb 20 07:58:44 np0005625204.localdomain sshd[56524]: Disconnecting authenticating user root 185.246.128.171 port 42827: Too many authentication failures [preauth]
Feb 20 07:58:46 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.19 scrub starts
Feb 20 07:58:46 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.19 scrub ok
Feb 20 07:58:47 np0005625204.localdomain sshd[56573]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:47 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1 scrub starts
Feb 20 07:58:47 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1 scrub ok
Feb 20 07:58:50 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.c deep-scrub starts
Feb 20 07:58:50 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.c deep-scrub ok
Feb 20 07:58:51 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.6 scrub starts
Feb 20 07:58:51 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.6 scrub ok
Feb 20 07:58:52 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.1e scrub starts
Feb 20 07:58:53 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.1e scrub ok
Feb 20 07:58:53 np0005625204.localdomain sshd[56573]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 51130 ssh2 [preauth]
Feb 20 07:58:53 np0005625204.localdomain sshd[56573]: Disconnecting authenticating user root 185.246.128.171 port 51130: Too many authentication failures [preauth]
Feb 20 07:58:55 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.a scrub starts
Feb 20 07:58:55 np0005625204.localdomain sshd[56575]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:58:55 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.a scrub ok
Feb 20 07:58:57 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.6 scrub starts
Feb 20 07:58:57 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.6 scrub ok
Feb 20 07:58:58 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1f scrub starts
Feb 20 07:58:59 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.1f scrub ok
Feb 20 07:59:00 np0005625204.localdomain sudo[56590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrnbpxnqdhnpuszbpepivtqkcmncbujo ; /usr/bin/python3
Feb 20 07:59:00 np0005625204.localdomain sudo[56590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:00 np0005625204.localdomain python3[56592]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:00 np0005625204.localdomain sudo[56590]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:00 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.3 scrub starts
Feb 20 07:59:01 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.3 scrub ok
Feb 20 07:59:02 np0005625204.localdomain sshd[56575]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 32283 ssh2 [preauth]
Feb 20 07:59:02 np0005625204.localdomain sshd[56575]: Disconnecting authenticating user root 185.246.128.171 port 32283: Too many authentication failures [preauth]
Feb 20 07:59:02 np0005625204.localdomain sudo[56606]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avhvnleytviklvyroonntfocpiiljecy ; /usr/bin/python3
Feb 20 07:59:02 np0005625204.localdomain sudo[56606]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:02 np0005625204.localdomain python3[56608]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:02 np0005625204.localdomain sudo[56606]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:04 np0005625204.localdomain sudo[56622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-spoqjcuvgmlkjiqltunhywtxkseuwkjg ; /usr/bin/python3
Feb 20 07:59:04 np0005625204.localdomain sudo[56622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:04 np0005625204.localdomain python3[56624]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:04 np0005625204.localdomain sudo[56622]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:04 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.12 scrub starts
Feb 20 07:59:04 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.12 scrub ok
Feb 20 07:59:05 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.14 scrub starts
Feb 20 07:59:05 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.14 scrub ok
Feb 20 07:59:05 np0005625204.localdomain sshd[56625]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:06 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.1e scrub starts
Feb 20 07:59:07 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.1e scrub ok
Feb 20 07:59:07 np0005625204.localdomain sudo[56672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyhoyrqxiewrowljyeqzcjaqqrgnrmdd ; /usr/bin/python3
Feb 20 07:59:07 np0005625204.localdomain sudo[56672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:07 np0005625204.localdomain python3[56674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:07 np0005625204.localdomain sudo[56672]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:08 np0005625204.localdomain sudo[56715]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abpeizqtisuxstubwtuacqqqzyszfbdj ; /usr/bin/python3
Feb 20 07:59:08 np0005625204.localdomain sudo[56715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:08 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.17 deep-scrub starts
Feb 20 07:59:08 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.17 deep-scrub ok
Feb 20 07:59:08 np0005625204.localdomain python3[56717]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574347.6070158-92373-97076641946657/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=8e2004121a34320613d32710ae37702da8d027e6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:08 np0005625204.localdomain sudo[56715]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:59:09 np0005625204.localdomain systemd[1]: tmp-crun.OMzGSO.mount: Deactivated successfully.
Feb 20 07:59:09 np0005625204.localdomain podman[56732]: 2026-02-20 07:59:09.14872189 +0000 UTC m=+0.086170049 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, tcib_managed=true, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5)
Feb 20 07:59:09 np0005625204.localdomain podman[56732]: 2026-02-20 07:59:09.349013725 +0000 UTC m=+0.286461884 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Feb 20 07:59:09 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:59:09 np0005625204.localdomain sshd[56625]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 30296 ssh2 [preauth]
Feb 20 07:59:09 np0005625204.localdomain sshd[56625]: Disconnecting authenticating user root 185.246.128.171 port 30296: Too many authentication failures [preauth]
Feb 20 07:59:10 np0005625204.localdomain sshd[56759]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:10 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.19 scrub starts
Feb 20 07:59:10 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.19 scrub ok
Feb 20 07:59:11 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.19 deep-scrub starts
Feb 20 07:59:11 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.19 deep-scrub ok
Feb 20 07:59:12 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.11 scrub starts
Feb 20 07:59:12 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.11 scrub ok
Feb 20 07:59:12 np0005625204.localdomain sudo[56806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzfdzmdwucngmjqfslkhttszumworqdt ; /usr/bin/python3
Feb 20 07:59:12 np0005625204.localdomain sudo[56806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:13 np0005625204.localdomain python3[56808]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:13 np0005625204.localdomain sudo[56806]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:13 np0005625204.localdomain sudo[56849]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwvomdgmgwmzqgctbqeduwmflvechlsf ; /usr/bin/python3
Feb 20 07:59:13 np0005625204.localdomain sudo[56849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:13 np0005625204.localdomain python3[56851]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574352.7095907-92373-118933211119620/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=417007d20895a54571330144b727b714177f3d13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:13 np0005625204.localdomain sudo[56849]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:14 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.b scrub starts
Feb 20 07:59:14 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.b scrub ok
Feb 20 07:59:15 np0005625204.localdomain sshd[56866]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:15 np0005625204.localdomain sshd[56866]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 07:59:16 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.10 scrub starts
Feb 20 07:59:16 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.10 scrub ok
Feb 20 07:59:16 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.5 scrub starts
Feb 20 07:59:16 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 5.5 scrub ok
Feb 20 07:59:17 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.e scrub starts
Feb 20 07:59:17 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.e scrub ok
Feb 20 07:59:17 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.16 deep-scrub starts
Feb 20 07:59:17 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.16 deep-scrub ok
Feb 20 07:59:17 np0005625204.localdomain sudo[56913]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptmdmfwclfghysmpzlclzjosfsdmjsdj ; /usr/bin/python3
Feb 20 07:59:17 np0005625204.localdomain sudo[56913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:18 np0005625204.localdomain python3[56915]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:18 np0005625204.localdomain sudo[56913]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:18 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Feb 20 07:59:18 np0005625204.localdomain sshd[56759]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 60406 ssh2 [preauth]
Feb 20 07:59:18 np0005625204.localdomain sshd[56759]: Disconnecting authenticating user root 185.246.128.171 port 60406: Too many authentication failures [preauth]
Feb 20 07:59:18 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Feb 20 07:59:18 np0005625204.localdomain sudo[56956]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wphvbyftoovuxvagzlsolkmckyyogrjp ; /usr/bin/python3
Feb 20 07:59:18 np0005625204.localdomain sudo[56956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:18 np0005625204.localdomain python3[56958]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574357.8343172-92373-95668928802339/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=2a03ad5f1837679340274b70e67e768ad4c81335 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:18 np0005625204.localdomain sudo[56956]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:19 np0005625204.localdomain sshd[56973]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:22 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Feb 20 07:59:22 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Feb 20 07:59:22 np0005625204.localdomain sudo[57020]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcocpjgmhixoafroeddnyvacbghijpoa ; /usr/bin/python3
Feb 20 07:59:22 np0005625204.localdomain sudo[57020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:22 np0005625204.localdomain python3[57022]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:22 np0005625204.localdomain sudo[57020]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:22 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.7 scrub starts
Feb 20 07:59:22 np0005625204.localdomain sudo[57065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyghrnsrnilyowdqneexwkyrhapdcfsr ; /usr/bin/python3
Feb 20 07:59:22 np0005625204.localdomain sudo[57065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:23 np0005625204.localdomain python3[57067]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574362.3786244-92796-268830979843795/source _original_basename=tmpxxzpl50x follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:23 np0005625204.localdomain sudo[57065]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=43 pruub=15.158725739s) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active pruub 1191.458862305s@ mbc={}] start_peering_interval up [0,4,2] -> [0,4,2], acting [0,4,2] -> [0,4,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 43 pg[7.0( v 40'39 (0'0,40'39] local-lis/les=36/37 n=22 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=43 pruub=8.497297287s) [1,5,3] r=2 lpr=43 pi=[36,43)/1 luod=0'0 lua=38'37 crt=40'39 lcod 38'38 mlcod 0'0 active pruub 1180.491577148s@ mbc={}] start_peering_interval up [1,5,3] -> [1,5,3], acting [1,5,3] -> [1,5,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 43 pg[6.0( empty local-lis/les=35/36 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=43 pruub=15.158725739s) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown pruub 1191.458862305s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 43 pg[7.0( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=36/36 lis/c=36/36 les/c/f=37/37/0 sis=43 pruub=8.494708061s) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 lcod 38'38 mlcod 0'0 unknown NOTIFY pruub 1180.491577148s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:23 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.4 scrub starts
Feb 20 07:59:24 np0005625204.localdomain sudo[57127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-furxpiyfvdewqybnfrwjaysvutsgrzaq ; /usr/bin/python3
Feb 20 07:59:24 np0005625204.localdomain sudo[57127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.4 scrub ok
Feb 20 07:59:24 np0005625204.localdomain python3[57129]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:24 np0005625204.localdomain sudo[57127]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=35/36 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.c( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.d( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.4( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.2( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.3( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.5( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=2 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.f( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.e( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.9( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.b( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.7( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.8( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 44 pg[7.a( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=36/37 n=1 ec=43/36 lis/c=36/36 les/c/f=37/37/0 sis=43) [1,5,3] r=2 lpr=43 pi=[36,43)/1 crt=40'39 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.0( empty local-lis/les=43/44 n=0 ec=35/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 44 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=35/35 les/c/f=36/36/0 sis=43) [0,4,2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:24 np0005625204.localdomain sudo[57170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbtkcjtbahqizgdjjvgyaumemkntsljp ; /usr/bin/python3
Feb 20 07:59:24 np0005625204.localdomain sudo[57170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:24 np0005625204.localdomain python3[57172]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574363.9345176-92882-190388927568376/source _original_basename=tmpt7i83os7 follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:24 np0005625204.localdomain sudo[57170]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Feb 20 07:59:24 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Feb 20 07:59:25 np0005625204.localdomain sudo[57200]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyinlahtqnqtoqzeaxcbbjhshysahwvg ; /usr/bin/python3
Feb 20 07:59:25 np0005625204.localdomain sudo[57200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:25 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.f deep-scrub starts
Feb 20 07:59:25 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 4.f deep-scrub ok
Feb 20 07:59:25 np0005625204.localdomain python3[57202]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Feb 20 07:59:25 np0005625204.localdomain crontab[57203]: (root) LIST (root)
Feb 20 07:59:25 np0005625204.localdomain crontab[57204]: (root) REPLACE (root)
Feb 20 07:59:25 np0005625204.localdomain sudo[57200]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:25 np0005625204.localdomain sudo[57218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpdagssgasqxvcjpchmbffzvxjcnafbr ; /usr/bin/python3
Feb 20 07:59:25 np0005625204.localdomain sudo[57218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:25 np0005625204.localdomain python3[57220]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:59:25 np0005625204.localdomain sudo[57218]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:26 np0005625204.localdomain sudo[57268]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzdynxdbwmtfzzhlwsknhbiiddsjltpe ; /usr/bin/python3
Feb 20 07:59:26 np0005625204.localdomain sudo[57268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:26 np0005625204.localdomain sudo[57268]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:26 np0005625204.localdomain sudo[57286]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyqnsehsuakbvsujoyihpeuxttcglarr ; /usr/bin/python3
Feb 20 07:59:26 np0005625204.localdomain sudo[57286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:26 np0005625204.localdomain sudo[57286]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:26 np0005625204.localdomain sshd[56973]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 50078 ssh2 [preauth]
Feb 20 07:59:26 np0005625204.localdomain sshd[56973]: Disconnecting authenticating user root 185.246.128.171 port 50078: Too many authentication failures [preauth]
Feb 20 07:59:27 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.0 scrub starts
Feb 20 07:59:27 np0005625204.localdomain sudo[57390]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qliaxuontnrhehtdiigvwqcjsnmeqdri ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574366.7728791-92975-202100251770581/async_wrapper.py 986433874107 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574366.7728791-92975-202100251770581/AnsiballZ_command.py _
Feb 20 07:59:27 np0005625204.localdomain sudo[57390]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 07:59:27 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.0 scrub ok
Feb 20 07:59:27 np0005625204.localdomain ansible-async_wrapper.py[57392]: Invoked with 986433874107 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574366.7728791-92975-202100251770581/AnsiballZ_command.py _
Feb 20 07:59:27 np0005625204.localdomain ansible-async_wrapper.py[57395]: Starting module and watcher
Feb 20 07:59:27 np0005625204.localdomain ansible-async_wrapper.py[57395]: Start watching 57396 (3600)
Feb 20 07:59:27 np0005625204.localdomain ansible-async_wrapper.py[57396]: Start module (57396)
Feb 20 07:59:27 np0005625204.localdomain ansible-async_wrapper.py[57392]: Return async_wrapper task started.
Feb 20 07:59:27 np0005625204.localdomain sudo[57390]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:27 np0005625204.localdomain sudo[57414]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcupdxqtuulpedvfjsteogqxfprawodh ; /usr/bin/python3
Feb 20 07:59:27 np0005625204.localdomain sudo[57414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:27 np0005625204.localdomain sshd[57417]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:27 np0005625204.localdomain python3[57416]: ansible-ansible.legacy.async_status Invoked with jid=986433874107.57392 mode=status _async_dir=/tmp/.ansible_async
Feb 20 07:59:27 np0005625204.localdomain sudo[57414]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:28 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Feb 20 07:59:28 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1e deep-scrub starts
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 2.1e deep-scrub ok
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1f( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.c( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.6( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.985249519s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.985187531s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040893555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.4( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983015060s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040527344s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.982960701s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040527344s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983129501s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983056068s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040893555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983232498s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.041137695s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.983161926s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.041137695s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.982010841s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040039062s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981976509s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040039062s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981136322s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.039428711s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.1( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981098175s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.039428711s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981787682s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040039062s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.981742859s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040039062s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.f( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.14( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.b( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.11( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.13( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,2,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1d( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.980316162s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1189.040039062s@ mbc={}] start_peering_interval up [1,5,3] -> [4,2,3], acting [1,5,3] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.980286598s) [4,2,3] r=2 lpr=45 pi=[43,45)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1189.040039062s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970592499s) [0,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337646484s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,2], acting [0,4,2] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967342377s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.334350586s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.18( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970592499s) [0,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.337646484s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967270851s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.334350586s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970020294s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337402344s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969986916s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337402344s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.973694801s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.341186523s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.973631859s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.341186523s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969996452s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337646484s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.3( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969961166s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337646484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968859673s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.336669922s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968693733s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.336669922s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969968796s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337768555s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968120575s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.336059570s@ mbc={}] start_peering_interval up [0,4,2] -> [4,3,2], acting [0,4,2] -> [4,3,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.6( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969835281s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337768555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.7( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968057632s) [4,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.336059570s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970770836s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.338867188s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.9( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970770836s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.338867188s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.971567154s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.339965820s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,5], acting [0,4,2] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969411850s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337890625s@ mbc={}] start_peering_interval up [0,4,2] -> [4,2,0], acting [0,4,2] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970956802s) [3,1,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.339477539s@ mbc={}] start_peering_interval up [0,4,2] -> [3,1,2], acting [0,4,2] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.5( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969371796s) [4,2,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337890625s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.4( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.971508980s) [3,1,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.339965820s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970055580s) [4,0,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.338989258s@ mbc={}] start_peering_interval up [0,4,2] -> [4,0,2], acting [0,4,2] -> [4,0,2], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970895767s) [3,1,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.339477539s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.a( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970000267s) [4,0,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.338989258s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.972677231s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.341674805s@ mbc={}] start_peering_interval up [0,4,2] -> [3,4,5], acting [0,4,2] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967591286s) [0,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.336669922s@ mbc={}] start_peering_interval up [0,4,2] -> [0,1,5], acting [0,4,2] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.14( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.972595215s) [3,4,5] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.341674805s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968501091s) [2,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337646484s@ mbc={}] start_peering_interval up [0,4,2] -> [2,1,3], acting [0,4,2] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.16( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967591286s) [0,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.336669922s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968437195s) [2,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337646484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.968077660s) [3,5,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337402344s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,4], acting [0,4,2] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966302872s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.335571289s@ mbc={}] start_peering_interval up [0,4,2] -> [0,2,4], acting [0,4,2] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.10( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966302872s) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown pruub 1193.335571289s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.11( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967981339s) [3,5,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337402344s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965125084s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.334960938s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1f( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965083122s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.334960938s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965057373s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.334960938s@ mbc={}] start_peering_interval up [0,4,2] -> [4,5,0], acting [0,4,2] -> [4,5,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970343590s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.340209961s@ mbc={}] start_peering_interval up [0,4,2] -> [3,5,1], acting [0,4,2] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969474792s) [5,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.339355469s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,3], acting [0,4,2] -> [5,1,3], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1e( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.969427109s) [5,1,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.339355469s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970239639s) [3,5,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.340209961s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970639229s) [3,2,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.340698242s@ mbc={}] start_peering_interval up [0,4,2] -> [3,2,1], acting [0,4,2] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.15( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965029716s) [4,5,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.334960938s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.13( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970547676s) [3,2,1] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.340698242s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967091560s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337402344s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970276833s) [5,3,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.340698242s@ mbc={}] start_peering_interval up [0,4,2] -> [5,3,4], acting [0,4,2] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.2( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.967032433s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337402344s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1c( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.970193863s) [5,3,4] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.340698242s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965265274s) [5,4,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.335937500s@ mbc={}] start_peering_interval up [0,4,2] -> [5,4,0], acting [0,4,2] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.964879990s) [5,0,1] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.335815430s@ mbc={}] start_peering_interval up [0,4,2] -> [5,0,1], acting [0,4,2] -> [5,0,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966756821s) [1,2,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337768555s@ mbc={}] start_peering_interval up [0,4,2] -> [1,2,3], acting [0,4,2] -> [1,2,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966783524s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337890625s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.19( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966718674s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337890625s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.17( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.964653969s) [5,0,1] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.335815430s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.8( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966633797s) [1,2,3] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337768555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.12( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965171814s) [5,4,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.335937500s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966559410s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337890625s@ mbc={}] start_peering_interval up [0,4,2] -> [1,3,2], acting [0,4,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965885162s) [5,1,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active pruub 1193.337524414s@ mbc={}] start_peering_interval up [0,4,2] -> [5,1,0], acting [0,4,2] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.d( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.966259003s) [1,3,2] r=-1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337890625s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 45 pg[6.1b( empty local-lis/les=43/44 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45 pruub=10.965665817s) [5,1,0] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1193.337524414s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.12 scrub starts
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 3.12 scrub ok
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [2,1,3] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.2( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.7( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [4,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.e( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [4,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.19( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.8( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,2,3] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.d( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [1,3,2] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1e( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [5,1,3] r=2 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 45 pg[6.1c( empty local-lis/les=0/0 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [5,3,4] r=1 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.9( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.10( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,2,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.18( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 46 pg[6.16( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [0,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.1f( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.14( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.11( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,4] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.6( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,4,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.4( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.c( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,5] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.f( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.1d( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,5,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.b( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,1,2] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:30 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 46 pg[6.13( empty local-lis/les=45/46 n=0 ec=43/35 lis/c=43/43 les/c/f=44/44/0 sis=45) [3,2,1] r=0 lpr=45 pi=[43,45)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    (file: /etc/puppet/hiera.yaml)
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Warning: Undefined variable '::deploy_config_name';
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    (file & line not available)
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    (file & line not available)
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.12 seconds
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Notice: Applied catalog in 0.04 seconds
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Application:
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    Initial environment: production
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    Converged environment: production
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:          Run mode: user
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Changes:
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Events:
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Resources:
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:             Total: 10
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Time:
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:          Schedule: 0.00
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:              File: 0.00
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:              Exec: 0.01
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:            Augeas: 0.01
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    Transaction evaluation: 0.03
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    Catalog application: 0.04
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:    Config retrieval: 0.20
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:          Last run: 1771574371
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:        Filebucket: 0.00
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:             Total: 0.04
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]: Version:
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:            Config: 1771574371
Feb 20 07:59:31 np0005625204.localdomain puppet-user[57407]:            Puppet: 7.10.0
Feb 20 07:59:31 np0005625204.localdomain ansible-async_wrapper.py[57396]: Module complete (57396)
Feb 20 07:59:32 np0005625204.localdomain ansible-async_wrapper.py[57395]: Done in kid B.
Feb 20 07:59:33 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.7 deep-scrub starts
Feb 20 07:59:33 np0005625204.localdomain sshd[57417]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 37435 ssh2 [preauth]
Feb 20 07:59:33 np0005625204.localdomain sshd[57417]: Disconnecting authenticating user root 185.246.128.171 port 37435: Too many authentication failures [preauth]
Feb 20 07:59:34 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Feb 20 07:59:34 np0005625204.localdomain sshd[57530]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:35 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 4.7 deep-scrub ok
Feb 20 07:59:35 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.9 scrub starts
Feb 20 07:59:35 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Feb 20 07:59:35 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.9 scrub ok
Feb 20 07:59:35 np0005625204.localdomain sudo[57532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:59:35 np0005625204.localdomain sudo[57532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:59:35 np0005625204.localdomain sudo[57532]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:36 np0005625204.localdomain sudo[57547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 07:59:36 np0005625204.localdomain sudo[57547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:59:36 np0005625204.localdomain sudo[57547]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:36 np0005625204.localdomain sudo[57584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 07:59:36 np0005625204.localdomain sudo[57584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:59:36 np0005625204.localdomain sudo[57584]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:36 np0005625204.localdomain sudo[57599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 07:59:36 np0005625204.localdomain sudo[57599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:59:37 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.b scrub starts
Feb 20 07:59:37 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.b scrub ok
Feb 20 07:59:37 np0005625204.localdomain sudo[57599]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:37 np0005625204.localdomain sudo[57660]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhfsuotomxtzpjpbnzvxmvuklumsrnla ; /usr/bin/python3
Feb 20 07:59:37 np0005625204.localdomain sudo[57660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:37 np0005625204.localdomain sudo[57663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 07:59:37 np0005625204.localdomain sudo[57663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 07:59:37 np0005625204.localdomain sudo[57663]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:37 np0005625204.localdomain python3[57662]: ansible-ansible.legacy.async_status Invoked with jid=986433874107.57392 mode=status _async_dir=/tmp/.ansible_async
Feb 20 07:59:37 np0005625204.localdomain sudo[57660]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:38 np0005625204.localdomain sudo[57692]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duaitsgwacmzpjvjugmcgzerbycmebot ; /usr/bin/python3
Feb 20 07:59:38 np0005625204.localdomain sudo[57692]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:38 np0005625204.localdomain python3[57694]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:59:38 np0005625204.localdomain sudo[57692]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:38 np0005625204.localdomain sudo[57708]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chnpedeqhwrzqvufspzmpkontludotut ; /usr/bin/python3
Feb 20 07:59:38 np0005625204.localdomain sudo[57708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:38 np0005625204.localdomain sshd[57530]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 18940 ssh2 [preauth]
Feb 20 07:59:38 np0005625204.localdomain sshd[57530]: Disconnecting authenticating user root 185.246.128.171 port 18940: Too many authentication failures [preauth]
Feb 20 07:59:39 np0005625204.localdomain python3[57710]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:59:39 np0005625204.localdomain sudo[57708]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:39 np0005625204.localdomain sudo[57758]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqswjjnetvzbhynfhwudmfpfvjcheurg ; /usr/bin/python3
Feb 20 07:59:39 np0005625204.localdomain sudo[57758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 07:59:39 np0005625204.localdomain podman[57761]: 2026-02-20 07:59:39.476338519 +0000 UTC m=+0.083390313 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=metrics_qdr, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 07:59:39 np0005625204.localdomain python3[57760]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:39 np0005625204.localdomain sudo[57758]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800383568s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800305367s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800383568s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.040893555s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800305367s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.040893555s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800362587s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.040893555s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.800362587s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.040893555s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.798828125s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1197.039794922s@ mbc={}] start_peering_interval up [1,5,3] -> [3,5,1], acting [1,5,3] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:39 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 47 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47 pruub=8.798828125s) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1197.039794922s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:39 np0005625204.localdomain sudo[57804]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prcqmukxxywmhksrvjwjltprpunsrfrc ; /usr/bin/python3
Feb 20 07:59:39 np0005625204.localdomain sudo[57804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:39 np0005625204.localdomain podman[57761]: 2026-02-20 07:59:39.667851685 +0000 UTC m=+0.274903449 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 07:59:39 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 07:59:39 np0005625204.localdomain python3[57806]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmp9uhuy3f7 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 07:59:39 np0005625204.localdomain sudo[57804]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:40 np0005625204.localdomain sudo[57941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vizjmgkuygsmlbprewygdchaybszkspp ; /usr/bin/python3
Feb 20 07:59:40 np0005625204.localdomain sudo[57941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:40 np0005625204.localdomain sshd[57956]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:40 np0005625204.localdomain python3[57955]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:40 np0005625204.localdomain sudo[57941]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:40 np0005625204.localdomain sudo[57970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daikdkdwrtntiiotqqsbriwnxfrvlzzz ; /usr/bin/python3
Feb 20 07:59:40 np0005625204.localdomain sudo[57970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:40 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.2( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:40 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:40 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:40 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 48 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=47) [3,5,1] r=0 lpr=47 pi=[43,47)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:41 np0005625204.localdomain sudo[57970]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:41 np0005625204.localdomain sudo[58115]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhcqcjvrpqpauecqsmnfeyzhndpplprc ; /usr/bin/python3
Feb 20 07:59:41 np0005625204.localdomain sudo[58115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:41 np0005625204.localdomain python3[58117]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 20 07:59:41 np0005625204.localdomain sudo[58115]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.798413277s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.087524414s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.798413277s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.087524414s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797753334s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.086791992s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.802041054s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.091308594s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797753334s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.086791992s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.802041054s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.091308594s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797941208s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1203.087768555s@ mbc={}] start_peering_interval up [4,2,3] -> [3,4,2], acting [4,2,3] -> [3,4,2], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:41 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 49 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49 pruub=12.797941208s) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1203.087768555s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:41 np0005625204.localdomain rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:59:41 np0005625204.localdomain rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 07:59:42 np0005625204.localdomain sudo[58137]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxbuonfnadrrykbtkdpeougwrktpstsi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:42 np0005625204.localdomain sudo[58137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:42 np0005625204.localdomain python3[58139]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:42 np0005625204.localdomain sudo[58137]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:42 np0005625204.localdomain sudo[58153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtttzoimfvpcgjirvwhekligzzytnhgi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:42 np0005625204.localdomain sudo[58153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:42 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:42 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:42 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.3( v 40'39 (0'0,40'39] local-lis/les=49/50 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:42 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 50 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=49) [3,4,2] r=0 lpr=49 pi=[45,49)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:42 np0005625204.localdomain sudo[58153]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:43 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.2 scrub starts
Feb 20 07:59:43 np0005625204.localdomain sudo[58169]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oakbnbfxctxiexjestfxyyyxkcoalcde ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:43 np0005625204.localdomain sudo[58169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:43 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.10 scrub starts
Feb 20 07:59:43 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.2 scrub ok
Feb 20 07:59:43 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.10 scrub ok
Feb 20 07:59:43 np0005625204.localdomain python3[58171]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 07:59:43 np0005625204.localdomain sudo[58169]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:43 np0005625204.localdomain sudo[58219]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyrfjczghvjccgzioybbawgydrqnxcqb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:43 np0005625204.localdomain sudo[58219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:43 np0005625204.localdomain python3[58221]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:43 np0005625204.localdomain sudo[58219]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:43 np0005625204.localdomain sudo[58237]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iklxlroptbkdegikgqdpctatqakhjfgm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:43 np0005625204.localdomain sudo[58237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:44 np0005625204.localdomain python3[58239]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:44 np0005625204.localdomain sudo[58237]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:44 np0005625204.localdomain sudo[58299]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-holxpafggfdgdbfrntyeyjzgrglduxyo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:44 np0005625204.localdomain sudo[58299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:44 np0005625204.localdomain python3[58301]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:44 np0005625204.localdomain sudo[58299]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:44 np0005625204.localdomain sudo[58317]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tbywbpzeieniqmkdhvgvijhldcnxmebc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:44 np0005625204.localdomain sudo[58317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:44 np0005625204.localdomain python3[58319]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:44 np0005625204.localdomain sudo[58317]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:45 np0005625204.localdomain sudo[58379]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvjfedrjejooowyxdkanbspcmomtmmzb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:45 np0005625204.localdomain sudo[58379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:45 np0005625204.localdomain python3[58381]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:45 np0005625204.localdomain sudo[58379]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:45 np0005625204.localdomain sudo[58397]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lttmgxesfqpwsiklktvtlieltnkpuuaq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:45 np0005625204.localdomain sudo[58397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:45 np0005625204.localdomain python3[58399]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:45 np0005625204.localdomain sudo[58397]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:45 np0005625204.localdomain sudo[58459]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfxnmxujuhwlfywxrcfptvjjcugbyxmx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:45 np0005625204.localdomain sudo[58459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:46 np0005625204.localdomain python3[58461]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:46 np0005625204.localdomain sudo[58459]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:46 np0005625204.localdomain sudo[58477]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxfugbuqpksmfmslaycisryhhsmesrtc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:46 np0005625204.localdomain sudo[58477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:46 np0005625204.localdomain python3[58479]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:46 np0005625204.localdomain sudo[58477]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:46 np0005625204.localdomain sudo[58507]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bedwacroosovrhylsynsaotuvahejiwl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:46 np0005625204.localdomain sudo[58507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:46 np0005625204.localdomain python3[58509]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:59:46 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:59:47 np0005625204.localdomain systemd-sysv-generator[58539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:59:47 np0005625204.localdomain systemd-rc-local-generator[58534]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:59:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 07:59:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4236 writes, 19K keys, 4236 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4236 writes, 358 syncs, 11.83 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 978 writes, 3451 keys, 978 commit groups, 1.0 writes per commit group, ingest: 1.47 MB, 0.00 MB/s
                                                          Interval WAL: 978 writes, 213 syncs, 4.59 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 07:59:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:59:47 np0005625204.localdomain sudo[58507]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:47 np0005625204.localdomain sudo[58593]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imwjktoxtphlevijvabydeqszdyzpqdr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:47 np0005625204.localdomain sudo[58593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:47 np0005625204.localdomain python3[58595]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:47 np0005625204.localdomain sudo[58593]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:47 np0005625204.localdomain sudo[58611]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocelgaulzlihszsplzbkewvycrchhymp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:47 np0005625204.localdomain sudo[58611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:47 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.4 scrub starts
Feb 20 07:59:48 np0005625204.localdomain python3[58613]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:48 np0005625204.localdomain sudo[58611]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:48 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.4 scrub ok
Feb 20 07:59:48 np0005625204.localdomain sudo[58673]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzvkmrglfuxxuypdiyynwkcmekrwmkbz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:48 np0005625204.localdomain sudo[58673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:48 np0005625204.localdomain python3[58675]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 07:59:48 np0005625204.localdomain sudo[58673]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:48 np0005625204.localdomain sudo[58691]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucnruzvmfbpswpmqclnvlqtlignuoagz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:48 np0005625204.localdomain sudo[58691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:48 np0005625204.localdomain python3[58693]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:48 np0005625204.localdomain sudo[58691]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:48 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.1e scrub starts
Feb 20 07:59:49 np0005625204.localdomain sudo[58721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uinxmpokvpnhmuxlkjdrpocbgazhdfyf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:49 np0005625204.localdomain sudo[58721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.1e scrub ok
Feb 20 07:59:49 np0005625204.localdomain python3[58723]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 07:59:49 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 07:59:49 np0005625204.localdomain systemd-sysv-generator[58752]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 07:59:49 np0005625204.localdomain systemd-rc-local-generator[58748]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 07:59:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 07:59:49 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 07:59:49 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 07:59:49 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 07:59:49 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 07:59:49 np0005625204.localdomain sudo[58721]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.598536491s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1213.040771484s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.4( v 40'39 (0'0,40'39] local-lis/les=43/44 n=2 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.598453522s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1213.040771484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.597406387s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1213.040161133s@ mbc={}] start_peering_interval up [1,5,3] -> [0,1,2], acting [1,5,3] -> [0,1,2], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 51 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51 pruub=14.597302437s) [0,1,2] r=-1 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1213.040161133s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 51 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:49 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 51 pg[7.4( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:50 np0005625204.localdomain sudo[58780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiukbkiyjeiiovxvfidpldkdgxgryqpg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:50 np0005625204.localdomain sudo[58780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:50 np0005625204.localdomain sshd[57956]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 51736 ssh2 [preauth]
Feb 20 07:59:50 np0005625204.localdomain sshd[57956]: Disconnecting authenticating user root 185.246.128.171 port 51736: Too many authentication failures [preauth]
Feb 20 07:59:50 np0005625204.localdomain python3[58782]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 20 07:59:50 np0005625204.localdomain sudo[58780]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:50 np0005625204.localdomain sudo[58796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeexfjfzohtnabjymzficaichnrlkqbq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:50 np0005625204.localdomain sudo[58796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:50 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 52 pg[7.4( v 40'39 lc 38'15 (0'0,40'39] local-lis/les=51/52 n=4 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=4 mbc={255={(1+2)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:50 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 52 pg[7.c( v 40'39 lc 38'17 (0'0,40'39] local-lis/les=51/52 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=51) [0,1,2] r=0 lpr=51 pi=[43,51)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:51 np0005625204.localdomain sudo[58796]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:51 np0005625204.localdomain sshd[58824]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 07:59:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4907 writes, 22K keys, 4907 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4907 writes, 480 syncs, 10.22 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1520 writes, 5414 keys, 1520 commit groups, 1.0 writes per commit group, ingest: 2.10 MB, 0.00 MB/s
                                                          Interval WAL: 1520 writes, 282 syncs, 5.39 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.2e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 07:59:51 np0005625204.localdomain sudo[58838]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvnzvsznypmnvgqkudowmicdntnjgbrg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 07:59:51 np0005625204.localdomain sudo[58838]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:51 np0005625204.localdomain python3[58840]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 20 07:59:51 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.18 scrub starts
Feb 20 07:59:52 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.18 scrub ok
Feb 20 07:59:52 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.7 scrub starts
Feb 20 07:59:52 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.7 scrub ok
Feb 20 07:59:52 np0005625204.localdomain podman[58911]: 2026-02-20 07:59:52.271755511 +0000 UTC m=+0.086071216 container create 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute_init_log, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 07:59:52 np0005625204.localdomain podman[58917]: 2026-02-20 07:59:52.290279211 +0000 UTC m=+0.089876773 container create 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step2, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, container_name=nova_virtqemud_init_logs)
Feb 20 07:59:52 np0005625204.localdomain podman[58911]: 2026-02-20 07:59:52.22128577 +0000 UTC m=+0.035601555 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: Started libpod-conmon-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706.scope.
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: Started libpod-conmon-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb.scope.
Feb 20 07:59:52 np0005625204.localdomain podman[58917]: 2026-02-20 07:59:52.231494424 +0000 UTC m=+0.031092016 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:59:52 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 07:59:52 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Feb 20 07:59:52 np0005625204.localdomain podman[58911]: 2026-02-20 07:59:52.362704055 +0000 UTC m=+0.177019780 container init 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step2, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 20 07:59:52 np0005625204.localdomain podman[58911]: 2026-02-20 07:59:52.372761895 +0000 UTC m=+0.187077620 container start 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute_init_log, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510)
Feb 20 07:59:52 np0005625204.localdomain python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: libpod-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706.scope: Deactivated successfully.
Feb 20 07:59:52 np0005625204.localdomain podman[58917]: 2026-02-20 07:59:52.412388563 +0000 UTC m=+0.211986125 container init 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, tcib_managed=true, container_name=nova_virtqemud_init_logs, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=)
Feb 20 07:59:52 np0005625204.localdomain podman[58917]: 2026-02-20 07:59:52.42498127 +0000 UTC m=+0.224578832 container start 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_id=tripleo_step2, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 20 07:59:52 np0005625204.localdomain python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: libpod-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb.scope: Deactivated successfully.
Feb 20 07:59:52 np0005625204.localdomain podman[58949]: 2026-02-20 07:59:52.465833925 +0000 UTC m=+0.076298136 container died 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute_init_log, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13)
Feb 20 07:59:52 np0005625204.localdomain podman[58973]: 2026-02-20 07:59:52.492901797 +0000 UTC m=+0.049775761 container died 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, container_name=nova_virtqemud_init_logs, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']})
Feb 20 07:59:52 np0005625204.localdomain podman[58949]: 2026-02-20 07:59:52.59809722 +0000 UTC m=+0.208561381 container cleanup 4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step2, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute_init_log, distribution-scope=public, vcs-type=git)
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: libpod-conmon-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706.scope: Deactivated successfully.
Feb 20 07:59:52 np0005625204.localdomain podman[58979]: 2026-02-20 07:59:52.618061844 +0000 UTC m=+0.167034695 container cleanup 823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, container_name=nova_virtqemud_init_logs, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step2, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 07:59:52 np0005625204.localdomain systemd[1]: libpod-conmon-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb.scope: Deactivated successfully.
Feb 20 07:59:53 np0005625204.localdomain podman[59101]: 2026-02-20 07:59:53.022176783 +0000 UTC m=+0.078846824 container create 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git)
Feb 20 07:59:53 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.16 deep-scrub starts
Feb 20 07:59:53 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 6.16 deep-scrub ok
Feb 20 07:59:53 np0005625204.localdomain podman[59102]: 2026-02-20 07:59:53.057779667 +0000 UTC m=+0.106358970 container create f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, tcib_managed=true, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: Started libpod-conmon-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope.
Feb 20 07:59:53 np0005625204.localdomain podman[59101]: 2026-02-20 07:59:52.978011405 +0000 UTC m=+0.034681456 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:59:53 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: Started libpod-conmon-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope.
Feb 20 07:59:53 np0005625204.localdomain podman[59102]: 2026-02-20 07:59:52.994074419 +0000 UTC m=+0.042653732 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 07:59:53 np0005625204.localdomain podman[59101]: 2026-02-20 07:59:53.097938391 +0000 UTC m=+0.154608422 container init 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, container_name=create_virtlogd_wrapper, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13)
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 07:59:53 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 07:59:53 np0005625204.localdomain podman[59101]: 2026-02-20 07:59:53.109715403 +0000 UTC m=+0.166385434 container start 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step2, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=create_virtlogd_wrapper, batch=17.1_20260112.1)
Feb 20 07:59:53 np0005625204.localdomain podman[59101]: 2026-02-20 07:59:53.110082884 +0000 UTC m=+0.166752975 container attach 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=create_virtlogd_wrapper, tcib_managed=true, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step2, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T23:31:49Z)
Feb 20 07:59:53 np0005625204.localdomain podman[59102]: 2026-02-20 07:59:53.118933075 +0000 UTC m=+0.167512358 container init f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step2, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 20 07:59:53 np0005625204.localdomain podman[59102]: 2026-02-20 07:59:53.128075277 +0000 UTC m=+0.176654560 container start f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=create_haproxy_wrapper, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step2, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20260112.1)
Feb 20 07:59:53 np0005625204.localdomain podman[59102]: 2026-02-20 07:59:53.128547031 +0000 UTC m=+0.177126314 container attach f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=create_haproxy_wrapper, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865-merged.mount: Deactivated successfully.
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-823c3a117af8f3d0ec529c30f1e497e563ce771adce3f15d5fc5550123693ecb-userdata-shm.mount: Deactivated successfully.
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e-merged.mount: Deactivated successfully.
Feb 20 07:59:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ca2944dade8c8f5021760e66f29ffe531fb453aa8d941b8dd06319840473706-userdata-shm.mount: Deactivated successfully.
Feb 20 07:59:54 np0005625204.localdomain ovs-vsctl[59203]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Feb 20 07:59:54 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.18 scrub starts
Feb 20 07:59:54 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 3.18 scrub ok
Feb 20 07:59:55 np0005625204.localdomain systemd[1]: libpod-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope: Deactivated successfully.
Feb 20 07:59:55 np0005625204.localdomain systemd[1]: libpod-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope: Consumed 2.082s CPU time.
Feb 20 07:59:55 np0005625204.localdomain podman[59101]: 2026-02-20 07:59:55.186747573 +0000 UTC m=+2.243417604 container died 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, container_name=create_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., 
distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 07:59:55 np0005625204.localdomain systemd[1]: tmp-crun.LNP1Ln.mount: Deactivated successfully.
Feb 20 07:59:55 np0005625204.localdomain podman[59353]: 2026-02-20 07:59:55.263077399 +0000 UTC m=+0.063902585 container cleanup 8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, config_id=tripleo_step2, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-type=git, 
com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 20 07:59:55 np0005625204.localdomain systemd[1]: libpod-conmon-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452.scope: Deactivated successfully.
Feb 20 07:59:55 np0005625204.localdomain python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Feb 20 07:59:55 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.464720726s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1219.088256836s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:55 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.464027405s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1219.087646484s@ mbc={}] start_peering_interval up [4,2,3] -> [2,0,4], acting [4,2,3] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:55 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.464625359s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1219.088256836s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:55 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 53 pg[7.5( v 40'39 (0'0,40'39] local-lis/les=45/46 n=2 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53 pruub=14.463951111s) [2,0,4] r=-1 lpr=53 pi=[45,53)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1219.087646484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de-merged.mount: Deactivated successfully.
Feb 20 07:59:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8aa7f75f4a6f084d0253fd3186e46abbabfa047ab8d8709a13de3f3b8e270452-userdata-shm.mount: Deactivated successfully.
Feb 20 07:59:56 np0005625204.localdomain sshd[58824]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 56741 ssh2 [preauth]
Feb 20 07:59:56 np0005625204.localdomain sshd[58824]: Disconnecting authenticating user root 185.246.128.171 port 56741: Too many authentication failures [preauth]
Feb 20 07:59:56 np0005625204.localdomain systemd[1]: libpod-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope: Deactivated successfully.
Feb 20 07:59:56 np0005625204.localdomain systemd[1]: libpod-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope: Consumed 2.170s CPU time.
Feb 20 07:59:56 np0005625204.localdomain podman[59397]: 2026-02-20 07:59:56.870133475 +0000 UTC m=+0.049372957 container died f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=create_haproxy_wrapper, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true)
Feb 20 07:59:56 np0005625204.localdomain systemd[1]: tmp-crun.TEbLEa.mount: Deactivated successfully.
Feb 20 07:59:56 np0005625204.localdomain podman[59397]: 2026-02-20 07:59:56.907618098 +0000 UTC m=+0.086857550 container cleanup f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step2, container_name=create_haproxy_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 07:59:56 np0005625204.localdomain systemd[1]: libpod-conmon-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124.scope: Deactivated successfully.
Feb 20 07:59:56 np0005625204.localdomain python3[58840]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Feb 20 07:59:57 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 53 pg[7.d( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53) [2,0,4] r=1 lpr=53 pi=[45,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:57 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 53 pg[7.5( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=53) [2,0,4] r=1 lpr=53 pi=[45,53)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:57 np0005625204.localdomain sudo[58838]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb-merged.mount: Deactivated successfully.
Feb 20 07:59:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f03f93f8e63c86a231adafb64e90e260c7e98163089e4d1c2562669770258124-userdata-shm.mount: Deactivated successfully.
Feb 20 07:59:57 np0005625204.localdomain sudo[59447]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmhjqwysztczsxtvqdsmxepvrovflceq ; /usr/bin/python3
Feb 20 07:59:57 np0005625204.localdomain sudo[59447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:57 np0005625204.localdomain python3[59449]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:57 np0005625204.localdomain sudo[59447]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:58 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 55 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:58 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 55 pg[7.6( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 07:59:58 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583439827s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1221.292724609s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:58 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583265305s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1221.292602539s@ mbc={}] start_peering_interval up [3,5,1] -> [0,4,5], acting [3,5,1] -> [0,4,5], acting_primary 3 -> 0, up_primary 3 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 07:59:58 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.6( v 40'39 (0'0,40'39] local-lis/les=47/48 n=2 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583332062s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.292724609s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:58 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 55 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55 pruub=14.583180428s) [0,4,5] r=-1 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1221.292602539s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 07:59:58 np0005625204.localdomain sudo[59495]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acpesoygmfhguzzkuflobouaorduhrvh ; /usr/bin/python3
Feb 20 07:59:58 np0005625204.localdomain sudo[59495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:58 np0005625204.localdomain sudo[59495]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:58 np0005625204.localdomain sshd[59498]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 07:59:58 np0005625204.localdomain sudo[59539]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfhzesdnfbuqiofhijqzuxajqniufqts ; /usr/bin/python3
Feb 20 07:59:58 np0005625204.localdomain sudo[59539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:58 np0005625204.localdomain sudo[59539]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:59 np0005625204.localdomain sudo[59569]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eltngmtjbmfuztpmagzkickygnejxhwr ; /usr/bin/python3
Feb 20 07:59:59 np0005625204.localdomain sudo[59569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:59 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 56 pg[7.e( v 40'39 lc 38'19 (0'0,40'39] local-lis/les=55/56 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:59 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 56 pg[7.6( v 40'39 lc 0'0 (0'0,40'39] local-lis/les=55/56 n=2 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=55) [0,4,5] r=0 lpr=55 pi=[47,55)/1 crt=40'39 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 07:59:59 np0005625204.localdomain python3[59571]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005625204 step=2 update_config_hash_only=False
Feb 20 07:59:59 np0005625204.localdomain sudo[59569]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:59 np0005625204.localdomain sudo[59586]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgobujkfrjxpruyldfrxmcqgrtwnfjay ; /usr/bin/python3
Feb 20 07:59:59 np0005625204.localdomain sudo[59586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 07:59:59 np0005625204.localdomain python3[59588]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 07:59:59 np0005625204.localdomain sudo[59586]: pam_unix(sudo:session): session closed for user root
Feb 20 07:59:59 np0005625204.localdomain sudo[59602]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoxegnfdnejfxoikbllbfttmzqhcjgbl ; /usr/bin/python3
Feb 20 07:59:59 np0005625204.localdomain sudo[59602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:00:00 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600752831s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1223.322021484s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:00 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.7( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600689888s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.322021484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:00 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600103378s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1223.322143555s@ mbc={}] start_peering_interval up [3,4,2] -> [1,5,3], acting [3,4,2] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:00 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 57 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=57 pruub=14.600008965s) [1,5,3] r=2 lpr=57 pi=[49,57)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1223.322143555s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:00 np0005625204.localdomain python3[59604]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 20 08:00:00 np0005625204.localdomain sudo[59602]: pam_unix(sudo:session): session closed for user root
Feb 20 08:00:01 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.b scrub starts
Feb 20 08:00:01 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.b scrub ok
Feb 20 08:00:02 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.6 scrub starts
Feb 20 08:00:02 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 59 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=10.256752968s) [3,4,5] r=0 lpr=59 pi=[43,59)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1221.042114258s@ mbc={}] start_peering_interval up [1,5,3] -> [3,4,5], acting [1,5,3] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:02 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 59 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=43/44 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=59 pruub=10.256752968s) [3,4,5] r=0 lpr=59 pi=[43,59)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1221.042114258s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 08:00:02 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.6 scrub ok
Feb 20 08:00:03 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 60 pg[7.8( v 40'39 (0'0,40'39] local-lis/les=59/60 n=1 ec=43/36 lis/c=43/43 les/c/f=44/44/0 sis=59) [3,4,5] r=0 lpr=59 pi=[43,59)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 08:00:03 np0005625204.localdomain sshd[59605]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:03 np0005625204.localdomain sshd[59605]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:00:06 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.e deep-scrub starts
Feb 20 08:00:06 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.13 scrub starts
Feb 20 08:00:06 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.e deep-scrub ok
Feb 20 08:00:06 np0005625204.localdomain sshd[59498]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 36674 ssh2 [preauth]
Feb 20 08:00:06 np0005625204.localdomain sshd[59498]: Disconnecting authenticating user root 185.246.128.171 port 36674: Too many authentication failures [preauth]
Feb 20 08:00:06 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.13 scrub ok
Feb 20 08:00:07 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1f scrub starts
Feb 20 08:00:07 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1f scrub ok
Feb 20 08:00:07 np0005625204.localdomain sshd[59607]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:09 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 61 pg[7.9( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61) [0,2,4] r=0 lpr=61 pi=[45,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 08:00:09 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 61 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=8.968554497s) [0,2,4] r=-1 lpr=61 pi=[45,61)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1227.091918945s@ mbc={}] start_peering_interval up [4,2,3] -> [0,2,4], acting [4,2,3] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:09 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 61 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=45/46 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61 pruub=8.968462944s) [0,2,4] r=-1 lpr=61 pi=[45,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1227.091918945s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:10 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.c scrub starts
Feb 20 08:00:10 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.c scrub ok
Feb 20 08:00:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:00:10 np0005625204.localdomain podman[59609]: 2026-02-20 08:00:10.163666197 +0000 UTC m=+0.088706158 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, release=1766032510, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 20 08:00:10 np0005625204.localdomain podman[59609]: 2026-02-20 08:00:10.377945861 +0000 UTC m=+0.302985832 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:00:10 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:00:10 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 62 pg[7.9( v 40'39 (0'0,40'39] local-lis/les=61/62 n=1 ec=43/36 lis/c=45/45 les/c/f=46/46/0 sis=61) [0,2,4] r=0 lpr=61 pi=[45,61)/1 crt=40'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 08:00:11 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.6 scrub starts
Feb 20 08:00:11 np0005625204.localdomain sshd[59607]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 32787 ssh2 [preauth]
Feb 20 08:00:11 np0005625204.localdomain sshd[59607]: Disconnecting authenticating user root 185.246.128.171 port 32787: Too many authentication failures [preauth]
Feb 20 08:00:11 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.6 scrub ok
Feb 20 08:00:12 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.9 scrub starts
Feb 20 08:00:12 np0005625204.localdomain ceph-osd[32226]: log_channel(cluster) log [DBG] : 7.9 scrub ok
Feb 20 08:00:12 np0005625204.localdomain sshd[59639]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:15 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 63 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=63 pruub=13.070129395s) [2,0,4] r=-1 lpr=63 pi=[47,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1237.293090820s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,4], acting [3,5,1] -> [2,0,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:15 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 63 pg[7.a( v 40'39 (0'0,40'39] local-lis/les=47/48 n=1 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=63 pruub=13.069807053s) [2,0,4] r=-1 lpr=63 pi=[47,63)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1237.293090820s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:16 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.f scrub starts
Feb 20 08:00:16 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.f scrub ok
Feb 20 08:00:16 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 63 pg[7.a( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=47/47 les/c/f=48/48/0 sis=63) [2,0,4] r=1 lpr=63 pi=[47,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:17 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 65 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=13.054226875s) [3,1,2] r=0 lpr=65 pi=[49,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1239.326904297s@ mbc={}] start_peering_interval up [3,4,2] -> [3,1,2], acting [3,4,2] -> [3,1,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:17 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 65 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=49/50 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=65 pruub=13.054226875s) [3,1,2] r=0 lpr=65 pi=[49,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown pruub 1239.326904297s@ mbc={}] state<Start>: transitioning to Primary
Feb 20 08:00:17 np0005625204.localdomain sshd[59639]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 59575 ssh2 [preauth]
Feb 20 08:00:17 np0005625204.localdomain sshd[59639]: Disconnecting authenticating user root 185.246.128.171 port 59575: Too many authentication failures [preauth]
Feb 20 08:00:18 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.11 scrub starts
Feb 20 08:00:18 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.11 scrub ok
Feb 20 08:00:18 np0005625204.localdomain sshd[59641]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:19 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 66 pg[7.b( v 40'39 (0'0,40'39] local-lis/les=65/66 n=1 ec=43/36 lis/c=49/49 les/c/f=50/50/0 sis=65) [3,1,2] r=0 lpr=65 pi=[49,65)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 08:00:19 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 67 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=51/52 n=1 ec=43/36 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=11.157955170s) [1,3,2] r=-1 lpr=67 pi=[51,67)/1 crt=40'39 mlcod 0'0 active pruub 1243.806396484s@ mbc={255={}}] start_peering_interval up [0,1,2] -> [1,3,2], acting [0,1,2] -> [1,3,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:19 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 67 pg[7.c( v 40'39 (0'0,40'39] local-lis/les=51/52 n=1 ec=43/36 lis/c=51/51 les/c/f=52/52/0 sis=67 pruub=11.157869339s) [1,3,2] r=-1 lpr=67 pi=[51,67)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1243.806396484s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:20 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 67 pg[7.c( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=51/51 les/c/f=52/52/0 sis=67) [1,3,2] r=1 lpr=67 pi=[51,67)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:21 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 69 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=43/36 lis/c=53/53 les/c/f=54/54/0 sis=69 pruub=15.295172691s) [1,3,5] r=-1 lpr=69 pi=[53,69)/1 luod=0'0 crt=40'39 mlcod 0'0 active pruub 1249.991577148s@ mbc={}] start_peering_interval up [2,0,4] -> [1,3,5], acting [2,0,4] -> [1,3,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:21 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 69 pg[7.d( v 40'39 (0'0,40'39] local-lis/les=53/54 n=1 ec=43/36 lis/c=53/53 les/c/f=54/54/0 sis=69 pruub=15.295073509s) [1,3,5] r=-1 lpr=69 pi=[53,69)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1249.991577148s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:22 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 69 pg[7.d( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=53/53 les/c/f=54/54/0 sis=69) [1,3,5] r=1 lpr=69 pi=[53,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:23 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.4 scrub starts
Feb 20 08:00:23 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.4 scrub ok
Feb 20 08:00:24 np0005625204.localdomain sshd[59641]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 33821 ssh2 [preauth]
Feb 20 08:00:24 np0005625204.localdomain sshd[59641]: Disconnecting authenticating user root 185.246.128.171 port 33821: Too many authentication failures [preauth]
Feb 20 08:00:24 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.14 scrub starts
Feb 20 08:00:24 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.14 scrub ok
Feb 20 08:00:25 np0005625204.localdomain sshd[59643]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:26 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1d scrub starts
Feb 20 08:00:26 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 6.1d scrub ok
Feb 20 08:00:28 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.2 scrub starts
Feb 20 08:00:28 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.2 scrub ok
Feb 20 08:00:29 np0005625204.localdomain sshd[59643]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 13075 ssh2 [preauth]
Feb 20 08:00:29 np0005625204.localdomain sshd[59643]: Disconnecting authenticating user root 185.246.128.171 port 13075: Too many authentication failures [preauth]
Feb 20 08:00:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 71 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71 pruub=9.173424721s) [3,5,1] r=-1 lpr=71 pi=[55,71)/1 crt=40'39 mlcod 0'0 active pruub 1252.020019531s@ mbc={255={}}] start_peering_interval up [0,4,5] -> [3,5,1], acting [0,4,5] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:29 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 71 pg[7.e( v 40'39 (0'0,40'39] local-lis/les=55/56 n=1 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71 pruub=9.173316956s) [3,5,1] r=-1 lpr=71 pi=[55,71)/1 crt=40'39 mlcod 0'0 unknown NOTIFY pruub 1252.020019531s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:29 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 71 pg[7.e( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71) [3,5,1] r=0 lpr=71 pi=[55,71)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 08:00:30 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.3 scrub starts
Feb 20 08:00:30 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.3 scrub ok
Feb 20 08:00:31 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Feb 20 08:00:31 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.8 scrub ok
Feb 20 08:00:31 np0005625204.localdomain sshd[59645]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 72 pg[7.e( v 40'39 lc 38'19 (0'0,40'39] local-lis/les=71/72 n=1 ec=43/36 lis/c=55/55 les/c/f=56/56/0 sis=71) [3,5,1] r=0 lpr=71 pi=[55,71)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 08:00:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 73 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73 pruub=9.211073875s) [0,5,1] r=-1 lpr=73 pi=[57,73)/1 luod=0'0 crt=40'39 lcod 0'0 mlcod 0'0 active pruub 1249.801635742s@ mbc={}] start_peering_interval up [1,5,3] -> [0,5,1], acting [1,5,3] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 20 08:00:31 np0005625204.localdomain ceph-osd[33177]: osd.3 pg_epoch: 73 pg[7.f( v 40'39 (0'0,40'39] local-lis/les=57/58 n=1 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73 pruub=9.210969925s) [0,5,1] r=-1 lpr=73 pi=[57,73)/1 crt=40'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1249.801635742s@ mbc={}] state<Start>: transitioning to Stray
Feb 20 08:00:31 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 73 pg[7.f( empty local-lis/les=0/0 n=0 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73) [0,5,1] r=0 lpr=73 pi=[57,73)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 20 08:00:32 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.b scrub starts
Feb 20 08:00:32 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 7.b scrub ok
Feb 20 08:00:33 np0005625204.localdomain ceph-osd[32226]: osd.0 pg_epoch: 74 pg[7.f( v 40'39 lc 38'1 (0'0,40'39] local-lis/les=73/74 n=3 ec=43/36 lis/c=57/57 les/c/f=58/58/0 sis=73) [0,5,1] r=0 lpr=73 pi=[57,73)/1 crt=40'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(2+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 20 08:00:33 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Feb 20 08:00:34 np0005625204.localdomain sshd[59645]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 49559 ssh2 [preauth]
Feb 20 08:00:34 np0005625204.localdomain sshd[59645]: Disconnecting authenticating user root 185.246.128.171 port 49559: Too many authentication failures [preauth]
Feb 20 08:00:36 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.17 scrub starts
Feb 20 08:00:36 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.17 scrub ok
Feb 20 08:00:37 np0005625204.localdomain sshd[59647]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:37 np0005625204.localdomain sudo[59648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:00:37 np0005625204.localdomain sudo[59648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:00:37 np0005625204.localdomain sudo[59648]: pam_unix(sudo:session): session closed for user root
Feb 20 08:00:38 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub starts
Feb 20 08:00:38 np0005625204.localdomain sudo[59663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:00:38 np0005625204.localdomain sudo[59663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:00:38 np0005625204.localdomain ceph-osd[33177]: log_channel(cluster) log [DBG] : 5.1d scrub ok
Feb 20 08:00:38 np0005625204.localdomain sudo[59663]: pam_unix(sudo:session): session closed for user root
Feb 20 08:00:39 np0005625204.localdomain sudo[59710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:00:39 np0005625204.localdomain sudo[59710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:00:39 np0005625204.localdomain sudo[59710]: pam_unix(sudo:session): session closed for user root
Feb 20 08:00:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:00:41 np0005625204.localdomain podman[59725]: 2026-02-20 08:00:41.129898205 +0000 UTC m=+0.068987331 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, release=1766032510)
Feb 20 08:00:41 np0005625204.localdomain podman[59725]: 2026-02-20 08:00:41.353390703 +0000 UTC m=+0.292479879 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, version=17.1.13, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 20 08:00:41 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:00:43 np0005625204.localdomain sshd[59647]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 25398 ssh2 [preauth]
Feb 20 08:00:43 np0005625204.localdomain sshd[59647]: Disconnecting authenticating user root 185.246.128.171 port 25398: Too many authentication failures [preauth]
Feb 20 08:00:46 np0005625204.localdomain sshd[59754]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:52 np0005625204.localdomain sshd[59756]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:52 np0005625204.localdomain sshd[59756]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:00:53 np0005625204.localdomain sshd[59754]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 15908 ssh2 [preauth]
Feb 20 08:00:53 np0005625204.localdomain sshd[59754]: Disconnecting authenticating user root 185.246.128.171 port 15908: Too many authentication failures [preauth]
Feb 20 08:00:54 np0005625204.localdomain sshd[59758]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:00:57 np0005625204.localdomain sshd[59758]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 2539 ssh2 [preauth]
Feb 20 08:00:57 np0005625204.localdomain sshd[59758]: Disconnecting authenticating user root 185.246.128.171 port 2539: Too many authentication failures [preauth]
Feb 20 08:00:58 np0005625204.localdomain sshd[59760]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:01 np0005625204.localdomain CROND[59763]: (root) CMD (run-parts /etc/cron.hourly)
Feb 20 08:01:01 np0005625204.localdomain run-parts[59766]: (/etc/cron.hourly) starting 0anacron
Feb 20 08:01:01 np0005625204.localdomain run-parts[59772]: (/etc/cron.hourly) finished 0anacron
Feb 20 08:01:01 np0005625204.localdomain CROND[59762]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 20 08:01:04 np0005625204.localdomain sshd[59760]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 31928 ssh2 [preauth]
Feb 20 08:01:04 np0005625204.localdomain sshd[59760]: Disconnecting authenticating user root 185.246.128.171 port 31928: Too many authentication failures [preauth]
Feb 20 08:01:05 np0005625204.localdomain sshd[59773]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:06 np0005625204.localdomain sshd[59773]: Invalid user n8n from 178.217.173.50 port 42812
Feb 20 08:01:06 np0005625204.localdomain sshd[59773]: Received disconnect from 178.217.173.50 port 42812:11: Bye Bye [preauth]
Feb 20 08:01:06 np0005625204.localdomain sshd[59773]: Disconnected from invalid user n8n 178.217.173.50 port 42812 [preauth]
Feb 20 08:01:07 np0005625204.localdomain sshd[59775]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:09 np0005625204.localdomain sshd[59777]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:10 np0005625204.localdomain sshd[59777]: Received disconnect from 101.36.109.176 port 51552:11: Bye Bye [preauth]
Feb 20 08:01:10 np0005625204.localdomain sshd[59777]: Disconnected from authenticating user root 101.36.109.176 port 51552 [preauth]
Feb 20 08:01:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:01:12 np0005625204.localdomain podman[59779]: 2026-02-20 08:01:12.147854851 +0000 UTC m=+0.087136372 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:01:12 np0005625204.localdomain podman[59779]: 2026-02-20 08:01:12.358179262 +0000 UTC m=+0.297460723 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:01:12 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:01:13 np0005625204.localdomain sshd[59775]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 21698 ssh2 [preauth]
Feb 20 08:01:13 np0005625204.localdomain sshd[59775]: Disconnecting authenticating user root 185.246.128.171 port 21698: Too many authentication failures [preauth]
Feb 20 08:01:14 np0005625204.localdomain sshd[59808]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:20 np0005625204.localdomain sshd[59808]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 10708 ssh2 [preauth]
Feb 20 08:01:20 np0005625204.localdomain sshd[59808]: Disconnecting authenticating user root 185.246.128.171 port 10708: Too many authentication failures [preauth]
Feb 20 08:01:22 np0005625204.localdomain sshd[59810]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:26 np0005625204.localdomain sshd[59810]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 59787 ssh2 [preauth]
Feb 20 08:01:26 np0005625204.localdomain sshd[59810]: Disconnecting authenticating user root 185.246.128.171 port 59787: Too many authentication failures [preauth]
Feb 20 08:01:29 np0005625204.localdomain sshd[59812]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:34 np0005625204.localdomain sshd[59812]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 40702 ssh2 [preauth]
Feb 20 08:01:34 np0005625204.localdomain sshd[59812]: Disconnecting authenticating user root 185.246.128.171 port 40702: Too many authentication failures [preauth]
Feb 20 08:01:35 np0005625204.localdomain sshd[59814]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:39 np0005625204.localdomain sudo[59816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:01:39 np0005625204.localdomain sudo[59816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:01:39 np0005625204.localdomain sudo[59816]: pam_unix(sudo:session): session closed for user root
Feb 20 08:01:39 np0005625204.localdomain sudo[59831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 08:01:39 np0005625204.localdomain sudo[59831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:01:40 np0005625204.localdomain sshd[59929]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:40 np0005625204.localdomain systemd[1]: tmp-crun.dq5Nqp.mount: Deactivated successfully.
Feb 20 08:01:40 np0005625204.localdomain podman[59919]: 2026-02-20 08:01:40.368821187 +0000 UTC m=+0.090244654 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1770267347, build-date=2026-02-09T10:25:24Z)
Feb 20 08:01:40 np0005625204.localdomain podman[59919]: 2026-02-20 08:01:40.472019785 +0000 UTC m=+0.193443312 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, release=1770267347, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 08:01:40 np0005625204.localdomain sshd[59929]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:01:40 np0005625204.localdomain sudo[59831]: pam_unix(sudo:session): session closed for user root
Feb 20 08:01:40 np0005625204.localdomain sudo[59986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:01:40 np0005625204.localdomain sudo[59986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:01:40 np0005625204.localdomain sudo[59986]: pam_unix(sudo:session): session closed for user root
Feb 20 08:01:40 np0005625204.localdomain sudo[60001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:01:40 np0005625204.localdomain sudo[60001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:01:41 np0005625204.localdomain sshd[59814]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 20984 ssh2 [preauth]
Feb 20 08:01:41 np0005625204.localdomain sshd[59814]: Disconnecting authenticating user root 185.246.128.171 port 20984: Too many authentication failures [preauth]
Feb 20 08:01:41 np0005625204.localdomain sudo[60001]: pam_unix(sudo:session): session closed for user root
Feb 20 08:01:42 np0005625204.localdomain sudo[60047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:01:42 np0005625204.localdomain sudo[60047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:01:42 np0005625204.localdomain sudo[60047]: pam_unix(sudo:session): session closed for user root
Feb 20 08:01:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:01:43 np0005625204.localdomain sshd[60073]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:43 np0005625204.localdomain podman[60062]: 2026-02-20 08:01:43.134791317 +0000 UTC m=+0.074741675 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, url=https://www.redhat.com)
Feb 20 08:01:43 np0005625204.localdomain podman[60062]: 2026-02-20 08:01:43.329180006 +0000 UTC m=+0.269130374 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 20 08:01:43 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:01:48 np0005625204.localdomain sshd[60073]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 6830 ssh2 [preauth]
Feb 20 08:01:48 np0005625204.localdomain sshd[60073]: Disconnecting authenticating user root 185.246.128.171 port 6830: Too many authentication failures [preauth]
Feb 20 08:01:49 np0005625204.localdomain sshd[60093]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:01:53 np0005625204.localdomain sshd[60093]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 52713 ssh2 [preauth]
Feb 20 08:01:53 np0005625204.localdomain sshd[60093]: Disconnecting authenticating user root 185.246.128.171 port 52713: Too many authentication failures [preauth]
Feb 20 08:01:55 np0005625204.localdomain sshd[60095]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:03 np0005625204.localdomain sshd[60095]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 24850 ssh2 [preauth]
Feb 20 08:02:03 np0005625204.localdomain sshd[60095]: Disconnecting authenticating user root 185.246.128.171 port 24850: Too many authentication failures [preauth]
Feb 20 08:02:03 np0005625204.localdomain sshd[60097]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:07 np0005625204.localdomain sshd[60097]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 20586 ssh2 [preauth]
Feb 20 08:02:07 np0005625204.localdomain sshd[60097]: Disconnecting authenticating user root 185.246.128.171 port 20586: Too many authentication failures [preauth]
Feb 20 08:02:09 np0005625204.localdomain sshd[60099]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:02:14 np0005625204.localdomain podman[60101]: 2026-02-20 08:02:14.133849401 +0000 UTC m=+0.077437905 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:02:14 np0005625204.localdomain podman[60101]: 2026-02-20 08:02:14.315485242 +0000 UTC m=+0.259073806 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:02:14 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:02:21 np0005625204.localdomain sshd[60099]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 61382 ssh2 [preauth]
Feb 20 08:02:21 np0005625204.localdomain sshd[60099]: Disconnecting authenticating user root 185.246.128.171 port 61382: Too many authentication failures [preauth]
Feb 20 08:02:23 np0005625204.localdomain sshd[60131]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:27 np0005625204.localdomain sshd[60133]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:27 np0005625204.localdomain sshd[60133]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:02:28 np0005625204.localdomain sshd[60131]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 25454 ssh2 [preauth]
Feb 20 08:02:28 np0005625204.localdomain sshd[60131]: Disconnecting authenticating user root 185.246.128.171 port 25454: Too many authentication failures [preauth]
Feb 20 08:02:29 np0005625204.localdomain sshd[60135]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:40 np0005625204.localdomain sshd[60135]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 5365 ssh2 [preauth]
Feb 20 08:02:40 np0005625204.localdomain sshd[60135]: Disconnecting authenticating user root 185.246.128.171 port 5365: Too many authentication failures [preauth]
Feb 20 08:02:42 np0005625204.localdomain sudo[60137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:02:42 np0005625204.localdomain sudo[60137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:02:42 np0005625204.localdomain sudo[60137]: pam_unix(sudo:session): session closed for user root
Feb 20 08:02:42 np0005625204.localdomain sudo[60152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:02:42 np0005625204.localdomain sudo[60152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:02:42 np0005625204.localdomain sshd[60184]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:42 np0005625204.localdomain sudo[60152]: pam_unix(sudo:session): session closed for user root
Feb 20 08:02:43 np0005625204.localdomain sudo[60200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:02:43 np0005625204.localdomain sudo[60200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:02:43 np0005625204.localdomain sudo[60200]: pam_unix(sudo:session): session closed for user root
Feb 20 08:02:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:02:45 np0005625204.localdomain systemd[1]: tmp-crun.r3yGUB.mount: Deactivated successfully.
Feb 20 08:02:45 np0005625204.localdomain podman[60215]: 2026-02-20 08:02:45.151597698 +0000 UTC m=+0.089029909 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git)
Feb 20 08:02:45 np0005625204.localdomain podman[60215]: 2026-02-20 08:02:45.359168937 +0000 UTC m=+0.296601188 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:02:45 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:02:48 np0005625204.localdomain sshd[60184]: error: maximum authentication attempts exceeded for root from 185.246.128.171 port 35286 ssh2 [preauth]
Feb 20 08:02:48 np0005625204.localdomain sshd[60184]: Disconnecting authenticating user root 185.246.128.171 port 35286: Too many authentication failures [preauth]
Feb 20 08:02:49 np0005625204.localdomain sshd[60244]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:51 np0005625204.localdomain sshd[60244]: Disconnecting authenticating user root 185.246.128.171 port 16811: Change of username or service not allowed: (root,ssh-connection) -> (sftp_user,ssh-connection) [preauth]
Feb 20 08:02:53 np0005625204.localdomain sshd[60246]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:56 np0005625204.localdomain sshd[60246]: Invalid user sftp_user from 185.246.128.171 port 46356
Feb 20 08:02:56 np0005625204.localdomain sshd[60246]: Disconnecting invalid user sftp_user 185.246.128.171 port 46356: Change of username or service not allowed: (sftp_user,ssh-connection) -> (user01,ssh-connection) [preauth]
Feb 20 08:02:57 np0005625204.localdomain sshd[60248]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:02:59 np0005625204.localdomain sshd[60248]: Invalid user user01 from 185.246.128.171 port 8777
Feb 20 08:03:00 np0005625204.localdomain sshd[60248]: Disconnecting invalid user user01 185.246.128.171 port 8777: Change of username or service not allowed: (user01,ssh-connection) -> (nginx,ssh-connection) [preauth]
Feb 20 08:03:02 np0005625204.localdomain sshd[60250]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:04 np0005625204.localdomain sshd[60250]: Invalid user nginx from 185.246.128.171 port 43994
Feb 20 08:03:06 np0005625204.localdomain sshd[60250]: Disconnecting invalid user nginx 185.246.128.171 port 43994: Change of username or service not allowed: (nginx,ssh-connection) -> (adam,ssh-connection) [preauth]
Feb 20 08:03:07 np0005625204.localdomain sshd[60252]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:08 np0005625204.localdomain sshd[60252]: Invalid user adam from 185.246.128.171 port 19590
Feb 20 08:03:09 np0005625204.localdomain sshd[60252]: Disconnecting invalid user adam 185.246.128.171 port 19590: Change of username or service not allowed: (adam,ssh-connection) -> (smb,ssh-connection) [preauth]
Feb 20 08:03:10 np0005625204.localdomain sshd[60254]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:12 np0005625204.localdomain sshd[60256]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:12 np0005625204.localdomain sshd[60256]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:03:14 np0005625204.localdomain sshd[60254]: Invalid user smb from 185.246.128.171 port 39633
Feb 20 08:03:15 np0005625204.localdomain sshd[60254]: Disconnecting invalid user smb 185.246.128.171 port 39633: Change of username or service not allowed: (smb,ssh-connection) -> (landscape,ssh-connection) [preauth]
Feb 20 08:03:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:03:16 np0005625204.localdomain podman[60258]: 2026-02-20 08:03:16.146547229 +0000 UTC m=+0.084453905 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, architecture=x86_64)
Feb 20 08:03:16 np0005625204.localdomain podman[60258]: 2026-02-20 08:03:16.377362145 +0000 UTC m=+0.315268811 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:03:16 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:03:17 np0005625204.localdomain sshd[60288]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:20 np0005625204.localdomain sshd[60288]: Invalid user landscape from 185.246.128.171 port 21076
Feb 20 08:03:21 np0005625204.localdomain sshd[60288]: Disconnecting invalid user landscape 185.246.128.171 port 21076: Change of username or service not allowed: (landscape,ssh-connection) -> (usr,ssh-connection) [preauth]
Feb 20 08:03:23 np0005625204.localdomain sshd[60290]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:24 np0005625204.localdomain sshd[60290]: Invalid user usr from 185.246.128.171 port 62756
Feb 20 08:03:24 np0005625204.localdomain sshd[60290]: Disconnecting invalid user usr 185.246.128.171 port 62756: Change of username or service not allowed: (usr,ssh-connection) -> (antminermonitor,ssh-connection) [preauth]
Feb 20 08:03:26 np0005625204.localdomain sshd[60292]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:27 np0005625204.localdomain sshd[60292]: Invalid user antminermonitor from 185.246.128.171 port 20226
Feb 20 08:03:28 np0005625204.localdomain sshd[60292]: Disconnecting invalid user antminermonitor 185.246.128.171 port 20226: Change of username or service not allowed: (antminermonitor,ssh-connection) -> (deployer,ssh-connect [preauth]
Feb 20 08:03:30 np0005625204.localdomain sshd[60294]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:31 np0005625204.localdomain sshd[60294]: Invalid user deployer from 185.246.128.171 port 48370
Feb 20 08:03:31 np0005625204.localdomain sshd[60294]: Disconnecting invalid user deployer 185.246.128.171 port 48370: Change of username or service not allowed: (deployer,ssh-connection) -> (samp,ssh-connection) [preauth]
Feb 20 08:03:33 np0005625204.localdomain sshd[60296]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:35 np0005625204.localdomain sshd[60296]: Invalid user samp from 185.246.128.171 port 3144
Feb 20 08:03:36 np0005625204.localdomain sshd[60296]: Disconnecting invalid user samp 185.246.128.171 port 3144: Change of username or service not allowed: (samp,ssh-connection) -> (gg,ssh-connection) [preauth]
Feb 20 08:03:38 np0005625204.localdomain sshd[60298]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:39 np0005625204.localdomain sshd[60298]: Invalid user gg from 185.246.128.171 port 38321
Feb 20 08:03:40 np0005625204.localdomain sshd[60298]: Disconnecting invalid user gg 185.246.128.171 port 38321: Change of username or service not allowed: (gg,ssh-connection) -> (testftp,ssh-connection) [preauth]
Feb 20 08:03:41 np0005625204.localdomain sshd[60300]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:43 np0005625204.localdomain sudo[60302]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:03:43 np0005625204.localdomain sudo[60302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:03:43 np0005625204.localdomain sudo[60302]: pam_unix(sudo:session): session closed for user root
Feb 20 08:03:43 np0005625204.localdomain sudo[60317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:03:43 np0005625204.localdomain sudo[60317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:03:43 np0005625204.localdomain sshd[60300]: Invalid user testftp from 185.246.128.171 port 61785
Feb 20 08:03:44 np0005625204.localdomain sshd[60300]: Disconnecting invalid user testftp 185.246.128.171 port 61785: Change of username or service not allowed: (testftp,ssh-connection) -> (nutanix,ssh-connection) [preauth]
Feb 20 08:03:44 np0005625204.localdomain sudo[60317]: pam_unix(sudo:session): session closed for user root
Feb 20 08:03:44 np0005625204.localdomain sshd[60364]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:45 np0005625204.localdomain sudo[60366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:03:45 np0005625204.localdomain sudo[60366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:03:45 np0005625204.localdomain sudo[60366]: pam_unix(sudo:session): session closed for user root
Feb 20 08:03:47 np0005625204.localdomain sshd[60364]: Invalid user nutanix from 185.246.128.171 port 20764
Feb 20 08:03:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:03:47 np0005625204.localdomain podman[60381]: 2026-02-20 08:03:47.102051912 +0000 UTC m=+0.066450835 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:03:47 np0005625204.localdomain podman[60381]: 2026-02-20 08:03:47.287103392 +0000 UTC m=+0.251502335 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:03:47 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:03:47 np0005625204.localdomain sshd[60364]: Disconnecting invalid user nutanix 185.246.128.171 port 20764: Change of username or service not allowed: (nutanix,ssh-connection) -> (samuel,ssh-connection) [preauth]
Feb 20 08:03:49 np0005625204.localdomain sshd[60410]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:52 np0005625204.localdomain sshd[60410]: Invalid user samuel from 185.246.128.171 port 52099
Feb 20 08:03:53 np0005625204.localdomain sshd[60410]: Disconnecting invalid user samuel 185.246.128.171 port 52099: Change of username or service not allowed: (samuel,ssh-connection) -> (Lucas,ssh-connection) [preauth]
Feb 20 08:03:53 np0005625204.localdomain sshd[60412]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:55 np0005625204.localdomain sshd[60412]: Invalid user Lucas from 185.246.128.171 port 19752
Feb 20 08:03:56 np0005625204.localdomain sshd[60412]: Disconnecting invalid user Lucas 185.246.128.171 port 19752: Change of username or service not allowed: (Lucas,ssh-connection) -> (ai,ssh-connection) [preauth]
Feb 20 08:03:58 np0005625204.localdomain sshd[60414]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:03:58 np0005625204.localdomain sshd[60414]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:03:58 np0005625204.localdomain sshd[60416]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:04 np0005625204.localdomain sshd[60416]: Invalid user ai from 185.246.128.171 port 51621
Feb 20 08:04:05 np0005625204.localdomain sshd[60416]: Disconnecting invalid user ai 185.246.128.171 port 51621: Change of username or service not allowed: (ai,ssh-connection) -> (seki,ssh-connection) [preauth]
Feb 20 08:04:06 np0005625204.localdomain sshd[60418]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:09 np0005625204.localdomain sshd[60418]: Invalid user seki from 185.246.128.171 port 38342
Feb 20 08:04:09 np0005625204.localdomain sshd[60418]: Disconnecting invalid user seki 185.246.128.171 port 38342: Change of username or service not allowed: (seki,ssh-connection) -> (alarm,ssh-connection) [preauth]
Feb 20 08:04:10 np0005625204.localdomain sshd[60420]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:11 np0005625204.localdomain sshd[60420]: Invalid user alarm from 185.246.128.171 port 5043
Feb 20 08:04:11 np0005625204.localdomain sshd[60420]: Disconnecting invalid user alarm 185.246.128.171 port 5043: Change of username or service not allowed: (alarm,ssh-connection) -> (download,ssh-connection) [preauth]
Feb 20 08:04:13 np0005625204.localdomain sshd[60422]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:16 np0005625204.localdomain sshd[60422]: Invalid user download from 185.246.128.171 port 25294
Feb 20 08:04:17 np0005625204.localdomain sshd[60422]: Disconnecting invalid user download 185.246.128.171 port 25294: Change of username or service not allowed: (download,ssh-connection) -> (richard,ssh-connection) [preauth]
Feb 20 08:04:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:04:18 np0005625204.localdomain systemd[1]: tmp-crun.DQMc8y.mount: Deactivated successfully.
Feb 20 08:04:18 np0005625204.localdomain podman[60424]: 2026-02-20 08:04:18.147476457 +0000 UTC m=+0.083791555 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:04:18 np0005625204.localdomain podman[60424]: 2026-02-20 08:04:18.341932694 +0000 UTC m=+0.278247822 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:04:18 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:04:18 np0005625204.localdomain sshd[60452]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:20 np0005625204.localdomain sshd[60452]: Invalid user richard from 185.246.128.171 port 61454
Feb 20 08:04:23 np0005625204.localdomain sshd[60452]: Disconnecting invalid user richard 185.246.128.171 port 61454: Change of username or service not allowed: (richard,ssh-connection) -> (moth3r,ssh-connection) [preauth]
Feb 20 08:04:23 np0005625204.localdomain sshd[60454]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:25 np0005625204.localdomain sshd[60454]: Invalid user moth3r from 185.246.128.171 port 31660
Feb 20 08:04:26 np0005625204.localdomain sshd[60454]: Disconnecting invalid user moth3r 185.246.128.171 port 31660: Change of username or service not allowed: (moth3r,ssh-connection) -> (chef,ssh-connection) [preauth]
Feb 20 08:04:28 np0005625204.localdomain sshd[60456]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:30 np0005625204.localdomain sudo[60503]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmsyfkuwdyqvtsfcbhiuiuewhmfluyqv ; /usr/bin/python3
Feb 20 08:04:30 np0005625204.localdomain sudo[60503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:30 np0005625204.localdomain python3[60505]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:30 np0005625204.localdomain sudo[60503]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:30 np0005625204.localdomain sudo[60548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sallsdaykyeyacqxkgoqkkmjepvqksni ; /usr/bin/python3
Feb 20 08:04:30 np0005625204.localdomain sudo[60548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:31 np0005625204.localdomain python3[60550]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574670.3824532-99243-6894457537755/source _original_basename=tmp1yp4vlhb follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:31 np0005625204.localdomain sudo[60548]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:31 np0005625204.localdomain sshd[60565]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:31 np0005625204.localdomain sshd[60456]: Invalid user chef from 185.246.128.171 port 64554
Feb 20 08:04:32 np0005625204.localdomain sudo[60579]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkjvvjqedgvmgpmhsggsppyshgkgmxah ; /usr/bin/python3
Feb 20 08:04:32 np0005625204.localdomain sudo[60579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:32 np0005625204.localdomain sshd[60456]: Disconnecting invalid user chef 185.246.128.171 port 64554: Change of username or service not allowed: (chef,ssh-connection) -> (ansible,ssh-connection) [preauth]
Feb 20 08:04:32 np0005625204.localdomain python3[60581]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:04:32 np0005625204.localdomain sudo[60579]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:32 np0005625204.localdomain sudo[60629]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpzqmtyfsujqydbqhivpflqvdwnnefce ; /usr/bin/python3
Feb 20 08:04:32 np0005625204.localdomain sudo[60629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:33 np0005625204.localdomain sudo[60629]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:33 np0005625204.localdomain sudo[60648]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqqulmtpqjsyjnnanyymzrjkqdarvnww ; /usr/bin/python3
Feb 20 08:04:33 np0005625204.localdomain sshd[60650]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:33 np0005625204.localdomain sudo[60648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:33 np0005625204.localdomain sudo[60648]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:33 np0005625204.localdomain sudo[60753]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwqjiiuqhrmijkugyzfnmjqsdklnyuuw ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574673.6000972-99471-144000516931635/async_wrapper.py 303677600785 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574673.6000972-99471-144000516931635/AnsiballZ_command.py _
Feb 20 08:04:33 np0005625204.localdomain sudo[60753]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 08:04:34 np0005625204.localdomain ansible-async_wrapper.py[60755]: Invoked with 303677600785 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574673.6000972-99471-144000516931635/AnsiballZ_command.py _
Feb 20 08:04:34 np0005625204.localdomain ansible-async_wrapper.py[60758]: Starting module and watcher
Feb 20 08:04:34 np0005625204.localdomain ansible-async_wrapper.py[60758]: Start watching 60759 (3600)
Feb 20 08:04:34 np0005625204.localdomain ansible-async_wrapper.py[60759]: Start module (60759)
Feb 20 08:04:34 np0005625204.localdomain ansible-async_wrapper.py[60755]: Return async_wrapper task started.
Feb 20 08:04:34 np0005625204.localdomain sudo[60753]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:34 np0005625204.localdomain sudo[60777]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkjfsmloffrxbgxrigpcrbnskzajwkyy ; /usr/bin/python3
Feb 20 08:04:34 np0005625204.localdomain sudo[60777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:34 np0005625204.localdomain python3[60779]: ansible-ansible.legacy.async_status Invoked with jid=303677600785.60755 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:04:34 np0005625204.localdomain sudo[60777]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:34 np0005625204.localdomain sshd[60565]: Invalid user omar from 115.190.172.63 port 36922
Feb 20 08:04:35 np0005625204.localdomain sshd[60565]: Received disconnect from 115.190.172.63 port 36922:11: Bye Bye [preauth]
Feb 20 08:04:35 np0005625204.localdomain sshd[60565]: Disconnected from invalid user omar 115.190.172.63 port 36922 [preauth]
Feb 20 08:04:36 np0005625204.localdomain sshd[60820]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:37 np0005625204.localdomain sshd[60650]: Invalid user ansible from 185.246.128.171 port 32451
Feb 20 08:04:37 np0005625204.localdomain sshd[60820]: Invalid user cod2server from 101.36.109.176 port 51884
Feb 20 08:04:37 np0005625204.localdomain sshd[60820]: Received disconnect from 101.36.109.176 port 51884:11: Bye Bye [preauth]
Feb 20 08:04:37 np0005625204.localdomain sshd[60820]: Disconnected from invalid user cod2server 101.36.109.176 port 51884 [preauth]
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    (file: /etc/puppet/hiera.yaml)
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Warning: Undefined variable '::deploy_config_name';
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    (file & line not available)
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    (file & line not available)
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 20 08:04:37 np0005625204.localdomain sshd[60892]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.12 seconds
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Notice: Applied catalog in 0.03 seconds
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Application:
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    Initial environment: production
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    Converged environment: production
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:          Run mode: user
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Changes:
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Events:
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Resources:
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:             Total: 10
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Time:
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:          Schedule: 0.00
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:              File: 0.00
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:              Exec: 0.00
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:            Augeas: 0.01
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    Transaction evaluation: 0.03
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    Catalog application: 0.03
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:    Config retrieval: 0.15
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:          Last run: 1771574677
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:        Filebucket: 0.00
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:             Total: 0.04
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]: Version:
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:            Config: 1771574677
Feb 20 08:04:37 np0005625204.localdomain puppet-user[60763]:            Puppet: 7.10.0
Feb 20 08:04:37 np0005625204.localdomain ansible-async_wrapper.py[60759]: Module complete (60759)
Feb 20 08:04:38 np0005625204.localdomain sshd[60892]: Received disconnect from 178.217.173.50 port 49396:11: Bye Bye [preauth]
Feb 20 08:04:38 np0005625204.localdomain sshd[60892]: Disconnected from authenticating user root 178.217.173.50 port 49396 [preauth]
Feb 20 08:04:39 np0005625204.localdomain ansible-async_wrapper.py[60758]: Done in kid B.
Feb 20 08:04:39 np0005625204.localdomain sshd[60650]: Disconnecting invalid user ansible 185.246.128.171 port 32451: Change of username or service not allowed: (ansible,ssh-connection) -> (training,ssh-connection) [preauth]
Feb 20 08:04:41 np0005625204.localdomain sshd[60895]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:43 np0005625204.localdomain sshd[60897]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:43 np0005625204.localdomain sshd[60897]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:04:44 np0005625204.localdomain sshd[60895]: Invalid user training from 185.246.128.171 port 23193
Feb 20 08:04:44 np0005625204.localdomain sshd[60895]: Disconnecting invalid user training 185.246.128.171 port 23193: Change of username or service not allowed: (training,ssh-connection) -> (telecom,ssh-connection) [preauth]
Feb 20 08:04:44 np0005625204.localdomain sudo[60912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrjbkbeyxarwehinpclwrtomjdemlfcv ; /usr/bin/python3
Feb 20 08:04:44 np0005625204.localdomain sudo[60912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:44 np0005625204.localdomain python3[60914]: ansible-ansible.legacy.async_status Invoked with jid=303677600785.60755 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:04:44 np0005625204.localdomain sudo[60912]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:45 np0005625204.localdomain sudo[60915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:04:45 np0005625204.localdomain sudo[60915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:04:45 np0005625204.localdomain sudo[60915]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:45 np0005625204.localdomain sudo[60930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:04:45 np0005625204.localdomain sudo[60930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:04:45 np0005625204.localdomain sudo[60958]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdmfpspkoxwkfqrfblphtaavytbplhbk ; /usr/bin/python3
Feb 20 08:04:45 np0005625204.localdomain sudo[60958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:45 np0005625204.localdomain sshd[60975]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:45 np0005625204.localdomain python3[60967]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 08:04:45 np0005625204.localdomain sudo[60958]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:45 np0005625204.localdomain sshd[60982]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:45 np0005625204.localdomain sudo[61004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbcbbqddpteiqszojujihelbbrkhrpmy ; /usr/bin/python3
Feb 20 08:04:45 np0005625204.localdomain sudo[61004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:45 np0005625204.localdomain sudo[60930]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:46 np0005625204.localdomain python3[61010]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:04:46 np0005625204.localdomain sudo[61004]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:46 np0005625204.localdomain sudo[61058]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnnjkbcfrazpzvxmnxjsrdrbuwejflsy ; /usr/bin/python3
Feb 20 08:04:46 np0005625204.localdomain sudo[61058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:46 np0005625204.localdomain python3[61060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:46 np0005625204.localdomain sudo[61061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:04:46 np0005625204.localdomain sudo[61061]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:04:46 np0005625204.localdomain sudo[61058]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:46 np0005625204.localdomain sudo[61061]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:46 np0005625204.localdomain sudo[61091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tueogzwapiqtisczasjfuoxebmwjmaep ; /usr/bin/python3
Feb 20 08:04:46 np0005625204.localdomain sudo[61091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:46 np0005625204.localdomain python3[61093]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpu3k5w8dk recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 08:04:46 np0005625204.localdomain sudo[61091]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:47 np0005625204.localdomain sudo[61122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzxipudrseorjqynczsmixovyaeajnpa ; /usr/bin/python3
Feb 20 08:04:47 np0005625204.localdomain sudo[61122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:47 np0005625204.localdomain python3[61124]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:47 np0005625204.localdomain sudo[61122]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:47 np0005625204.localdomain sudo[61138]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swzefjyqrjsmbbpkdcemtdwshkumfyzh ; /usr/bin/python3
Feb 20 08:04:47 np0005625204.localdomain sudo[61138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:47 np0005625204.localdomain sudo[61138]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:48 np0005625204.localdomain sshd[60975]: Received disconnect from 103.157.25.4 port 53732:11: Bye Bye [preauth]
Feb 20 08:04:48 np0005625204.localdomain sshd[60975]: Disconnected from authenticating user root 103.157.25.4 port 53732 [preauth]
Feb 20 08:04:48 np0005625204.localdomain sshd[60982]: Invalid user telecom from 185.246.128.171 port 52577
Feb 20 08:04:48 np0005625204.localdomain sudo[61225]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkhvybpfpfvqdxdnqyalbjjtuwfjsfzj ; /usr/bin/python3
Feb 20 08:04:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:04:48 np0005625204.localdomain sudo[61225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:48 np0005625204.localdomain sshd[60982]: Disconnecting invalid user telecom 185.246.128.171 port 52577: Change of username or service not allowed: (telecom,ssh-connection) -> (x,ssh-connection) [preauth]
Feb 20 08:04:48 np0005625204.localdomain podman[61227]: 2026-02-20 08:04:48.592411763 +0000 UTC m=+0.091101957 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:04:48 np0005625204.localdomain python3[61228]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 20 08:04:48 np0005625204.localdomain sudo[61225]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:48 np0005625204.localdomain podman[61227]: 2026-02-20 08:04:48.84201349 +0000 UTC m=+0.340703664 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:04:48 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:04:49 np0005625204.localdomain sudo[61272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycybspzlhrubxngacqgiunbkybfvsswq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:49 np0005625204.localdomain sudo[61272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:49 np0005625204.localdomain python3[61274]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:49 np0005625204.localdomain sudo[61272]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:49 np0005625204.localdomain sudo[61288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edzptncnzyhiwoztjjqnkzeidlrsbtbk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:49 np0005625204.localdomain sudo[61288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:50 np0005625204.localdomain sudo[61288]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:50 np0005625204.localdomain sudo[61305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hgicpasjkpafhmsosddhpfinfzmimnov ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:50 np0005625204.localdomain sudo[61305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:50 np0005625204.localdomain python3[61307]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:04:50 np0005625204.localdomain sudo[61305]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:51 np0005625204.localdomain sudo[61355]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzhxjpcqaumvfcidlrqfedvtoaianzmq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:51 np0005625204.localdomain sudo[61355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:51 np0005625204.localdomain sshd[61358]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:51 np0005625204.localdomain python3[61357]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:51 np0005625204.localdomain sudo[61355]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:51 np0005625204.localdomain sudo[61374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvkdkpzjujwvlqixhkezeauyzafkqgeu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:51 np0005625204.localdomain sudo[61374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:51 np0005625204.localdomain python3[61376]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:51 np0005625204.localdomain sudo[61374]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:52 np0005625204.localdomain sudo[61436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyehfllzzvwngslwjbrniinxdedfcxop ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:52 np0005625204.localdomain sudo[61436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:52 np0005625204.localdomain python3[61438]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:52 np0005625204.localdomain sudo[61436]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:52 np0005625204.localdomain sudo[61454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkdorifcqeuglxlefeeyxkrnerrhkxpy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:52 np0005625204.localdomain sudo[61454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:52 np0005625204.localdomain python3[61456]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:52 np0005625204.localdomain sudo[61454]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:52 np0005625204.localdomain sudo[61516]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yboyefpwehlcfljjlpimxzytawnolyat ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:52 np0005625204.localdomain sudo[61516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:52 np0005625204.localdomain python3[61518]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:52 np0005625204.localdomain sudo[61516]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:53 np0005625204.localdomain sudo[61535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qntwnpnzgavwrnimackpypvqlwtvndki ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:53 np0005625204.localdomain sudo[61535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:53 np0005625204.localdomain python3[61537]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:53 np0005625204.localdomain sudo[61535]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:53 np0005625204.localdomain sudo[61597]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sybjkyynzyvrugpxkgzzaihuqgjrkjhu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:53 np0005625204.localdomain sudo[61597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:53 np0005625204.localdomain python3[61599]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:53 np0005625204.localdomain sudo[61597]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:53 np0005625204.localdomain sudo[61615]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcliwqugtticggjhshvvbricrkwyslqc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:53 np0005625204.localdomain sudo[61615]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:54 np0005625204.localdomain python3[61617]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:54 np0005625204.localdomain sudo[61615]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:54 np0005625204.localdomain sshd[61358]: Invalid user x from 185.246.128.171 port 26101
Feb 20 08:04:54 np0005625204.localdomain sudo[61645]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxxmmkawlwnvbirvogakrvcxeasheqfl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:54 np0005625204.localdomain sudo[61645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:54 np0005625204.localdomain python3[61647]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:04:54 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:04:54 np0005625204.localdomain sshd[61358]: Disconnecting invalid user x 185.246.128.171 port 26101: Change of username or service not allowed: (x,ssh-connection) -> (kernel,ssh-connection) [preauth]
Feb 20 08:04:54 np0005625204.localdomain systemd-rc-local-generator[61671]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:04:54 np0005625204.localdomain systemd-sysv-generator[61674]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:04:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:04:55 np0005625204.localdomain sudo[61645]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:55 np0005625204.localdomain sshd[61724]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:55 np0005625204.localdomain sudo[61731]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esvgarnghftxhllflpslixncsckfnslr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:55 np0005625204.localdomain sudo[61731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:55 np0005625204.localdomain python3[61733]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:55 np0005625204.localdomain sudo[61731]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:55 np0005625204.localdomain sudo[61749]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obesfizggenjgplgslcmgrhzmavxeqdb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:55 np0005625204.localdomain sudo[61749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:55 np0005625204.localdomain python3[61751]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:55 np0005625204.localdomain sudo[61749]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:56 np0005625204.localdomain sudo[61812]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwecmwhflrslagzqhkpdzfwclqupmfhj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:56 np0005625204.localdomain sudo[61812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:56 np0005625204.localdomain python3[61814]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:04:56 np0005625204.localdomain sudo[61812]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:56 np0005625204.localdomain sudo[61830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idgnyuwmvbubtwmsysferkcnxkowbosw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:56 np0005625204.localdomain sudo[61830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:56 np0005625204.localdomain sshd[61724]: Invalid user kernel from 185.246.128.171 port 52661
Feb 20 08:04:56 np0005625204.localdomain python3[61832]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:04:56 np0005625204.localdomain sudo[61830]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:56 np0005625204.localdomain sshd[61724]: Disconnecting invalid user kernel 185.246.128.171 port 52661: Change of username or service not allowed: (kernel,ssh-connection) -> (delegate,ssh-connection) [preauth]
Feb 20 08:04:56 np0005625204.localdomain sudo[61860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbclibygyxqksddiidefmfrebtdwgkca ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:56 np0005625204.localdomain sudo[61860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:57 np0005625204.localdomain python3[61862]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:04:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:04:57 np0005625204.localdomain systemd-rc-local-generator[61888]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:04:57 np0005625204.localdomain systemd-sysv-generator[61892]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:04:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:04:57 np0005625204.localdomain sshd[61899]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:04:57 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 08:04:57 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 08:04:57 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 08:04:57 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 08:04:57 np0005625204.localdomain sudo[61860]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:57 np0005625204.localdomain sudo[61918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdsfsvpeddtjyptsepmocfvimaqugptk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:57 np0005625204.localdomain sudo[61918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:58 np0005625204.localdomain python3[61920]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 20 08:04:58 np0005625204.localdomain sudo[61918]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:58 np0005625204.localdomain sudo[61935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cahotsxwllcigszhbjysvayhvfqugjvh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:04:58 np0005625204.localdomain sudo[61935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:04:58 np0005625204.localdomain sudo[61935]: pam_unix(sudo:session): session closed for user root
Feb 20 08:04:59 np0005625204.localdomain sshd[61899]: Invalid user delegate from 185.246.128.171 port 2770
Feb 20 08:05:00 np0005625204.localdomain sudo[61977]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezwqyhilngjotsckqtdiiuzhahjexvoe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:00 np0005625204.localdomain sudo[61977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:00 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 20 08:05:00 np0005625204.localdomain sshd[61899]: Disconnecting invalid user delegate 185.246.128.171 port 2770: Change of username or service not allowed: (delegate,ssh-connection) -> (sshuser,ssh-connection) [preauth]
Feb 20 08:05:00 np0005625204.localdomain podman[62141]: 2026-02-20 08:05:00.565794354 +0000 UTC m=+0.082875177 container create 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.)
Feb 20 08:05:00 np0005625204.localdomain podman[62147]: 2026-02-20 08:05:00.596490689 +0000 UTC m=+0.100038510 container create d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, architecture=x86_64)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope.
Feb 20 08:05:00 np0005625204.localdomain podman[62150]: 2026-02-20 08:05:00.620399798 +0000 UTC m=+0.102289719 container create ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, container_name=rsyslog, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:05:00 np0005625204.localdomain podman[62141]: 2026-02-20 08:05:00.521783703 +0000 UTC m=+0.038864626 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 20 08:05:00 np0005625204.localdomain podman[62147]: 2026-02-20 08:05:00.538864033 +0000 UTC m=+0.042411874 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 20 08:05:00 np0005625204.localdomain podman[62182]: 2026-02-20 08:05:00.645957097 +0000 UTC m=+0.107395884 container create e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6.scope.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope.
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7/merged/scripts supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain podman[62150]: 2026-02-20 08:05:00.56206894 +0000 UTC m=+0.043958881 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 20 08:05:00 np0005625204.localdomain podman[62182]: 2026-02-20 08:05:00.569203357 +0000 UTC m=+0.030642154 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19.scope.
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b52b75a7380249fd6beb40dca6e23a5c2c2b3650de6523e005db6f52b5fe90d0/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain podman[62150]: 2026-02-20 08:05:00.697205179 +0000 UTC m=+0.179095080 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:05:00 np0005625204.localdomain podman[62182]: 2026-02-20 08:05:00.701330595 +0000 UTC m=+0.162769382 container init e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, container_name=nova_virtlogd_wrapper, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com)
Feb 20 08:05:00 np0005625204.localdomain podman[62141]: 2026-02-20 08:05:00.701964074 +0000 UTC m=+0.219044907 container init 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=collectd, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:05:00 np0005625204.localdomain podman[62150]: 2026-02-20 08:05:00.70609546 +0000 UTC m=+0.187985351 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 
17.1_20260112.1, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64)
Feb 20 08:05:00 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=eb8c5e608f55bc52c95871f92a543185 --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 20 08:05:00 np0005625204.localdomain podman[62182]: 2026-02-20 08:05:00.710869545 +0000 UTC m=+0.172308332 container start e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, container_name=nova_virtlogd_wrapper, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git)
Feb 20 08:05:00 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:00 np0005625204.localdomain podman[62168]: 2026-02-20 08:05:00.720478358 +0000 UTC m=+0.198437788 container create a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, distribution-scope=public)
Feb 20 08:05:00 np0005625204.localdomain sudo[62233]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:00 np0005625204.localdomain sudo[62237]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:00 np0005625204.localdomain sudo[62233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:05:00 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:00 np0005625204.localdomain sudo[62238]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:00 np0005625204.localdomain podman[62141]: 2026-02-20 08:05:00.742287863 +0000 UTC m=+0.259368686 container start 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, version=17.1.13, config_id=tripleo_step3, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=collectd, managed_by=tripleo_ansible)
Feb 20 08:05:00 np0005625204.localdomain podman[62147]: 2026-02-20 08:05:00.747275315 +0000 UTC m=+0.250823156 container init d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_init_log, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 20 08:05:00 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d31718fcd17fdeee6489534105191c7a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 20 08:05:00 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0.scope.
Feb 20 08:05:00 np0005625204.localdomain podman[62147]: 2026-02-20 08:05:00.763023155 +0000 UTC m=+0.266570966 container start d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64)
Feb 20 08:05:00 np0005625204.localdomain podman[62168]: 2026-02-20 08:05:00.666446971 +0000 UTC m=+0.144406411 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: libpod-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6.scope: Deactivated successfully.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 20 08:05:00 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Feb 20 08:05:00 np0005625204.localdomain sudo[62233]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:00 np0005625204.localdomain podman[62168]: 2026-02-20 08:05:00.80387546 +0000 UTC m=+0.281834900 container init a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13)
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:00 np0005625204.localdomain podman[62168]: 2026-02-20 08:05:00.8219208 +0000 UTC m=+0.299880230 container start a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, container_name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:05:00 np0005625204.localdomain podman[62168]: 2026-02-20 08:05:00.822149067 +0000 UTC m=+0.300108517 container attach a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, 
io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=nova_statedir_owner, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: libpod-a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0.scope: Deactivated successfully.
Feb 20 08:05:00 np0005625204.localdomain podman[62168]: 2026-02-20 08:05:00.871201642 +0000 UTC m=+0.349161122 container died a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, distribution-scope=public, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., version=17.1.13, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 20 08:05:00 np0005625204.localdomain podman[62254]: 2026-02-20 08:05:00.867065786 +0000 UTC m=+0.118761111 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 
17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team)
Feb 20 08:05:00 np0005625204.localdomain podman[62307]: 2026-02-20 08:05:00.904734274 +0000 UTC m=+0.094471300 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5)
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Queued start job for default target Main User Target.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Created slice User Application Slice.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Reached target Paths.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Reached target Timers.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Starting D-Bus User Message Bus Socket...
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Starting Create User's Volatile Files and Directories...
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Listening on D-Bus User Message Bus Socket.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Finished Create User's Volatile Files and Directories.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Reached target Sockets.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Reached target Basic System.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Reached target Main User Target.
Feb 20 08:05:00 np0005625204.localdomain systemd[62274]: Startup finished in 114ms.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started User Manager for UID 0.
Feb 20 08:05:00 np0005625204.localdomain podman[62307]: 2026-02-20 08:05:00.932784519 +0000 UTC m=+0.122521525 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, architecture=x86_64, vcs-type=git, config_id=tripleo_step3, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1)
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started Session c1 of User root.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: Started Session c2 of User root.
Feb 20 08:05:00 np0005625204.localdomain systemd[1]: libpod-conmon-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:00 np0005625204.localdomain sudo[62237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:00 np0005625204.localdomain sudo[62238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:00 np0005625204.localdomain podman[62281]: 2026-02-20 08:05:00.953408408 +0000 UTC m=+0.153190920 container died d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, url=https://www.redhat.com, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:05:00 np0005625204.localdomain podman[62254]: 2026-02-20 08:05:00.953950704 +0000 UTC m=+0.205646019 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, release=1766032510, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true)
Feb 20 08:05:01 np0005625204.localdomain podman[62271]: 2026-02-20 08:05:01.003466923 +0000 UTC m=+0.215676545 container cleanup d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step3, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_init_log, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13)
Feb 20 08:05:01 np0005625204.localdomain sudo[62237]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: libpod-conmon-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6.scope: Deactivated successfully.
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Feb 20 08:05:01 np0005625204.localdomain podman[62254]: unhealthy
Feb 20 08:05:01 np0005625204.localdomain sudo[62238]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Failed with result 'exit-code'.
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Feb 20 08:05:01 np0005625204.localdomain podman[62356]: 2026-02-20 08:05:01.044320258 +0000 UTC m=+0.162503604 container cleanup a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_statedir_owner, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: libpod-conmon-a0b3dd974f8c2cd1eb9fab6649ce846f2940c70de58af138c245431326cc3ab0.scope: Deactivated successfully.
Feb 20 08:05:01 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Feb 20 08:05:01 np0005625204.localdomain podman[62523]: 2026-02-20 08:05:01.425159175 +0000 UTC m=+0.090372735 container create 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: Started libpod-conmon-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def.scope.
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain podman[62523]: 2026-02-20 08:05:01.388792077 +0000 UTC m=+0.054005657 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:01 np0005625204.localdomain podman[62523]: 2026-02-20 08:05:01.494331504 +0000 UTC m=+0.159545074 container init 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-libvirt-container)
Feb 20 08:05:01 np0005625204.localdomain podman[62523]: 2026-02-20 08:05:01.504031039 +0000 UTC m=+0.169244569 container start 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 20 08:05:01 np0005625204.localdomain podman[62555]: 2026-02-20 08:05:01.509145345 +0000 UTC m=+0.077572515 container create c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt)
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: Started libpod-conmon-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2.scope.
Feb 20 08:05:01 np0005625204.localdomain podman[62555]: 2026-02-20 08:05:01.470330202 +0000 UTC m=+0.038757432 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-b52b75a7380249fd6beb40dca6e23a5c2c2b3650de6523e005db6f52b5fe90d0-merged.mount: Deactivated successfully.
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d8929bf8ef66124f8e88e178825d706927d90c4bbc7b4556919e66c09d2d9dc6-userdata-shm.mount: Deactivated successfully.
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:01 np0005625204.localdomain podman[62555]: 2026-02-20 08:05:01.630442462 +0000 UTC m=+0.198869622 container init c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=nova_virtsecretd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 20 08:05:01 np0005625204.localdomain podman[62555]: 2026-02-20 08:05:01.636157996 +0000 UTC m=+0.204585156 container start c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtsecretd, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, tcib_managed=true)
Feb 20 08:05:01 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:01 np0005625204.localdomain sudo[62582]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:01 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: Started Session c3 of User root.
Feb 20 08:05:01 np0005625204.localdomain sudo[62582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:01 np0005625204.localdomain sudo[62582]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:01 np0005625204.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Feb 20 08:05:02 np0005625204.localdomain podman[62703]: 2026-02-20 08:05:02.029302409 +0000 UTC m=+0.079302779 container create b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtnodedevd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5)
Feb 20 08:05:02 np0005625204.localdomain podman[62711]: 2026-02-20 08:05:02.062720607 +0000 UTC m=+0.100382461 container create 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Feb 20 08:05:02 np0005625204.localdomain podman[62703]: 2026-02-20 08:05:01.985775901 +0000 UTC m=+0.035776341 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started libpod-conmon-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope.
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started libpod-conmon-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope.
Feb 20 08:05:02 np0005625204.localdomain podman[62711]: 2026-02-20 08:05:01.995328163 +0000 UTC m=+0.032990037 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:05:02 np0005625204.localdomain podman[62711]: 2026-02-20 08:05:02.14453508 +0000 UTC m=+0.182196944 container init 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, release=1766032510, vcs-type=git, container_name=iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 20 08:05:02 np0005625204.localdomain sudo[62740]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:02 np0005625204.localdomain podman[62703]: 2026-02-20 08:05:02.173787091 +0000 UTC m=+0.223787481 container init b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, container_name=nova_virtnodedevd, release=1766032510, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc.)
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:05:02 np0005625204.localdomain podman[62703]: 2026-02-20 08:05:02.184093656 +0000 UTC m=+0.234094086 container start b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc.)
Feb 20 08:05:02 np0005625204.localdomain podman[62711]: 2026-02-20 08:05:02.186496109 +0000 UTC m=+0.224157983 container start 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:05:02 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:02 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:02 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=24eefedeb2e4ab8bab62979b617bbba7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started Session c4 of User root.
Feb 20 08:05:02 np0005625204.localdomain sudo[62740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:02 np0005625204.localdomain sudo[62744]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:02 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started Session c5 of User root.
Feb 20 08:05:02 np0005625204.localdomain sudo[62744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:02 np0005625204.localdomain podman[62742]: 2026-02-20 08:05:02.295209743 +0000 UTC m=+0.096798882 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 20 08:05:02 np0005625204.localdomain sudo[62740]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Feb 20 08:05:02 np0005625204.localdomain podman[62742]: 2026-02-20 08:05:02.30990075 +0000 UTC m=+0.111489859 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true)
Feb 20 08:05:02 np0005625204.localdomain podman[62742]: unhealthy
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed with result 'exit-code'.
Feb 20 08:05:02 np0005625204.localdomain kernel: Loading iSCSI transport class v2.0-870.
Feb 20 08:05:02 np0005625204.localdomain sudo[62744]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Feb 20 08:05:02 np0005625204.localdomain sshd[62886]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:02 np0005625204.localdomain podman[62880]: 2026-02-20 08:05:02.737895055 +0000 UTC m=+0.095303526 container create 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible)
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started libpod-conmon-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108.scope.
Feb 20 08:05:02 np0005625204.localdomain podman[62880]: 2026-02-20 08:05:02.694391318 +0000 UTC m=+0.051799819 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:02 np0005625204.localdomain podman[62880]: 2026-02-20 08:05:02.817428288 +0000 UTC m=+0.174836759 container init 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, container_name=nova_virtstoraged, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com)
Feb 20 08:05:02 np0005625204.localdomain podman[62880]: 2026-02-20 08:05:02.827895508 +0000 UTC m=+0.185303989 container start 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., container_name=nova_virtstoraged, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:05:02 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:02 np0005625204.localdomain sudo[62902]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:02 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: Started Session c6 of User root.
Feb 20 08:05:02 np0005625204.localdomain sudo[62902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:02 np0005625204.localdomain sudo[62902]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:02 np0005625204.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Feb 20 08:05:03 np0005625204.localdomain podman[62987]: 2026-02-20 08:05:03.288851026 +0000 UTC m=+0.089149718 container create 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, architecture=x86_64, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, container_name=nova_virtqemud, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: Started libpod-conmon-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope.
Feb 20 08:05:03 np0005625204.localdomain podman[62987]: 2026-02-20 08:05:03.246588038 +0000 UTC m=+0.046886760 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain podman[62987]: 2026-02-20 08:05:03.362724538 +0000 UTC m=+0.163023240 container init 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, container_name=nova_virtqemud, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': 
['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=)
Feb 20 08:05:03 np0005625204.localdomain podman[62987]: 2026-02-20 08:05:03.372516636 +0000 UTC m=+0.172815328 container start 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_virtqemud, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:05:03 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:03 np0005625204.localdomain sudo[63006]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:03 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: Started Session c7 of User root.
Feb 20 08:05:03 np0005625204.localdomain sudo[63006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:03 np0005625204.localdomain sudo[63006]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Feb 20 08:05:03 np0005625204.localdomain podman[63090]: 2026-02-20 08:05:03.803925834 +0000 UTC m=+0.073677286 container create e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtproxyd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: Started libpod-conmon-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34.scope.
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:03 np0005625204.localdomain podman[63090]: 2026-02-20 08:05:03.763831173 +0000 UTC m=+0.033582615 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:03 np0005625204.localdomain podman[63090]: 2026-02-20 08:05:03.875374043 +0000 UTC m=+0.145125485 container init e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, config_id=tripleo_step3, container_name=nova_virtproxyd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:05:03 np0005625204.localdomain podman[63090]: 2026-02-20 08:05:03.884195581 +0000 UTC m=+0.153947013 container start e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, tcib_managed=true, config_id=tripleo_step3, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 20 08:05:03 np0005625204.localdomain python3[61979]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:05:03 np0005625204.localdomain sudo[63109]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:03 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:05:03 np0005625204.localdomain systemd[1]: Started Session c8 of User root.
Feb 20 08:05:03 np0005625204.localdomain sudo[63109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:04 np0005625204.localdomain sudo[63109]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:04 np0005625204.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Feb 20 08:05:04 np0005625204.localdomain sudo[61977]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:04 np0005625204.localdomain sudo[63170]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obrebhxahrjgugrirebpkkkrwtlrpslv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:04 np0005625204.localdomain sudo[63170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:04 np0005625204.localdomain python3[63172]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:04 np0005625204.localdomain sudo[63170]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:04 np0005625204.localdomain sudo[63186]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyizmkxzxjzbinerhnzftsoxzdzudzgm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:04 np0005625204.localdomain sudo[63186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:04 np0005625204.localdomain python3[63188]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:04 np0005625204.localdomain sudo[63186]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:04 np0005625204.localdomain sudo[63202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxqxoymwessnorfggcxelgoxveyqurpp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:04 np0005625204.localdomain sudo[63202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:05 np0005625204.localdomain python3[63204]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:05 np0005625204.localdomain sudo[63202]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:05 np0005625204.localdomain sudo[63218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcjxjkrvnbfxxzcihrevboudsamvswqt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:05 np0005625204.localdomain sudo[63218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:05 np0005625204.localdomain sshd[62886]: Invalid user sshuser from 185.246.128.171 port 39364
Feb 20 08:05:05 np0005625204.localdomain python3[63220]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:05 np0005625204.localdomain sudo[63218]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:05 np0005625204.localdomain sudo[63234]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqsbwzzclzmpjxzjgxnpekcibkplbqze ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:05 np0005625204.localdomain sudo[63234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:05 np0005625204.localdomain python3[63236]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:05 np0005625204.localdomain sudo[63234]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:05 np0005625204.localdomain sudo[63250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgymzaokvmxukhevnoczklmcaaumxogx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:05 np0005625204.localdomain sudo[63250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:05 np0005625204.localdomain python3[63252]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:05 np0005625204.localdomain sudo[63250]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:05 np0005625204.localdomain sshd[62886]: Disconnecting invalid user sshuser 185.246.128.171 port 39364: Change of username or service not allowed: (sshuser,ssh-connection) -> (proxyuser,ssh-connection) [preauth]
Feb 20 08:05:05 np0005625204.localdomain sudo[63266]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckrknkzdglungffaryzpdepxkckxqqep ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:05 np0005625204.localdomain sudo[63266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:06 np0005625204.localdomain python3[63268]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:06 np0005625204.localdomain sudo[63266]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:06 np0005625204.localdomain sudo[63282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqjxuhceqsodneprdqwotnzdvkolsdby ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:06 np0005625204.localdomain sudo[63282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:06 np0005625204.localdomain python3[63284]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:06 np0005625204.localdomain sudo[63282]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:06 np0005625204.localdomain sudo[63298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnyznsxhdrfmfxuvltscsinuhpqxvxiw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:06 np0005625204.localdomain sudo[63298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:06 np0005625204.localdomain python3[63300]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:06 np0005625204.localdomain sudo[63298]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:06 np0005625204.localdomain sudo[63314]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otnrggtcjhjkutntsbdeogooeqtxydqz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:06 np0005625204.localdomain sudo[63314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:06 np0005625204.localdomain python3[63316]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:06 np0005625204.localdomain sudo[63314]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:06 np0005625204.localdomain sudo[63330]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sflonygzcgnexcvvahcxzutbpemqmtli ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:06 np0005625204.localdomain sudo[63330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:07 np0005625204.localdomain python3[63332]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:07 np0005625204.localdomain sudo[63330]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:07 np0005625204.localdomain sudo[63346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvoegeyyjltgyblpajgolsyyloxwpozy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:07 np0005625204.localdomain sudo[63346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:07 np0005625204.localdomain python3[63348]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:07 np0005625204.localdomain sudo[63346]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:07 np0005625204.localdomain sudo[63362]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksltkzhyoqdcdnpcxjjwxoizcgdyohud ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:07 np0005625204.localdomain sudo[63362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:07 np0005625204.localdomain python3[63364]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:07 np0005625204.localdomain sudo[63362]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:07 np0005625204.localdomain sudo[63378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdwvvzoqzjtdccwsmwtwqfyxyckxiilb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:07 np0005625204.localdomain sudo[63378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:07 np0005625204.localdomain python3[63380]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:07 np0005625204.localdomain sudo[63378]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:07 np0005625204.localdomain sudo[63394]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okocabrqzronnjmuqacmxwfgyjuajvtt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:07 np0005625204.localdomain sudo[63394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:08 np0005625204.localdomain python3[63396]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:08 np0005625204.localdomain sudo[63394]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:08 np0005625204.localdomain sudo[63410]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeqquasnxfvfmatqmszrfkffwlbsvoue ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:08 np0005625204.localdomain sudo[63410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:08 np0005625204.localdomain python3[63412]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:08 np0005625204.localdomain sudo[63410]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:08 np0005625204.localdomain sudo[63426]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmoqtpcmfoyftaeniynfxtzoqhmtypec ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:08 np0005625204.localdomain sudo[63426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:08 np0005625204.localdomain sshd[63429]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:08 np0005625204.localdomain python3[63428]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:08 np0005625204.localdomain sudo[63426]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:08 np0005625204.localdomain sudo[63443]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frdpgusandvjflqzrmclkovfsmkmxgvn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:08 np0005625204.localdomain sudo[63443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:08 np0005625204.localdomain python3[63445]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:05:08 np0005625204.localdomain sudo[63443]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:09 np0005625204.localdomain sudo[63506]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flzgxupgcyxvxunkvsnsrhbpnargjfgv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:09 np0005625204.localdomain sudo[63506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:09 np0005625204.localdomain python3[63508]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:09 np0005625204.localdomain sudo[63506]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:09 np0005625204.localdomain sshd[63429]: Invalid user proxyuser from 185.246.128.171 port 14551
Feb 20 08:05:09 np0005625204.localdomain sudo[63535]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esoeyuakutrsczoifmtwfhszfpzdoizf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:09 np0005625204.localdomain sudo[63535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:09 np0005625204.localdomain python3[63537]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:09 np0005625204.localdomain sudo[63535]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:09 np0005625204.localdomain sshd[63429]: Disconnecting invalid user proxyuser 185.246.128.171 port 14551: Change of username or service not allowed: (proxyuser,ssh-connection) -> (jenkins,ssh-connection) [preauth]
Feb 20 08:05:10 np0005625204.localdomain sudo[63564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lygxqxguldjblaqvqukmcmdoorhoplcv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:10 np0005625204.localdomain sudo[63564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:10 np0005625204.localdomain sshd[63567]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:10 np0005625204.localdomain python3[63566]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:10 np0005625204.localdomain sudo[63564]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:10 np0005625204.localdomain sudo[63595]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrjpcgtcdosnwwchjusomfuphtxcplgg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:10 np0005625204.localdomain sudo[63595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:10 np0005625204.localdomain python3[63597]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:10 np0005625204.localdomain sudo[63595]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:11 np0005625204.localdomain sudo[63624]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hofmpppoedzyrfnpkrthcfrtlowvpqun ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:11 np0005625204.localdomain sudo[63624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:11 np0005625204.localdomain python3[63626]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:11 np0005625204.localdomain sudo[63624]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:11 np0005625204.localdomain sshd[63567]: Invalid user jenkins from 185.246.128.171 port 27062
Feb 20 08:05:11 np0005625204.localdomain sudo[63653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmikxrfaxiccrucbdtqqiajtzfgnqzao ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:11 np0005625204.localdomain sudo[63653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:11 np0005625204.localdomain python3[63655]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:11 np0005625204.localdomain sudo[63653]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:12 np0005625204.localdomain sudo[63682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnlggjwjoucywyaxhztpceqtncauwjii ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:12 np0005625204.localdomain sudo[63682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:12 np0005625204.localdomain python3[63684]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:12 np0005625204.localdomain sudo[63682]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:12 np0005625204.localdomain sudo[63711]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljrrordznkpjsgaoqxcwzqoigrqahhre ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:12 np0005625204.localdomain sudo[63711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:12 np0005625204.localdomain python3[63713]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:12 np0005625204.localdomain sudo[63711]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:13 np0005625204.localdomain sudo[63740]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngjbqqbegqrukxiumrfeneqigmpbsjct ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:13 np0005625204.localdomain sudo[63740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:13 np0005625204.localdomain python3[63742]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574708.7487237-100712-185764439060249/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:13 np0005625204.localdomain sudo[63740]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:13 np0005625204.localdomain sudo[63756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlrfbmmzutvohmkgdzactrdxkzhdywve ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:13 np0005625204.localdomain sudo[63756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:13 np0005625204.localdomain python3[63758]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 08:05:13 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:13 np0005625204.localdomain systemd-rc-local-generator[63784]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:13 np0005625204.localdomain systemd-sysv-generator[63787]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:13 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:14 np0005625204.localdomain sshd[63567]: Disconnecting invalid user jenkins 185.246.128.171 port 27062: Change of username or service not allowed: (jenkins,ssh-connection) -> (soporte,ssh-connection) [preauth]
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Activating special unit Exit the Session...
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped target Main User Target.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped target Basic System.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped target Paths.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped target Sockets.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped target Timers.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Closed D-Bus User Message Bus Socket.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Stopped Create User's Volatile Files and Directories.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Removed slice User Application Slice.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Reached target Shutdown.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Finished Exit the Session.
Feb 20 08:05:14 np0005625204.localdomain systemd[62274]: Reached target Exit the Session.
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 20 08:05:14 np0005625204.localdomain sudo[63756]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 20 08:05:14 np0005625204.localdomain sudo[63808]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhmudycizxinacxqqgceidooldxhjxny ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:14 np0005625204.localdomain sudo[63808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:14 np0005625204.localdomain python3[63810]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:14 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:14 np0005625204.localdomain systemd-sysv-generator[63843]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:14 np0005625204.localdomain systemd-rc-local-generator[63840]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:15 np0005625204.localdomain systemd[1]: Starting collectd container...
Feb 20 08:05:15 np0005625204.localdomain systemd[1]: Started collectd container.
Feb 20 08:05:15 np0005625204.localdomain sudo[63808]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:15 np0005625204.localdomain sudo[63875]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfabwuwkfdcugunhtpevxdqscxkbyato ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:15 np0005625204.localdomain sudo[63875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:15 np0005625204.localdomain python3[63877]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:15 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:15 np0005625204.localdomain systemd-rc-local-generator[63906]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:15 np0005625204.localdomain systemd-sysv-generator[63909]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:15 np0005625204.localdomain sshd[63915]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:16 np0005625204.localdomain systemd[1]: Starting iscsid container...
Feb 20 08:05:16 np0005625204.localdomain systemd[1]: Started iscsid container.
Feb 20 08:05:16 np0005625204.localdomain sudo[63875]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:16 np0005625204.localdomain sudo[63941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvsyknhtwstiisdqrjwhpriianapiugk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:16 np0005625204.localdomain sudo[63941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:16 np0005625204.localdomain python3[63943]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:17 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:17 np0005625204.localdomain systemd-rc-local-generator[63973]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:17 np0005625204.localdomain systemd-sysv-generator[63976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:18 np0005625204.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Feb 20 08:05:18 np0005625204.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Feb 20 08:05:18 np0005625204.localdomain sudo[63941]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:18 np0005625204.localdomain sudo[64010]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epnuitokmipdgfommtabsbkpbeysqpsg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:18 np0005625204.localdomain sudo[64010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:18 np0005625204.localdomain sshd[63915]: Invalid user soporte from 185.246.128.171 port 3562
Feb 20 08:05:18 np0005625204.localdomain python3[64012]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:18 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:18 np0005625204.localdomain systemd-sysv-generator[64040]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:18 np0005625204.localdomain systemd-rc-local-generator[64035]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:19 np0005625204.localdomain sshd[63915]: Disconnecting invalid user soporte 185.246.128.171 port 3562: Change of username or service not allowed: (soporte,ssh-connection) -> (opc,ssh-connection) [preauth]
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: Starting nova_virtnodedevd container...
Feb 20 08:05:19 np0005625204.localdomain podman[64051]: 2026-02-20 08:05:19.237012316 +0000 UTC m=+0.084071724 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 20 08:05:19 np0005625204.localdomain tripleo-start-podman-container[64052]: Creating additional drop-in dependency for "nova_virtnodedevd" (b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1)
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:19 np0005625204.localdomain podman[64051]: 2026-02-20 08:05:19.459108121 +0000 UTC m=+0.306167489 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1)
Feb 20 08:05:19 np0005625204.localdomain systemd-rc-local-generator[64133]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:19 np0005625204.localdomain systemd-sysv-generator[64139]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:05:19 np0005625204.localdomain systemd[1]: Started nova_virtnodedevd container.
Feb 20 08:05:19 np0005625204.localdomain sudo[64010]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:19 np0005625204.localdomain sudo[64163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rubcqvplgdzhoworypqqfvgzrgjasvwi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:19 np0005625204.localdomain sudo[64163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:20 np0005625204.localdomain python3[64165]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:21 np0005625204.localdomain sshd[64168]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:21 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:21 np0005625204.localdomain systemd-sysv-generator[64200]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:21 np0005625204.localdomain systemd-rc-local-generator[64193]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:21 np0005625204.localdomain systemd[1]: Starting nova_virtproxyd container...
Feb 20 08:05:21 np0005625204.localdomain tripleo-start-podman-container[64206]: Creating additional drop-in dependency for "nova_virtproxyd" (e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34)
Feb 20 08:05:21 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:21 np0005625204.localdomain systemd-sysv-generator[64271]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:21 np0005625204.localdomain systemd-rc-local-generator[64267]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:22 np0005625204.localdomain systemd[1]: Started nova_virtproxyd container.
Feb 20 08:05:22 np0005625204.localdomain sudo[64163]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:22 np0005625204.localdomain sudo[64291]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brnvfndvshzbifthyrktvnwbutdtpfbq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:22 np0005625204.localdomain sudo[64291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:22 np0005625204.localdomain python3[64293]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:22 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:22 np0005625204.localdomain systemd-sysv-generator[64325]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:22 np0005625204.localdomain systemd-rc-local-generator[64322]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:23 np0005625204.localdomain systemd[1]: Starting nova_virtqemud container...
Feb 20 08:05:23 np0005625204.localdomain tripleo-start-podman-container[64333]: Creating additional drop-in dependency for "nova_virtqemud" (0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69)
Feb 20 08:05:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:23 np0005625204.localdomain systemd-rc-local-generator[64390]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:23 np0005625204.localdomain systemd-sysv-generator[64394]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:23 np0005625204.localdomain systemd[1]: Started nova_virtqemud container.
Feb 20 08:05:23 np0005625204.localdomain sudo[64291]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:23 np0005625204.localdomain sudo[64415]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtcqfxpxqjhkecisdypzfncqkezinevp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:23 np0005625204.localdomain sudo[64415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:24 np0005625204.localdomain python3[64417]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:24 np0005625204.localdomain systemd-rc-local-generator[64443]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:24 np0005625204.localdomain systemd-sysv-generator[64451]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:24 np0005625204.localdomain systemd[1]: Starting nova_virtsecretd container...
Feb 20 08:05:24 np0005625204.localdomain tripleo-start-podman-container[64458]: Creating additional drop-in dependency for "nova_virtsecretd" (c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2)
Feb 20 08:05:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:24 np0005625204.localdomain systemd-sysv-generator[64518]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:24 np0005625204.localdomain systemd-rc-local-generator[64513]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:25 np0005625204.localdomain systemd[1]: Started nova_virtsecretd container.
Feb 20 08:05:25 np0005625204.localdomain sudo[64415]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:25 np0005625204.localdomain sudo[64540]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqbabqbdyoiftytmheikfpetfxjtijus ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:25 np0005625204.localdomain sudo[64540]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:25 np0005625204.localdomain sshd[64168]: Invalid user opc from 185.246.128.171 port 38592
Feb 20 08:05:25 np0005625204.localdomain python3[64542]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:25 np0005625204.localdomain systemd-sysv-generator[64572]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:25 np0005625204.localdomain systemd-rc-local-generator[64569]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:25 np0005625204.localdomain systemd[1]: Starting nova_virtstoraged container...
Feb 20 08:05:26 np0005625204.localdomain tripleo-start-podman-container[64582]: Creating additional drop-in dependency for "nova_virtstoraged" (025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108)
Feb 20 08:05:26 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:26 np0005625204.localdomain systemd-rc-local-generator[64639]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:26 np0005625204.localdomain systemd-sysv-generator[64642]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:26 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:26 np0005625204.localdomain systemd[1]: Started nova_virtstoraged container.
Feb 20 08:05:26 np0005625204.localdomain sudo[64540]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:26 np0005625204.localdomain sudo[64665]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqhetvvokmsdguoqetdbiyrgjrzhgsll ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:05:26 np0005625204.localdomain sudo[64665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:27 np0005625204.localdomain python3[64667]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:05:27 np0005625204.localdomain systemd-rc-local-generator[64695]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:05:27 np0005625204.localdomain systemd-sysv-generator[64698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: Starting rsyslog container...
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:27 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:27 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:27 np0005625204.localdomain podman[64707]: 2026-02-20 08:05:27.561216621 +0000 UTC m=+0.133423875 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step3, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 rsyslog, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:05:27 np0005625204.localdomain podman[64707]: 2026-02-20 08:05:27.571392793 +0000 UTC m=+0.143600037 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, architecture=x86_64, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3)
Feb 20 08:05:27 np0005625204.localdomain podman[64707]: rsyslog
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: Started rsyslog container.
Feb 20 08:05:27 np0005625204.localdomain sudo[64726]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:27 np0005625204.localdomain sudo[64726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:27 np0005625204.localdomain sudo[64665]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:27 np0005625204.localdomain sudo[64726]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:27 np0005625204.localdomain podman[64744]: 2026-02-20 08:05:27.743228821 +0000 UTC m=+0.054299099 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1766032510, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, maintainer=OpenStack TripleO 
Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 20 08:05:27 np0005625204.localdomain podman[64744]: 2026-02-20 08:05:27.770954616 +0000 UTC m=+0.082024894 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:27 np0005625204.localdomain sudo[64780]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vohhuxawmwuraxzimgyckjfpvxkrubce ; /usr/bin/python3
Feb 20 08:05:27 np0005625204.localdomain sudo[64780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:27 np0005625204.localdomain podman[64759]: 2026-02-20 08:05:27.862045258 +0000 UTC m=+0.066954004 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, version=17.1.13, release=1766032510, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, vcs-type=git, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, 
tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 20 08:05:27 np0005625204.localdomain podman[64759]: rsyslog
Feb 20 08:05:27 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 20 08:05:28 np0005625204.localdomain python3[64786]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:28 np0005625204.localdomain sudo[64780]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Stopped rsyslog container.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Starting rsyslog container...
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:28 np0005625204.localdomain podman[64788]: 2026-02-20 08:05:28.171879355 +0000 UTC m=+0.102966467 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=rsyslog, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, config_id=tripleo_step3)
Feb 20 08:05:28 np0005625204.localdomain podman[64788]: 2026-02-20 08:05:28.180377528 +0000 UTC m=+0.111464650 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, managed_by=tripleo_ansible, url=https://www.redhat.com, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:05:28 np0005625204.localdomain podman[64788]: rsyslog
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Started rsyslog container.
Feb 20 08:05:28 np0005625204.localdomain sudo[64808]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:28 np0005625204.localdomain sudo[64808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:28 np0005625204.localdomain sshd[64168]: Disconnecting invalid user opc 185.246.128.171 port 38592: Change of username or service not allowed: (opc,ssh-connection) -> (suresh,ssh-connection) [preauth]
Feb 20 08:05:28 np0005625204.localdomain sudo[64808]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:28 np0005625204.localdomain podman[64811]: 2026-02-20 08:05:28.358304616 +0000 UTC m=+0.059047090 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, container_name=rsyslog, vendor=Red Hat, Inc., config_id=tripleo_step3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public)
Feb 20 08:05:28 np0005625204.localdomain podman[64811]: 2026-02-20 08:05:28.38192486 +0000 UTC m=+0.082667294 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64)
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:28 np0005625204.localdomain podman[64846]: 2026-02-20 08:05:28.466940391 +0000 UTC m=+0.056715790 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public)
Feb 20 08:05:28 np0005625204.localdomain podman[64846]: rsyslog
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5-userdata-shm.mount: Deactivated successfully.
Feb 20 08:05:28 np0005625204.localdomain sudo[64880]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrirzodojrdywjeqrhdeglqwddrlpzdp ; /usr/bin/python3
Feb 20 08:05:28 np0005625204.localdomain sudo[64880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:28 np0005625204.localdomain sudo[64880]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Stopped rsyslog container.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Starting rsyslog container...
Feb 20 08:05:28 np0005625204.localdomain sudo[64935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fivzkugvxtopwwhzpbhxjxtzvsrunrno ; /usr/bin/python3
Feb 20 08:05:28 np0005625204.localdomain sudo[64935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: tmp-crun.KrCevN.mount: Deactivated successfully.
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:28 np0005625204.localdomain podman[64911]: 2026-02-20 08:05:28.956304363 +0000 UTC m=+0.130160107 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, container_name=rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog)
Feb 20 08:05:28 np0005625204.localdomain podman[64911]: 2026-02-20 08:05:28.968481805 +0000 UTC m=+0.142337559 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 20 08:05:28 np0005625204.localdomain podman[64911]: rsyslog
Feb 20 08:05:28 np0005625204.localdomain systemd[1]: Started rsyslog container.
Feb 20 08:05:28 np0005625204.localdomain sudo[64946]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:28 np0005625204.localdomain sudo[64946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:29 np0005625204.localdomain sudo[64946]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:29 np0005625204.localdomain sudo[64935]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:29 np0005625204.localdomain podman[64950]: 2026-02-20 08:05:29.140170067 +0000 UTC m=+0.055153752 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, release=1766032510, build-date=2026-01-12T22:10:09Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, vendor=Red Hat, Inc.)
Feb 20 08:05:29 np0005625204.localdomain podman[64950]: 2026-02-20 08:05:29.162948526 +0000 UTC m=+0.077932151 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2026-01-12T22:10:09Z, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com)
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:29 np0005625204.localdomain podman[64978]: 2026-02-20 08:05:29.253781081 +0000 UTC m=+0.055924346 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, build-date=2026-01-12T22:10:09Z, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, url=https://www.redhat.com)
Feb 20 08:05:29 np0005625204.localdomain podman[64978]: rsyslog
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 20 08:05:29 np0005625204.localdomain sudo[65002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdrlvlcdmgnugkfwbmonqzzgpibbutdq ; /usr/bin/python3
Feb 20 08:05:29 np0005625204.localdomain sudo[65002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:29 np0005625204.localdomain python3[65004]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005625204 step=3 update_config_hash_only=False
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully.
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5-userdata-shm.mount: Deactivated successfully.
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: Stopped rsyslog container.
Feb 20 08:05:29 np0005625204.localdomain sudo[65002]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: Starting rsyslog container...
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:29 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:29 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:29 np0005625204.localdomain podman[65005]: 2026-02-20 08:05:29.659991847 +0000 UTC m=+0.108411550 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:09Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5)
Feb 20 08:05:29 np0005625204.localdomain podman[65005]: 2026-02-20 08:05:29.670183131 +0000 UTC m=+0.118602804 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.expose-services=, container_name=rsyslog, batch=17.1_20260112.1, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:05:29 np0005625204.localdomain podman[65005]: rsyslog
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: Started rsyslog container.
Feb 20 08:05:29 np0005625204.localdomain sshd[65025]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:29 np0005625204.localdomain sudo[65027]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:29 np0005625204.localdomain sudo[65027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:29 np0005625204.localdomain sudo[65027]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:29 np0005625204.localdomain podman[65031]: 2026-02-20 08:05:29.830305709 +0000 UTC m=+0.051615659 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, vcs-type=git, container_name=rsyslog, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, name=rhosp-rhel9/openstack-rsyslog, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, 
managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, vendor=Red Hat, Inc.)
Feb 20 08:05:29 np0005625204.localdomain podman[65031]: 2026-02-20 08:05:29.85789729 +0000 UTC m=+0.079207210 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, container_name=rsyslog, managed_by=tripleo_ansible, config_id=tripleo_step3, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:05:29 np0005625204.localdomain sshd[65025]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:29 np0005625204.localdomain sudo[65065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybaeheatznlzlnwiggqqvzrwweumhlyz ; /usr/bin/python3
Feb 20 08:05:29 np0005625204.localdomain sudo[65065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:29 np0005625204.localdomain podman[65043]: 2026-02-20 08:05:29.94286992 +0000 UTC m=+0.051352280 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, 
com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, vendor=Red Hat, Inc.)
Feb 20 08:05:29 np0005625204.localdomain podman[65043]: rsyslog
Feb 20 08:05:29 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 20 08:05:30 np0005625204.localdomain sshd[65070]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:30 np0005625204.localdomain python3[65069]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:05:30 np0005625204.localdomain sudo[65065]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:30 np0005625204.localdomain sudo[65085]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epeukxujfkqpcwjpbvtroccvrotstbyr ; /usr/bin/python3
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: Stopped rsyslog container.
Feb 20 08:05:30 np0005625204.localdomain sudo[65085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: Starting rsyslog container...
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:05:30 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:30 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 20 08:05:30 np0005625204.localdomain podman[65088]: 2026-02-20 08:05:30.386721847 +0000 UTC m=+0.110981405 container init ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-rsyslog-container, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, 
Inc., build-date=2026-01-12T22:10:09Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, architecture=x86_64)
Feb 20 08:05:30 np0005625204.localdomain podman[65088]: 2026-02-20 08:05:30.394870239 +0000 UTC m=+0.119129797 container start ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-rsyslog-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 
17.1 rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:05:30 np0005625204.localdomain podman[65088]: rsyslog
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: Started rsyslog container.
Feb 20 08:05:30 np0005625204.localdomain sudo[65106]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:05:30 np0005625204.localdomain sudo[65106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:05:30 np0005625204.localdomain python3[65087]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 20 08:05:30 np0005625204.localdomain sudo[65085]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:30 np0005625204.localdomain sudo[65106]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: libpod-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5.scope: Deactivated successfully.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tmp-crun.r9jCUM.mount: Deactivated successfully.
Feb 20 08:05:30 np0005625204.localdomain podman[65109]: 2026-02-20 08:05:30.559894943 +0000 UTC m=+0.050101113 container died ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-rsyslog, container_name=rsyslog, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, distribution-scope=public)
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5-userdata-shm.mount: Deactivated successfully.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully.
Feb 20 08:05:30 np0005625204.localdomain podman[65109]: 2026-02-20 08:05:30.583194187 +0000 UTC m=+0.073400337 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, description=Red Hat OpenStack Platform 17.1 rsyslog, container_name=rsyslog, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3)
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:05:30 np0005625204.localdomain podman[65121]: 2026-02-20 08:05:30.671303381 +0000 UTC m=+0.058174663 container cleanup ee32eb0c88b9d59c33bb700a5cc7916c91e91fd297f7f0e485157219c25b37e5 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'eb8c5e608f55bc52c95871f92a543185'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, container_name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-rsyslog-container, vcs-type=git)
Feb 20 08:05:30 np0005625204.localdomain podman[65121]: rsyslog
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: Stopped rsyslog container.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 20 08:05:30 np0005625204.localdomain systemd[1]: Failed to start rsyslog container.
Feb 20 08:05:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:05:31 np0005625204.localdomain podman[65132]: 2026-02-20 08:05:31.145037408 +0000 UTC m=+0.086043194 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=collectd, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z)
Feb 20 08:05:31 np0005625204.localdomain podman[65132]: 2026-02-20 08:05:31.156082017 +0000 UTC m=+0.097087823 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, 
build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:05:31 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:05:31 np0005625204.localdomain sshd[65070]: Invalid user suresh from 185.246.128.171 port 33846
Feb 20 08:05:31 np0005625204.localdomain sshd[65070]: Disconnecting invalid user suresh 185.246.128.171 port 33846: Change of username or service not allowed: (suresh,ssh-connection) -> (redmine,ssh-connection) [preauth]
Feb 20 08:05:32 np0005625204.localdomain sshd[65153]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:05:33 np0005625204.localdomain podman[65155]: 2026-02-20 08:05:33.135801859 +0000 UTC m=+0.075863751 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:05:33 np0005625204.localdomain podman[65155]: 2026-02-20 08:05:33.146375953 +0000 UTC m=+0.086437905 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, com.redhat.component=openstack-iscsid-container)
Feb 20 08:05:33 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:05:33 np0005625204.localdomain sshd[65153]: Invalid user redmine from 185.246.128.171 port 48917
Feb 20 08:05:34 np0005625204.localdomain sshd[65153]: Disconnecting invalid user redmine 185.246.128.171 port 48917: Change of username or service not allowed: (redmine,ssh-connection) -> (hamed,ssh-connection) [preauth]
Feb 20 08:05:37 np0005625204.localdomain sshd[65174]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:41 np0005625204.localdomain sshd[65174]: Invalid user hamed from 185.246.128.171 port 24596
Feb 20 08:05:41 np0005625204.localdomain sshd[65174]: Disconnecting invalid user hamed 185.246.128.171 port 24596: Change of username or service not allowed: (hamed,ssh-connection) -> (bkp,ssh-connection) [preauth]
Feb 20 08:05:42 np0005625204.localdomain sshd[65176]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:45 np0005625204.localdomain sshd[65176]: Invalid user bkp from 185.246.128.171 port 55866
Feb 20 08:05:46 np0005625204.localdomain sshd[65176]: Disconnecting invalid user bkp 185.246.128.171 port 55866: Change of username or service not allowed: (bkp,ssh-connection) -> (Cisco,ssh-connection) [preauth]
Feb 20 08:05:46 np0005625204.localdomain sudo[65178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:05:46 np0005625204.localdomain sudo[65178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:05:46 np0005625204.localdomain sudo[65178]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:46 np0005625204.localdomain sudo[65193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:05:46 np0005625204.localdomain sudo[65193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:05:46 np0005625204.localdomain sshd[65208]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:47 np0005625204.localdomain sudo[65193]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:47 np0005625204.localdomain sshd[65208]: Invalid user Cisco from 185.246.128.171 port 21291
Feb 20 08:05:47 np0005625204.localdomain sshd[65208]: Disconnecting invalid user Cisco 185.246.128.171 port 21291: Change of username or service not allowed: (Cisco,ssh-connection) -> (hscroot,ssh-connection) [preauth]
Feb 20 08:05:48 np0005625204.localdomain sudo[65241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:05:48 np0005625204.localdomain sudo[65241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:05:48 np0005625204.localdomain sudo[65241]: pam_unix(sudo:session): session closed for user root
Feb 20 08:05:48 np0005625204.localdomain sshd[65256]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:49 np0005625204.localdomain sshd[65256]: Invalid user hscroot from 185.246.128.171 port 31756
Feb 20 08:05:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:05:50 np0005625204.localdomain podman[65258]: 2026-02-20 08:05:50.017595803 +0000 UTC m=+0.090105784 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, config_id=tripleo_step1, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:05:50 np0005625204.localdomain sshd[65256]: Disconnecting invalid user hscroot 185.246.128.171 port 31756: Change of username or service not allowed: (hscroot,ssh-connection) -> (user6,ssh-connection) [preauth]
Feb 20 08:05:50 np0005625204.localdomain podman[65258]: 2026-02-20 08:05:50.258159577 +0000 UTC m=+0.330669558 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:05:50 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:05:51 np0005625204.localdomain sshd[65287]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:54 np0005625204.localdomain sshd[65287]: Invalid user user6 from 185.246.128.171 port 51836
Feb 20 08:05:54 np0005625204.localdomain sshd[65287]: Disconnecting invalid user user6 185.246.128.171 port 51836: Change of username or service not allowed: (user6,ssh-connection) -> (vali,ssh-connection) [preauth]
Feb 20 08:05:56 np0005625204.localdomain sshd[65289]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:05:58 np0005625204.localdomain sshd[65289]: Invalid user vali from 185.246.128.171 port 29288
Feb 20 08:05:58 np0005625204.localdomain sshd[65289]: Disconnecting invalid user vali 185.246.128.171 port 29288: Change of username or service not allowed: (vali,ssh-connection) -> (adfexc,ssh-connection) [preauth]
Feb 20 08:05:59 np0005625204.localdomain sshd[65291]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:01 np0005625204.localdomain sshd[65291]: Invalid user adfexc from 185.246.128.171 port 49973
Feb 20 08:06:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:06:01 np0005625204.localdomain systemd[1]: tmp-crun.VwWk1N.mount: Deactivated successfully.
Feb 20 08:06:01 np0005625204.localdomain podman[65293]: 2026-02-20 08:06:01.687795422 +0000 UTC m=+0.097032381 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 20 08:06:01 np0005625204.localdomain podman[65293]: 2026-02-20 08:06:01.723566276 +0000 UTC m=+0.132803175 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, 
io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64)
Feb 20 08:06:01 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:06:01 np0005625204.localdomain sshd[65291]: Disconnecting invalid user adfexc 185.246.128.171 port 49973: Change of username or service not allowed: (adfexc,ssh-connection) -> (prometheus,ssh-connection) [preauth]
Feb 20 08:06:02 np0005625204.localdomain sshd[65315]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:06:04 np0005625204.localdomain podman[65317]: 2026-02-20 08:06:04.136416196 +0000 UTC m=+0.075570042 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 20 08:06:04 np0005625204.localdomain podman[65317]: 2026-02-20 08:06:04.148079713 +0000 UTC m=+0.087233629 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, 
release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 20 08:06:04 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:06:05 np0005625204.localdomain sshd[65315]: Invalid user prometheus from 185.246.128.171 port 9166
Feb 20 08:06:05 np0005625204.localdomain sshd[65315]: Disconnecting invalid user prometheus 185.246.128.171 port 9166: Change of username or service not allowed: (prometheus,ssh-connection) -> (RPM,ssh-connection) [preauth]
Feb 20 08:06:07 np0005625204.localdomain sshd[65335]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:11 np0005625204.localdomain sshd[65335]: Invalid user RPM from 185.246.128.171 port 44903
Feb 20 08:06:12 np0005625204.localdomain sshd[65335]: Disconnecting invalid user RPM 185.246.128.171 port 44903: Change of username or service not allowed: (RPM,ssh-connection) -> (api,ssh-connection) [preauth]
Feb 20 08:06:14 np0005625204.localdomain sshd[65337]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:16 np0005625204.localdomain sshd[65339]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:16 np0005625204.localdomain sshd[65339]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:06:17 np0005625204.localdomain sshd[65337]: Invalid user api from 185.246.128.171 port 32545
Feb 20 08:06:18 np0005625204.localdomain sshd[65337]: Disconnecting invalid user api 185.246.128.171 port 32545: Change of username or service not allowed: (api,ssh-connection) -> (riscv,ssh-connection) [preauth]
Feb 20 08:06:19 np0005625204.localdomain sshd[65341]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:06:21 np0005625204.localdomain podman[65343]: 2026-02-20 08:06:21.145275395 +0000 UTC m=+0.084280851 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 20 08:06:21 np0005625204.localdomain podman[65343]: 2026-02-20 08:06:21.379596932 +0000 UTC m=+0.318602348 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 20 08:06:21 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:06:21 np0005625204.localdomain sshd[65341]: Invalid user riscv from 185.246.128.171 port 3435
Feb 20 08:06:21 np0005625204.localdomain sshd[65341]: Disconnecting invalid user riscv 185.246.128.171 port 3435: Change of username or service not allowed: (riscv,ssh-connection) -> (mit,ssh-connection) [preauth]
Feb 20 08:06:22 np0005625204.localdomain sshd[65371]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:23 np0005625204.localdomain sshd[65371]: Invalid user mit from 185.246.128.171 port 28463
Feb 20 08:06:24 np0005625204.localdomain sshd[65371]: Disconnecting invalid user mit 185.246.128.171 port 28463: Change of username or service not allowed: (mit,ssh-connection) -> (auditor,ssh-connection) [preauth]
Feb 20 08:06:24 np0005625204.localdomain sshd[65373]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:27 np0005625204.localdomain sshd[65373]: Invalid user auditor from 185.246.128.171 port 44297
Feb 20 08:06:28 np0005625204.localdomain sshd[65373]: Disconnecting invalid user auditor 185.246.128.171 port 44297: Change of username or service not allowed: (auditor,ssh-connection) -> (ftp,ssh-connection) [preauth]
Feb 20 08:06:29 np0005625204.localdomain sshd[65375]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:06:32 np0005625204.localdomain podman[65377]: 2026-02-20 08:06:32.124169837 +0000 UTC m=+0.068176451 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=collectd, version=17.1.13)
Feb 20 08:06:32 np0005625204.localdomain podman[65377]: 2026-02-20 08:06:32.131684881 +0000 UTC m=+0.075691475 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13)
Feb 20 08:06:32 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:06:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:06:35 np0005625204.localdomain podman[65397]: 2026-02-20 08:06:35.116207822 +0000 UTC m=+0.062495632 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, url=https://www.redhat.com, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:06:35 np0005625204.localdomain podman[65397]: 2026-02-20 08:06:35.149016879 +0000 UTC m=+0.095304669 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container)
Feb 20 08:06:35 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:06:38 np0005625204.localdomain sshd[65375]: error: maximum authentication attempts exceeded for ftp from 185.246.128.171 port 13762 ssh2 [preauth]
Feb 20 08:06:38 np0005625204.localdomain sshd[65375]: Disconnecting authenticating user ftp 185.246.128.171 port 13762: Too many authentication failures [preauth]
Feb 20 08:06:38 np0005625204.localdomain sshd[65417]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:41 np0005625204.localdomain sshd[65417]: Disconnecting authenticating user ftp 185.246.128.171 port 24597: Change of username or service not allowed: (ftp,ssh-connection) -> (administrator,ssh-connection) [preauth]
Feb 20 08:06:42 np0005625204.localdomain sshd[65419]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:45 np0005625204.localdomain sshd[65419]: Invalid user administrator from 185.246.128.171 port 53957
Feb 20 08:06:45 np0005625204.localdomain sshd[65419]: error: maximum authentication attempts exceeded for invalid user administrator from 185.246.128.171 port 53957 ssh2 [preauth]
Feb 20 08:06:45 np0005625204.localdomain sshd[65419]: Disconnecting invalid user administrator 185.246.128.171 port 53957: Too many authentication failures [preauth]
Feb 20 08:06:46 np0005625204.localdomain sshd[65421]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:48 np0005625204.localdomain sudo[65423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:06:48 np0005625204.localdomain sudo[65423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:06:48 np0005625204.localdomain sudo[65423]: pam_unix(sudo:session): session closed for user root
Feb 20 08:06:48 np0005625204.localdomain sudo[65438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:06:48 np0005625204.localdomain sudo[65438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:06:48 np0005625204.localdomain sudo[65438]: pam_unix(sudo:session): session closed for user root
Feb 20 08:06:49 np0005625204.localdomain sshd[65421]: Invalid user administrator from 185.246.128.171 port 25564
Feb 20 08:06:49 np0005625204.localdomain sudo[65484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:06:49 np0005625204.localdomain sudo[65484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:06:49 np0005625204.localdomain sudo[65484]: pam_unix(sudo:session): session closed for user root
Feb 20 08:06:51 np0005625204.localdomain sshd[65421]: Disconnecting invalid user administrator 185.246.128.171 port 25564: Change of username or service not allowed: (administrator,ssh-connection) -> (dev1,ssh-connection) [preauth]
Feb 20 08:06:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:06:51 np0005625204.localdomain podman[65499]: 2026-02-20 08:06:51.662003594 +0000 UTC m=+0.088648031 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:06:51 np0005625204.localdomain podman[65499]: 2026-02-20 08:06:51.864243617 +0000 UTC m=+0.290888024 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 20 08:06:51 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:06:51 np0005625204.localdomain sshd[65529]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:54 np0005625204.localdomain sshd[65529]: Invalid user dev1 from 185.246.128.171 port 2353
Feb 20 08:06:54 np0005625204.localdomain sshd[65529]: Disconnecting invalid user dev1 185.246.128.171 port 2353: Change of username or service not allowed: (dev1,ssh-connection) -> (hadoop,ssh-connection) [preauth]
Feb 20 08:06:55 np0005625204.localdomain sshd[65531]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:06:57 np0005625204.localdomain sshd[65531]: Invalid user hadoop from 185.246.128.171 port 32701
Feb 20 08:06:57 np0005625204.localdomain sshd[65531]: Disconnecting invalid user hadoop 185.246.128.171 port 32701: Change of username or service not allowed: (hadoop,ssh-connection) -> (redhat,ssh-connection) [preauth]
Feb 20 08:06:58 np0005625204.localdomain sshd[65533]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:01 np0005625204.localdomain sshd[65533]: Invalid user redhat from 185.246.128.171 port 61004
Feb 20 08:07:02 np0005625204.localdomain sshd[65533]: Disconnecting invalid user redhat 185.246.128.171 port 61004: Change of username or service not allowed: (redhat,ssh-connection) -> (pablo,ssh-connection) [preauth]
Feb 20 08:07:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:07:02 np0005625204.localdomain podman[65535]: 2026-02-20 08:07:02.373473222 +0000 UTC m=+0.086049412 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 20 08:07:02 np0005625204.localdomain podman[65535]: 2026-02-20 08:07:02.38984652 +0000 UTC m=+0.102422680 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 20 08:07:02 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:07:02 np0005625204.localdomain sshd[65555]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:03 np0005625204.localdomain sshd[65555]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:07:03 np0005625204.localdomain sshd[65557]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:04 np0005625204.localdomain sshd[65557]: Invalid user pablo from 185.246.128.171 port 39010
Feb 20 08:07:05 np0005625204.localdomain sshd[65557]: Disconnecting invalid user pablo 185.246.128.171 port 39010: Change of username or service not allowed: (pablo,ssh-connection) -> (cisco,ssh-connection) [preauth]
Feb 20 08:07:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:07:05 np0005625204.localdomain systemd[1]: tmp-crun.TVxCk8.mount: Deactivated successfully.
Feb 20 08:07:05 np0005625204.localdomain podman[65559]: 2026-02-20 08:07:05.458391064 +0000 UTC m=+0.098936288 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, summary=Red Hat 
OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:07:05 np0005625204.localdomain podman[65559]: 2026-02-20 08:07:05.468252507 +0000 UTC m=+0.108797721 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, url=https://www.redhat.com)
Feb 20 08:07:05 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:07:05 np0005625204.localdomain sshd[65578]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:05 np0005625204.localdomain sshd[65579]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:08 np0005625204.localdomain sshd[65579]: Invalid user cisco from 185.246.128.171 port 5783
Feb 20 08:07:10 np0005625204.localdomain sshd[65579]: Disconnecting invalid user cisco 185.246.128.171 port 5783: Change of username or service not allowed: (cisco,ssh-connection) -> (note,ssh-connection) [preauth]
Feb 20 08:07:13 np0005625204.localdomain sshd[65581]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:16 np0005625204.localdomain sshd[65578]: error: kex_exchange_identification: read: Connection timed out
Feb 20 08:07:16 np0005625204.localdomain sshd[65578]: banner exchange: Connection from 115.190.172.63 port 38448: Connection timed out
Feb 20 08:07:16 np0005625204.localdomain sshd[65581]: Invalid user note from 185.246.128.171 port 24914
Feb 20 08:07:17 np0005625204.localdomain sshd[65581]: Disconnecting invalid user note 185.246.128.171 port 24914: Change of username or service not allowed: (note,ssh-connection) -> (VYOS,ssh-connection) [preauth]
Feb 20 08:07:17 np0005625204.localdomain sshd[65583]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:19 np0005625204.localdomain sshd[65583]: Invalid user VYOS from 185.246.128.171 port 5958
Feb 20 08:07:19 np0005625204.localdomain sshd[65583]: Disconnecting invalid user VYOS 185.246.128.171 port 5958: Change of username or service not allowed: (VYOS,ssh-connection) -> (marek,ssh-connection) [preauth]
Feb 20 08:07:20 np0005625204.localdomain sshd[65585]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:07:22 np0005625204.localdomain podman[65587]: 2026-02-20 08:07:22.1279396 +0000 UTC m=+0.072461399 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 20 08:07:22 np0005625204.localdomain sshd[65585]: Invalid user marek from 185.246.128.171 port 32542
Feb 20 08:07:22 np0005625204.localdomain podman[65587]: 2026-02-20 08:07:22.308069314 +0000 UTC m=+0.252591163 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:07:22 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:07:22 np0005625204.localdomain sshd[65585]: Disconnecting invalid user marek 185.246.128.171 port 32542: Change of username or service not allowed: (marek,ssh-connection) -> (gang,ssh-connection) [preauth]
Feb 20 08:07:24 np0005625204.localdomain sshd[65616]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:26 np0005625204.localdomain sshd[65616]: Invalid user gang from 185.246.128.171 port 15429
Feb 20 08:07:27 np0005625204.localdomain sshd[65616]: Disconnecting invalid user gang 185.246.128.171 port 15429: Change of username or service not allowed: (gang,ssh-connection) -> (wangqi,ssh-connection) [preauth]
Feb 20 08:07:28 np0005625204.localdomain sshd[65618]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:28 np0005625204.localdomain sshd[65620]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:29 np0005625204.localdomain sshd[65618]: Invalid user oracle from 152.32.189.21 port 52144
Feb 20 08:07:29 np0005625204.localdomain sshd[65618]: Received disconnect from 152.32.189.21 port 52144:11: Bye Bye [preauth]
Feb 20 08:07:29 np0005625204.localdomain sshd[65618]: Disconnected from invalid user oracle 152.32.189.21 port 52144 [preauth]
Feb 20 08:07:31 np0005625204.localdomain sshd[65620]: Invalid user wangqi from 185.246.128.171 port 57414
Feb 20 08:07:32 np0005625204.localdomain sshd[65620]: Disconnecting invalid user wangqi 185.246.128.171 port 57414: Change of username or service not allowed: (wangqi,ssh-connection) -> (staff,ssh-connection) [preauth]
Feb 20 08:07:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:07:32 np0005625204.localdomain podman[65622]: 2026-02-20 08:07:32.634840823 +0000 UTC m=+0.077898164 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:07:32 np0005625204.localdomain podman[65622]: 2026-02-20 08:07:32.648005788 +0000 UTC m=+0.091063139 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:07:32 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:07:33 np0005625204.localdomain sshd[65643]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:35 np0005625204.localdomain sshd[65643]: Invalid user staff from 185.246.128.171 port 43526
Feb 20 08:07:35 np0005625204.localdomain sshd[65643]: Disconnecting invalid user staff 185.246.128.171 port 43526: Change of username or service not allowed: (staff,ssh-connection) -> (hduser,ssh-connection) [preauth]
Feb 20 08:07:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:07:35 np0005625204.localdomain podman[65645]: 2026-02-20 08:07:35.935192735 +0000 UTC m=+0.079491773 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 20 08:07:35 np0005625204.localdomain podman[65645]: 2026-02-20 08:07:35.942895022 +0000 UTC m=+0.087194040 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, architecture=x86_64)
Feb 20 08:07:35 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:07:37 np0005625204.localdomain sshd[65664]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:40 np0005625204.localdomain sshd[65664]: Invalid user hduser from 185.246.128.171 port 24916
Feb 20 08:07:41 np0005625204.localdomain sshd[65664]: Disconnecting invalid user hduser 185.246.128.171 port 24916: Change of username or service not allowed: (hduser,ssh-connection) -> (connect,ssh-connection) [preauth]
Feb 20 08:07:43 np0005625204.localdomain sshd[65666]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:45 np0005625204.localdomain sshd[65666]: Invalid user connect from 185.246.128.171 port 24527
Feb 20 08:07:46 np0005625204.localdomain sshd[65666]: Disconnecting invalid user connect 185.246.128.171 port 24527: Change of username or service not allowed: (connect,ssh-connection) -> (nsroot,ssh-connection) [preauth]
Feb 20 08:07:47 np0005625204.localdomain sshd[65668]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:49 np0005625204.localdomain sshd[65668]: Invalid user nsroot from 185.246.128.171 port 8396
Feb 20 08:07:49 np0005625204.localdomain sudo[65670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:07:49 np0005625204.localdomain sudo[65670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:07:49 np0005625204.localdomain sudo[65670]: pam_unix(sudo:session): session closed for user root
Feb 20 08:07:49 np0005625204.localdomain sudo[65685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:07:49 np0005625204.localdomain sudo[65685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:07:50 np0005625204.localdomain sshd[65668]: Disconnecting invalid user nsroot 185.246.128.171 port 8396: Change of username or service not allowed: (nsroot,ssh-connection) -> (rob,ssh-connection) [preauth]
Feb 20 08:07:50 np0005625204.localdomain sshd[65717]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:50 np0005625204.localdomain sudo[65685]: pam_unix(sudo:session): session closed for user root
Feb 20 08:07:50 np0005625204.localdomain sshd[65717]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:07:52 np0005625204.localdomain sshd[65733]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:07:53 np0005625204.localdomain systemd[1]: tmp-crun.r2aXiz.mount: Deactivated successfully.
Feb 20 08:07:53 np0005625204.localdomain podman[65734]: 2026-02-20 08:07:53.166482817 +0000 UTC m=+0.102283336 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:07:53 np0005625204.localdomain podman[65734]: 2026-02-20 08:07:53.36830248 +0000 UTC m=+0.304103009 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:07:53 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:07:54 np0005625204.localdomain sudo[65764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:07:54 np0005625204.localdomain sudo[65764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:07:54 np0005625204.localdomain sudo[65764]: pam_unix(sudo:session): session closed for user root
Feb 20 08:07:55 np0005625204.localdomain sshd[65733]: Invalid user rob from 185.246.128.171 port 63502
Feb 20 08:07:55 np0005625204.localdomain sshd[65733]: Disconnecting invalid user rob 185.246.128.171 port 63502: Change of username or service not allowed: (rob,ssh-connection) -> (amir,ssh-connection) [preauth]
Feb 20 08:07:55 np0005625204.localdomain sshd[65779]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:57 np0005625204.localdomain sshd[65779]: Invalid user amir from 185.246.128.171 port 38948
Feb 20 08:07:57 np0005625204.localdomain sshd[65779]: Disconnecting invalid user amir 185.246.128.171 port 38948: Change of username or service not allowed: (amir,ssh-connection) -> (jimmy,ssh-connection) [preauth]
Feb 20 08:07:58 np0005625204.localdomain sshd[65781]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:07:59 np0005625204.localdomain sshd[65781]: Invalid user jimmy from 185.246.128.171 port 2664
Feb 20 08:07:59 np0005625204.localdomain sshd[65781]: Disconnecting invalid user jimmy 185.246.128.171 port 2664: Change of username or service not allowed: (jimmy,ssh-connection) -> (visitors,ssh-connection) [preauth]
Feb 20 08:08:01 np0005625204.localdomain sshd[65783]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:08:03 np0005625204.localdomain systemd[1]: tmp-crun.eyybr9.mount: Deactivated successfully.
Feb 20 08:08:03 np0005625204.localdomain podman[65785]: 2026-02-20 08:08:03.159521363 +0000 UTC m=+0.096933540 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 20 08:08:03 np0005625204.localdomain podman[65785]: 2026-02-20 08:08:03.171068089 +0000 UTC m=+0.108480266 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, 
distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Feb 20 08:08:03 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:08:03 np0005625204.localdomain sshd[65783]: Invalid user visitors from 185.246.128.171 port 37059
Feb 20 08:08:04 np0005625204.localdomain sshd[65783]: Disconnecting invalid user visitors 185.246.128.171 port 37059: Change of username or service not allowed: (visitors,ssh-connection) -> (array,ssh-connection) [preauth]
Feb 20 08:08:04 np0005625204.localdomain sshd[65806]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:06 np0005625204.localdomain sshd[65808]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:08:06 np0005625204.localdomain podman[65809]: 2026-02-20 08:08:06.229792261 +0000 UTC m=+0.042769349 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 
iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:08:06 np0005625204.localdomain podman[65809]: 2026-02-20 08:08:06.235038983 +0000 UTC m=+0.048016081 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, vcs-type=git, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510)
Feb 20 08:08:06 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:08:06 np0005625204.localdomain sshd[65806]: Invalid user claude from 101.36.109.176 port 46146
Feb 20 08:08:06 np0005625204.localdomain sshd[65806]: Received disconnect from 101.36.109.176 port 46146:11: Bye Bye [preauth]
Feb 20 08:08:06 np0005625204.localdomain sshd[65806]: Disconnected from invalid user claude 101.36.109.176 port 46146 [preauth]
Feb 20 08:08:09 np0005625204.localdomain sshd[65808]: Invalid user array from 185.246.128.171 port 25874
Feb 20 08:08:10 np0005625204.localdomain sshd[65808]: Disconnecting invalid user array 185.246.128.171 port 25874: Change of username or service not allowed: (array,ssh-connection) -> (user20,ssh-connection) [preauth]
Feb 20 08:08:11 np0005625204.localdomain sshd[65830]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:12 np0005625204.localdomain sshd[65830]: Received disconnect from 178.217.173.50 port 55950:11: Bye Bye [preauth]
Feb 20 08:08:12 np0005625204.localdomain sshd[65830]: Disconnected from authenticating user root 178.217.173.50 port 55950 [preauth]
Feb 20 08:08:13 np0005625204.localdomain sshd[65832]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:14 np0005625204.localdomain sshd[65832]: Invalid user user20 from 185.246.128.171 port 43667
Feb 20 08:08:15 np0005625204.localdomain sshd[65832]: Disconnecting invalid user user20 185.246.128.171 port 43667: Change of username or service not allowed: (user20,ssh-connection) -> (proxy,ssh-connection) [preauth]
Feb 20 08:08:17 np0005625204.localdomain sshd[65834]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:19 np0005625204.localdomain sshd[65834]: Invalid user proxy from 185.246.128.171 port 33837
Feb 20 08:08:20 np0005625204.localdomain sshd[65834]: Disconnecting invalid user proxy 185.246.128.171 port 33837: Change of username or service not allowed: (proxy,ssh-connection) -> (kafka,ssh-connection) [preauth]
Feb 20 08:08:21 np0005625204.localdomain sshd[65836]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:23 np0005625204.localdomain sshd[65836]: Invalid user kafka from 185.246.128.171 port 12315
Feb 20 08:08:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:08:23 np0005625204.localdomain systemd[1]: tmp-crun.eeBItp.mount: Deactivated successfully.
Feb 20 08:08:23 np0005625204.localdomain podman[65838]: 2026-02-20 08:08:23.995271865 +0000 UTC m=+0.092215033 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, 
io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:08:24 np0005625204.localdomain podman[65838]: 2026-02-20 08:08:24.174252975 +0000 UTC m=+0.271196183 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:08:24 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:08:24 np0005625204.localdomain sshd[65836]: Disconnecting invalid user kafka 185.246.128.171 port 12315: Change of username or service not allowed: (kafka,ssh-connection) -> (smart,ssh-connection) [preauth]
Feb 20 08:08:25 np0005625204.localdomain sshd[65866]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:28 np0005625204.localdomain sshd[65866]: Invalid user smart from 185.246.128.171 port 60318
Feb 20 08:08:29 np0005625204.localdomain sshd[65866]: Disconnecting invalid user smart 185.246.128.171 port 60318: Change of username or service not allowed: (smart,ssh-connection) -> (popo,ssh-connection) [preauth]
Feb 20 08:08:30 np0005625204.localdomain sshd[65868]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:32 np0005625204.localdomain sshd[65868]: Invalid user popo from 185.246.128.171 port 59589
Feb 20 08:08:32 np0005625204.localdomain sshd[65868]: Disconnecting invalid user popo 185.246.128.171 port 59589: Change of username or service not allowed: (popo,ssh-connection) -> (gwei,ssh-connection) [preauth]
Feb 20 08:08:32 np0005625204.localdomain sshd[65870]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:33 np0005625204.localdomain sshd[65870]: Invalid user gwei from 185.246.128.171 port 22441
Feb 20 08:08:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:08:33 np0005625204.localdomain podman[65872]: 2026-02-20 08:08:33.965229478 +0000 UTC m=+0.087943012 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true)
Feb 20 08:08:33 np0005625204.localdomain podman[65872]: 2026-02-20 08:08:33.974872806 +0000 UTC m=+0.097586340 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:08:33 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:08:34 np0005625204.localdomain sshd[65870]: Disconnecting invalid user gwei 185.246.128.171 port 22441: Change of username or service not allowed: (gwei,ssh-connection) -> (tt,ssh-connection) [preauth]
Feb 20 08:08:34 np0005625204.localdomain sshd[65893]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:35 np0005625204.localdomain sshd[65893]: Invalid user tt from 185.246.128.171 port 43823
Feb 20 08:08:36 np0005625204.localdomain sshd[65893]: Disconnecting invalid user tt 185.246.128.171 port 43823: Change of username or service not allowed: (tt,ssh-connection) -> (astra,ssh-connection) [preauth]
Feb 20 08:08:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:08:36 np0005625204.localdomain podman[65895]: 2026-02-20 08:08:36.959069419 +0000 UTC m=+0.076685445 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z)
Feb 20 08:08:36 np0005625204.localdomain podman[65895]: 2026-02-20 08:08:36.968571703 +0000 UTC m=+0.086187759 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 20 08:08:36 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:08:37 np0005625204.localdomain sshd[65916]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:37 np0005625204.localdomain sshd[65917]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:38 np0005625204.localdomain sshd[65916]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:08:40 np0005625204.localdomain sshd[65917]: Invalid user astra from 185.246.128.171 port 18188
Feb 20 08:08:40 np0005625204.localdomain sshd[65920]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:40 np0005625204.localdomain sshd[65917]: Disconnecting invalid user astra 185.246.128.171 port 18188: Change of username or service not allowed: (astra,ssh-connection) -> (validator,ssh-connection) [preauth]
Feb 20 08:08:41 np0005625204.localdomain sshd[65920]: Invalid user pasi from 77.232.138.190 port 54768
Feb 20 08:08:41 np0005625204.localdomain sshd[65920]: Received disconnect from 77.232.138.190 port 54768:11: Bye Bye [preauth]
Feb 20 08:08:41 np0005625204.localdomain sshd[65920]: Disconnected from invalid user pasi 77.232.138.190 port 54768 [preauth]
Feb 20 08:08:41 np0005625204.localdomain sshd[65922]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:43 np0005625204.localdomain sshd[65922]: Invalid user validator from 185.246.128.171 port 64118
Feb 20 08:08:44 np0005625204.localdomain sshd[65922]: Disconnecting invalid user validator 185.246.128.171 port 64118: Change of username or service not allowed: (validator,ssh-connection) -> (sonos,ssh-connection) [preauth]
Feb 20 08:08:46 np0005625204.localdomain sshd[65924]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:47 np0005625204.localdomain sshd[65924]: Invalid user sonos from 185.246.128.171 port 52461
Feb 20 08:08:48 np0005625204.localdomain sshd[65924]: Disconnecting invalid user sonos 185.246.128.171 port 52461: Change of username or service not allowed: (sonos,ssh-connection) -> (dlinares,ssh-connection) [preauth]
Feb 20 08:08:50 np0005625204.localdomain sshd[65926]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:53 np0005625204.localdomain sshd[65926]: Invalid user dlinares from 185.246.128.171 port 41955
Feb 20 08:08:54 np0005625204.localdomain sshd[65926]: Disconnecting invalid user dlinares 185.246.128.171 port 41955: Change of username or service not allowed: (dlinares,ssh-connection) -> (telecomadmin,ssh-connection) [preauth]
Feb 20 08:08:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:08:54 np0005625204.localdomain sudo[65934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:08:54 np0005625204.localdomain sudo[65934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:08:54 np0005625204.localdomain sudo[65934]: pam_unix(sudo:session): session closed for user root
Feb 20 08:08:54 np0005625204.localdomain podman[65928]: 2026-02-20 08:08:54.538601759 +0000 UTC m=+0.098767777 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 20 08:08:54 np0005625204.localdomain sudo[65961]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:08:54 np0005625204.localdomain sudo[65961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:08:54 np0005625204.localdomain podman[65928]: 2026-02-20 08:08:54.754135535 +0000 UTC m=+0.314301563 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=metrics_qdr, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack 
Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:08:54 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:08:55 np0005625204.localdomain sudo[65961]: pam_unix(sudo:session): session closed for user root
Feb 20 08:08:55 np0005625204.localdomain sudo[66019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:08:55 np0005625204.localdomain sudo[66019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:08:55 np0005625204.localdomain sudo[66019]: pam_unix(sudo:session): session closed for user root
Feb 20 08:08:55 np0005625204.localdomain sudo[66034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 08:08:55 np0005625204.localdomain sudo[66034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:08:55 np0005625204.localdomain sudo[66034]: pam_unix(sudo:session): session closed for user root
Feb 20 08:08:56 np0005625204.localdomain sshd[66068]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:08:58 np0005625204.localdomain sshd[66068]: Invalid user telecomadmin from 185.246.128.171 port 50465
Feb 20 08:08:59 np0005625204.localdomain sshd[66068]: Disconnecting invalid user telecomadmin 185.246.128.171 port 50465: Change of username or service not allowed: (telecomadmin,ssh-connection) -> (service,ssh-connection) [preauth]
Feb 20 08:09:00 np0005625204.localdomain sshd[66070]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:00 np0005625204.localdomain sudo[66072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:09:00 np0005625204.localdomain sudo[66072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:09:01 np0005625204.localdomain sudo[66072]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:01 np0005625204.localdomain sshd[66070]: Invalid user service from 185.246.128.171 port 29930
Feb 20 08:09:01 np0005625204.localdomain sshd[66070]: Disconnecting invalid user service 185.246.128.171 port 29930: Change of username or service not allowed: (service,ssh-connection) -> (ospite,ssh-connection) [preauth]
Feb 20 08:09:02 np0005625204.localdomain sshd[66088]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:09:04 np0005625204.localdomain systemd[1]: tmp-crun.U7QPFW.mount: Deactivated successfully.
Feb 20 08:09:04 np0005625204.localdomain podman[66090]: 2026-02-20 08:09:04.17478213 +0000 UTC m=+0.110988053 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, release=1766032510, config_id=tripleo_step3, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd)
Feb 20 08:09:04 np0005625204.localdomain podman[66090]: 2026-02-20 08:09:04.188004278 +0000 UTC m=+0.124210191 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, 
name=rhosp-rhel9/openstack-collectd, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:09:04 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:09:04 np0005625204.localdomain sshd[66088]: Invalid user ospite from 185.246.128.171 port 57904
Feb 20 08:09:04 np0005625204.localdomain sshd[66088]: Disconnecting invalid user ospite 185.246.128.171 port 57904: Change of username or service not allowed: (ospite,ssh-connection) -> (liuyu,ssh-connection) [preauth]
Feb 20 08:09:05 np0005625204.localdomain sshd[66110]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:06 np0005625204.localdomain sshd[66110]: Invalid user liuyu from 185.246.128.171 port 26279
Feb 20 08:09:06 np0005625204.localdomain sshd[66110]: Disconnecting invalid user liuyu 185.246.128.171 port 26279: Change of username or service not allowed: (liuyu,ssh-connection) -> (diego,ssh-connection) [preauth]
Feb 20 08:09:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:09:07 np0005625204.localdomain sshd[66112]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:07 np0005625204.localdomain podman[66113]: 2026-02-20 08:09:07.140192154 +0000 UTC m=+0.078070299 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, container_name=iscsid, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:09:07 np0005625204.localdomain podman[66113]: 2026-02-20 08:09:07.153302478 +0000 UTC m=+0.091180633 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:09:07 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:09:08 np0005625204.localdomain sshd[66112]: Invalid user diego from 185.246.128.171 port 44656
Feb 20 08:09:09 np0005625204.localdomain sshd[66112]: Disconnecting invalid user diego 185.246.128.171 port 44656: Change of username or service not allowed: (diego,ssh-connection) -> (asus,ssh-connection) [preauth]
Feb 20 08:09:10 np0005625204.localdomain sshd[66134]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:13 np0005625204.localdomain sshd[66134]: Invalid user asus from 185.246.128.171 port 30030
Feb 20 08:09:13 np0005625204.localdomain sshd[66134]: Disconnecting invalid user asus 185.246.128.171 port 30030: Change of username or service not allowed: (asus,ssh-connection) -> (sophia,ssh-connection) [preauth]
Feb 20 08:09:14 np0005625204.localdomain sshd[66136]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:15 np0005625204.localdomain sshd[66136]: Invalid user sophia from 185.246.128.171 port 7078
Feb 20 08:09:16 np0005625204.localdomain sshd[66136]: Disconnecting invalid user sophia 185.246.128.171 port 7078: Change of username or service not allowed: (sophia,ssh-connection) -> (spark,ssh-connection) [preauth]
Feb 20 08:09:16 np0005625204.localdomain sshd[66138]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:18 np0005625204.localdomain sudo[66185]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxflgujcxekfaxascczlmjbufwjtcxzp ; /usr/bin/python3
Feb 20 08:09:18 np0005625204.localdomain sudo[66185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:18 np0005625204.localdomain python3[66187]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:18 np0005625204.localdomain sudo[66185]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:18 np0005625204.localdomain sshd[66138]: Invalid user spark from 185.246.128.171 port 36679
Feb 20 08:09:18 np0005625204.localdomain sudo[66230]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhtvyymskodgszboxquvxbqxmxsmcpyu ; /usr/bin/python3
Feb 20 08:09:18 np0005625204.localdomain sudo[66230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:19 np0005625204.localdomain python3[66232]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574958.2871263-107660-72029222473044/source _original_basename=tmpkdj14tbp follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:19 np0005625204.localdomain sudo[66230]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:19 np0005625204.localdomain sshd[66138]: Disconnecting invalid user spark 185.246.128.171 port 36679: Change of username or service not allowed: (spark,ssh-connection) -> (ayush,ssh-connection) [preauth]
Feb 20 08:09:20 np0005625204.localdomain sudo[66292]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlhvszvcebjhzdyeuaazfvizlqnjatpn ; /usr/bin/python3
Feb 20 08:09:20 np0005625204.localdomain sudo[66292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:20 np0005625204.localdomain python3[66294]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:20 np0005625204.localdomain sudo[66292]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:20 np0005625204.localdomain sudo[66335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqyvxtvhzpbpojzhitdqyluxcxmyxekc ; /usr/bin/python3
Feb 20 08:09:20 np0005625204.localdomain sudo[66335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:20 np0005625204.localdomain python3[66337]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574959.9982226-107758-206776131067677/source _original_basename=tmp9141h3ir follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:20 np0005625204.localdomain sshd[66338]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:20 np0005625204.localdomain sudo[66335]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:21 np0005625204.localdomain sudo[66399]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahekucvqyuthmnzgexcvamykqyphkwba ; /usr/bin/python3
Feb 20 08:09:21 np0005625204.localdomain sudo[66399]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:21 np0005625204.localdomain python3[66401]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:21 np0005625204.localdomain sudo[66399]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:21 np0005625204.localdomain sudo[66442]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaaycgaauoikmhvtjpupkqfjzgtmylzp ; /usr/bin/python3
Feb 20 08:09:21 np0005625204.localdomain sudo[66442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:21 np0005625204.localdomain python3[66444]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574961.0563428-107808-178098317184402/source _original_basename=tmp2iqg9iv5 follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:21 np0005625204.localdomain sudo[66442]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:21 np0005625204.localdomain sshd[66338]: Invalid user ayush from 185.246.128.171 port 20957
Feb 20 08:09:22 np0005625204.localdomain sshd[66338]: Disconnecting invalid user ayush 185.246.128.171 port 20957: Change of username or service not allowed: (ayush,ssh-connection) -> (odoo16,ssh-connection) [preauth]
Feb 20 08:09:22 np0005625204.localdomain sudo[66504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlszwkcituneqjpfiwtbdsnmtzowtnlt ; /usr/bin/python3
Feb 20 08:09:22 np0005625204.localdomain sudo[66504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:22 np0005625204.localdomain python3[66506]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:22 np0005625204.localdomain sudo[66504]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:22 np0005625204.localdomain sudo[66547]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brgklvuuspnyaspdavmxcewozxtbxxkz ; /usr/bin/python3
Feb 20 08:09:22 np0005625204.localdomain sudo[66547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:22 np0005625204.localdomain sshd[66550]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:22 np0005625204.localdomain python3[66549]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574962.0301127-108022-30166966606923/source _original_basename=tmpuj_1ldb7 follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:22 np0005625204.localdomain sudo[66547]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:23 np0005625204.localdomain sudo[66579]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecvrwuyodwlqfmntbywrqdfvqfqlxebr ; /usr/bin/python3
Feb 20 08:09:23 np0005625204.localdomain sudo[66579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:23 np0005625204.localdomain python3[66581]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 08:09:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:23 np0005625204.localdomain systemd-sysv-generator[66606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:23 np0005625204.localdomain systemd-rc-local-generator[66602]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:23 np0005625204.localdomain sshd[66550]: Invalid user odoo16 from 185.246.128.171 port 44041
Feb 20 08:09:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:23 np0005625204.localdomain systemd-sysv-generator[66647]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:23 np0005625204.localdomain systemd-rc-local-generator[66644]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:24 np0005625204.localdomain sudo[66579]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:24 np0005625204.localdomain sudo[66669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etmuqyuqtqtpwvdrmrjzofwqvwlksahe ; /usr/bin/python3
Feb 20 08:09:24 np0005625204.localdomain sudo[66669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:24 np0005625204.localdomain sshd[66672]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:24 np0005625204.localdomain python3[66671]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:09:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:24 np0005625204.localdomain sshd[66672]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:09:24 np0005625204.localdomain systemd-rc-local-generator[66700]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:24 np0005625204.localdomain systemd-sysv-generator[66704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:09:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:25 np0005625204.localdomain podman[66711]: 2026-02-20 08:09:25.011946875 +0000 UTC m=+0.095156175 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=)
Feb 20 08:09:25 np0005625204.localdomain sshd[66550]: Disconnecting invalid user odoo16 185.246.128.171 port 44041: Change of username or service not allowed: (odoo16,ssh-connection) -> (gitlab,ssh-connection) [preauth]
Feb 20 08:09:25 np0005625204.localdomain systemd-sysv-generator[66765]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:25 np0005625204.localdomain systemd-rc-local-generator[66762]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:25 np0005625204.localdomain podman[66711]: 2026-02-20 08:09:25.195558778 +0000 UTC m=+0.278767988 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1)
Feb 20 08:09:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:09:25 np0005625204.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Feb 20 08:09:25 np0005625204.localdomain sudo[66669]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:25 np0005625204.localdomain sudo[66791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahuztfgqkuquczlmivnpihdditdgswvo ; /usr/bin/python3
Feb 20 08:09:25 np0005625204.localdomain sudo[66791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:25 np0005625204.localdomain python3[66793]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 08:09:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:25 np0005625204.localdomain systemd-rc-local-generator[66817]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:25 np0005625204.localdomain systemd-sysv-generator[66823]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:25 np0005625204.localdomain sudo[66791]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:26 np0005625204.localdomain sudo[66875]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxlhojhdtzokflhrcegnzicgzgfilaag ; /usr/bin/python3
Feb 20 08:09:26 np0005625204.localdomain sudo[66875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:26 np0005625204.localdomain python3[66877]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:26 np0005625204.localdomain sudo[66875]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:26 np0005625204.localdomain sudo[66918]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohvozqlisgdlofrbjcjroeuuswdzkrdw ; /usr/bin/python3
Feb 20 08:09:26 np0005625204.localdomain sudo[66918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:26 np0005625204.localdomain python3[66920]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574966.126908-108187-151158764478628/source _original_basename=tmpn1vjzgxz follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:26 np0005625204.localdomain sudo[66918]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:27 np0005625204.localdomain sudo[66948]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mztyibtdhpohlxawrsyxwjcohfxbryng ; /usr/bin/python3
Feb 20 08:09:27 np0005625204.localdomain sudo[66948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:27 np0005625204.localdomain python3[66950]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:09:27 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:27 np0005625204.localdomain systemd-rc-local-generator[66972]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:27 np0005625204.localdomain systemd-sysv-generator[66976]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:27 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:27 np0005625204.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Feb 20 08:09:27 np0005625204.localdomain sshd[66989]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:27 np0005625204.localdomain sudo[66948]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:27 np0005625204.localdomain sudo[67003]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evoetyxpnvvnkrrfziwbsxngsddawxiq ; /usr/bin/python3
Feb 20 08:09:27 np0005625204.localdomain sudo[67003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:28 np0005625204.localdomain python3[67005]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:09:28 np0005625204.localdomain sudo[67003]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:28 np0005625204.localdomain sudo[67054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-latmcmfvrxnqzsbqbockzpxpwxrvaxcy ; /usr/bin/python3
Feb 20 08:09:28 np0005625204.localdomain sudo[67054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:28 np0005625204.localdomain sudo[67054]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:28 np0005625204.localdomain sudo[67072]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etzuvpikuiiaqjfigkucmvtebaydegjh ; /usr/bin/python3
Feb 20 08:09:28 np0005625204.localdomain sudo[67072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:28 np0005625204.localdomain sudo[67072]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:29 np0005625204.localdomain sshd[66989]: Invalid user gitlab from 185.246.128.171 port 39295
Feb 20 08:09:29 np0005625204.localdomain sudo[67176]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daqjavpbrlxwvtvbrczrfnmjqrbqwubu ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574969.1751366-108390-222623852624805/async_wrapper.py 933272446365 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574969.1751366-108390-222623852624805/AnsiballZ_command.py _
Feb 20 08:09:29 np0005625204.localdomain sudo[67176]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 08:09:29 np0005625204.localdomain sshd[66989]: Disconnecting invalid user gitlab 185.246.128.171 port 39295: Change of username or service not allowed: (gitlab,ssh-connection) -> (tomcat,ssh-connection) [preauth]
Feb 20 08:09:29 np0005625204.localdomain ansible-async_wrapper.py[67178]: Invoked with 933272446365 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771574969.1751366-108390-222623852624805/AnsiballZ_command.py _
Feb 20 08:09:29 np0005625204.localdomain ansible-async_wrapper.py[67181]: Starting module and watcher
Feb 20 08:09:29 np0005625204.localdomain ansible-async_wrapper.py[67181]: Start watching 67182 (3600)
Feb 20 08:09:29 np0005625204.localdomain ansible-async_wrapper.py[67182]: Start module (67182)
Feb 20 08:09:29 np0005625204.localdomain ansible-async_wrapper.py[67178]: Return async_wrapper task started.
Feb 20 08:09:29 np0005625204.localdomain sudo[67176]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:29 np0005625204.localdomain sshd[67198]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:29 np0005625204.localdomain sudo[67197]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmsxujoeelehygdyzotsaoldqzdzgufm ; /usr/bin/python3
Feb 20 08:09:29 np0005625204.localdomain sudo[67197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:30 np0005625204.localdomain python3[67204]: ansible-ansible.legacy.async_status Invoked with jid=933272446365.67178 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:09:30 np0005625204.localdomain sudo[67197]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:31 np0005625204.localdomain sshd[67198]: Invalid user tomcat from 185.246.128.171 port 2589
Feb 20 08:09:33 np0005625204.localdomain sshd[67198]: error: maximum authentication attempts exceeded for invalid user tomcat from 185.246.128.171 port 2589 ssh2 [preauth]
Feb 20 08:09:33 np0005625204.localdomain sshd[67198]: Disconnecting invalid user tomcat 185.246.128.171 port 2589: Too many authentication failures [preauth]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (file: /etc/puppet/hiera.yaml)
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: Undefined variable '::deploy_config_name';
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (file & line not available)
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (file & line not available)
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:09:33 np0005625204.localdomain sshd[67314]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 20 08:09:33 np0005625204.localdomain puppet-user[67203]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.27 seconds
Feb 20 08:09:34 np0005625204.localdomain sshd[67314]: Invalid user tomcat from 185.246.128.171 port 49157
Feb 20 08:09:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:09:34 np0005625204.localdomain systemd[1]: tmp-crun.VSDleg.mount: Deactivated successfully.
Feb 20 08:09:34 np0005625204.localdomain podman[67324]: 2026-02-20 08:09:34.698731867 +0000 UTC m=+0.105434712 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3)
Feb 20 08:09:34 np0005625204.localdomain podman[67324]: 2026-02-20 08:09:34.712985526 +0000 UTC m=+0.119688351 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:09:34 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:09:34 np0005625204.localdomain ansible-async_wrapper.py[67181]: 67182 still running (3600)
Feb 20 08:09:34 np0005625204.localdomain sshd[67314]: Disconnecting invalid user tomcat 185.246.128.171 port 49157: Change of username or service not allowed: (tomcat,ssh-connection) -> (prueba,ssh-connection) [preauth]
Feb 20 08:09:35 np0005625204.localdomain sshd[67344]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:36 np0005625204.localdomain sshd[67344]: Invalid user prueba from 185.246.128.171 port 4112
Feb 20 08:09:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:09:38 np0005625204.localdomain systemd[1]: tmp-crun.7MdOc4.mount: Deactivated successfully.
Feb 20 08:09:38 np0005625204.localdomain podman[67365]: 2026-02-20 08:09:38.15116062 +0000 UTC m=+0.081487015 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:09:38 np0005625204.localdomain podman[67365]: 2026-02-20 08:09:38.1900999 +0000 UTC m=+0.120426255 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, tcib_managed=true, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 20 08:09:38 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:09:38 np0005625204.localdomain sshd[67344]: Disconnecting invalid user prueba 185.246.128.171 port 4112: Change of username or service not allowed: (prueba,ssh-connection) -> (fatima,ssh-connection) [preauth]
Feb 20 08:09:39 np0005625204.localdomain ansible-async_wrapper.py[67181]: 67182 still running (3595)
Feb 20 08:09:40 np0005625204.localdomain sudo[67445]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjhfqerrtpfokdzkxzehuzwvmiilpfop ; /usr/bin/python3
Feb 20 08:09:40 np0005625204.localdomain sudo[67445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:40 np0005625204.localdomain python3[67447]: ansible-ansible.legacy.async_status Invoked with jid=933272446365.67178 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:09:40 np0005625204.localdomain sudo[67445]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:41 np0005625204.localdomain sshd[67462]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:41 np0005625204.localdomain systemd-rc-local-generator[67495]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:41 np0005625204.localdomain systemd-sysv-generator[67498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.001s CPU time.
Feb 20 08:09:41 np0005625204.localdomain systemd[1]: run-rc7dd249622d24037bd01b1ac0dc99c51.service: Deactivated successfully.
Feb 20 08:09:42 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Feb 20 08:09:42 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}6efe8182ad743f857ee22a2729211e2ffe44f4518a2bdbc1aeccaa84211394dc'
Feb 20 08:09:42 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Feb 20 08:09:42 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Feb 20 08:09:42 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Feb 20 08:09:42 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Feb 20 08:09:43 np0005625204.localdomain sshd[67462]: Invalid user fatima from 185.246.128.171 port 5144
Feb 20 08:09:43 np0005625204.localdomain sshd[67462]: Disconnecting invalid user fatima 185.246.128.171 port 5144: Change of username or service not allowed: (fatima,ssh-connection) -> (wso2,ssh-connection) [preauth]
Feb 20 08:09:43 np0005625204.localdomain sshd[68549]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:44 np0005625204.localdomain ansible-async_wrapper.py[67181]: 67182 still running (3590)
Feb 20 08:09:45 np0005625204.localdomain sshd[68549]: Invalid user wso2 from 185.246.128.171 port 38776
Feb 20 08:09:46 np0005625204.localdomain sshd[68549]: Disconnecting invalid user wso2 185.246.128.171 port 38776: Change of username or service not allowed: (wso2,ssh-connection) -> (vhserver,ssh-connection) [preauth]
Feb 20 08:09:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:09:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4463 writes, 20K keys, 4463 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4463 writes, 468 syncs, 9.54 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 227 writes, 681 keys, 227 commit groups, 1.0 writes per commit group, ingest: 0.63 MB, 0.00 MB/s
                                                          Interval WAL: 227 writes, 110 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:09:47 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Feb 20 08:09:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:47 np0005625204.localdomain systemd-rc-local-generator[68577]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:47 np0005625204.localdomain systemd-sysv-generator[68583]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:48 np0005625204.localdomain sshd[68591]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:48 np0005625204.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Feb 20 08:09:48 np0005625204.localdomain snmpd[68593]: Can't find directory of RPM packages
Feb 20 08:09:48 np0005625204.localdomain snmpd[68593]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Feb 20 08:09:48 np0005625204.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Feb 20 08:09:48 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:48 np0005625204.localdomain systemd-rc-local-generator[68619]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:48 np0005625204.localdomain systemd-sysv-generator[68624]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:48 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:48 np0005625204.localdomain systemd-sysv-generator[68660]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:48 np0005625204.localdomain systemd-rc-local-generator[68657]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Notice: Applied catalog in 15.10 seconds
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Application:
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:    Initial environment: production
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:    Converged environment: production
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:          Run mode: user
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Changes:
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:             Total: 8
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Events:
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:           Success: 8
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:             Total: 8
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Resources:
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:         Restarted: 1
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:           Changed: 8
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:       Out of sync: 8
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:             Total: 19
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Time:
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:        Filebucket: 0.00
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:          Schedule: 0.00
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:            Augeas: 0.01
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:              File: 0.07
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:    Config retrieval: 0.33
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:           Service: 1.15
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:    Transaction evaluation: 15.09
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:    Catalog application: 15.10
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:          Last run: 1771574988
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:              Exec: 5.06
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:           Package: 8.64
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:             Total: 15.10
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]: Version:
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:            Config: 1771574973
Feb 20 08:09:48 np0005625204.localdomain puppet-user[67203]:            Puppet: 7.10.0
Feb 20 08:09:48 np0005625204.localdomain ansible-async_wrapper.py[67182]: Module complete (67182)
Feb 20 08:09:49 np0005625204.localdomain ansible-async_wrapper.py[67181]: Done in kid B.
Feb 20 08:09:50 np0005625204.localdomain sudo[68681]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdzpwykkoayzbgivetqkopjnbfypdntb ; /usr/bin/python3
Feb 20 08:09:50 np0005625204.localdomain sudo[68681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:50 np0005625204.localdomain sshd[68591]: Invalid user vhserver from 185.246.128.171 port 21017
Feb 20 08:09:50 np0005625204.localdomain python3[68683]: ansible-ansible.legacy.async_status Invoked with jid=933272446365.67178 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:09:50 np0005625204.localdomain sudo[68681]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:50 np0005625204.localdomain sshd[68591]: Disconnecting invalid user vhserver 185.246.128.171 port 21017: Change of username or service not allowed: (vhserver,ssh-connection) -> (airflow,ssh-connection) [preauth]
Feb 20 08:09:51 np0005625204.localdomain sudo[68697]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bidgxctiufeaysthpnllsdzbaepfkwbu ; /usr/bin/python3
Feb 20 08:09:51 np0005625204.localdomain sudo[68697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:51 np0005625204.localdomain python3[68699]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 08:09:51 np0005625204.localdomain sudo[68697]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:09:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 5194 writes, 22K keys, 5194 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5194 writes, 621 syncs, 8.36 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 287 writes, 708 keys, 287 commit groups, 1.0 writes per commit group, ingest: 0.58 MB, 0.00 MB/s
                                                          Interval WAL: 287 writes, 141 syncs, 2.04 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:09:51 np0005625204.localdomain sshd[68700]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:51 np0005625204.localdomain sudo[68715]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpvkjdgmcppvzawmcvdyjqearwefnxme ; /usr/bin/python3
Feb 20 08:09:51 np0005625204.localdomain sudo[68715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:51 np0005625204.localdomain python3[68717]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:09:51 np0005625204.localdomain sudo[68715]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:52 np0005625204.localdomain sudo[68765]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pumfexgpfznaakpclydfjlbbbvfgwhfc ; /usr/bin/python3
Feb 20 08:09:52 np0005625204.localdomain sudo[68765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:52 np0005625204.localdomain python3[68767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:52 np0005625204.localdomain sudo[68765]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:52 np0005625204.localdomain sudo[68783]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcrphrhcdiqwvisoufscqcjzpnnbkemi ; /usr/bin/python3
Feb 20 08:09:52 np0005625204.localdomain sudo[68783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:52 np0005625204.localdomain python3[68785]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpwkq6eo1o recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 08:09:52 np0005625204.localdomain sudo[68783]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:52 np0005625204.localdomain sudo[68813]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qggricivgszilczyagdbqhxpazgckfzn ; /usr/bin/python3
Feb 20 08:09:52 np0005625204.localdomain sudo[68813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:53 np0005625204.localdomain python3[68815]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:53 np0005625204.localdomain sudo[68813]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:53 np0005625204.localdomain sudo[68829]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fznajbwgsfwiyocjxonkrxuioviqnwuy ; /usr/bin/python3
Feb 20 08:09:53 np0005625204.localdomain sudo[68829]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:53 np0005625204.localdomain sudo[68829]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:53 np0005625204.localdomain sshd[68700]: Invalid user airflow from 185.246.128.171 port 57377
Feb 20 08:09:53 np0005625204.localdomain sudo[68916]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxgwppgozqcwdhezonjjtpbpkorywedu ; /usr/bin/python3
Feb 20 08:09:53 np0005625204.localdomain sudo[68916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:54 np0005625204.localdomain python3[68918]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 20 08:09:54 np0005625204.localdomain sudo[68916]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:54 np0005625204.localdomain sshd[68700]: Disconnecting invalid user airflow 185.246.128.171 port 57377: Change of username or service not allowed: (airflow,ssh-connection) -> (gmod,ssh-connection) [preauth]
Feb 20 08:09:54 np0005625204.localdomain sudo[68935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwyvagvhrfmunorrftebsvhekogligjq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:54 np0005625204.localdomain sudo[68935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:54 np0005625204.localdomain python3[68937]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:54 np0005625204.localdomain sudo[68935]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:55 np0005625204.localdomain sudo[68951]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dostlmydnjjpuwuthtmfdxehynqxahpr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:55 np0005625204.localdomain sudo[68951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:55 np0005625204.localdomain sudo[68951]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:55 np0005625204.localdomain sudo[68967]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxairiesgdnhkazypetwfueffgdbmrue ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:55 np0005625204.localdomain sudo[68967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:09:55 np0005625204.localdomain podman[68970]: 2026-02-20 08:09:55.866340781 +0000 UTC m=+0.095570228 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1)
Feb 20 08:09:55 np0005625204.localdomain python3[68969]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:09:55 np0005625204.localdomain sudo[68967]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:56 np0005625204.localdomain podman[68970]: 2026-02-20 08:09:56.069045593 +0000 UTC m=+0.298275040 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510)
Feb 20 08:09:56 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:09:56 np0005625204.localdomain sudo[69045]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrtfeesptvggxdunoxdgajlgwwdeyfnw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:56 np0005625204.localdomain sudo[69045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:56 np0005625204.localdomain sshd[69048]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:56 np0005625204.localdomain python3[69047]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:56 np0005625204.localdomain sudo[69045]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:56 np0005625204.localdomain sudo[69065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niwlztoafvnoqltlbzacupldykigiyke ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:56 np0005625204.localdomain sudo[69065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:56 np0005625204.localdomain python3[69067]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:56 np0005625204.localdomain sudo[69065]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:57 np0005625204.localdomain sudo[69127]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icssanrandvskdggfmnitlgdkrcunieg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:57 np0005625204.localdomain sudo[69127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:57 np0005625204.localdomain python3[69129]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:57 np0005625204.localdomain sudo[69127]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:57 np0005625204.localdomain sudo[69145]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfcypbtlotosczmncblgjttgdlvupomz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:57 np0005625204.localdomain sudo[69145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:57 np0005625204.localdomain sshd[69048]: Invalid user gmod from 185.246.128.171 port 51794
Feb 20 08:09:57 np0005625204.localdomain python3[69147]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:57 np0005625204.localdomain sudo[69145]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:57 np0005625204.localdomain sudo[69207]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikpuvrmmkbatmcrmlqdfvyfwkdzcuiqd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:57 np0005625204.localdomain sudo[69207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:57 np0005625204.localdomain sshd[69048]: Disconnecting invalid user gmod 185.246.128.171 port 51794: Change of username or service not allowed: (gmod,ssh-connection) -> (sybase,ssh-connection) [preauth]
Feb 20 08:09:58 np0005625204.localdomain python3[69209]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:58 np0005625204.localdomain sudo[69207]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:58 np0005625204.localdomain sudo[69225]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgqbcxbyniltjkaawudvvaymqomdswcj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:58 np0005625204.localdomain sudo[69225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:58 np0005625204.localdomain python3[69227]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:58 np0005625204.localdomain sudo[69225]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:58 np0005625204.localdomain sudo[69287]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqakqwccrxsxkqujaiunkbioujoynmym ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:58 np0005625204.localdomain sudo[69287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:58 np0005625204.localdomain python3[69289]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:09:58 np0005625204.localdomain sudo[69287]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:59 np0005625204.localdomain sudo[69305]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osdwnaoatdqdgvjzjoakmxooenpezeje ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:59 np0005625204.localdomain sudo[69305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:59 np0005625204.localdomain sshd[69308]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:09:59 np0005625204.localdomain python3[69307]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:09:59 np0005625204.localdomain sudo[69305]: pam_unix(sudo:session): session closed for user root
Feb 20 08:09:59 np0005625204.localdomain sudo[69337]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciasbnuagwrhwiezufldizdvsggkpaji ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:09:59 np0005625204.localdomain sudo[69337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:09:59 np0005625204.localdomain python3[69339]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:09:59 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:09:59 np0005625204.localdomain systemd-rc-local-generator[69361]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:09:59 np0005625204.localdomain systemd-sysv-generator[69366]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:09:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:00 np0005625204.localdomain sshd[69308]: Invalid user sybase from 185.246.128.171 port 17120
Feb 20 08:10:00 np0005625204.localdomain sudo[69337]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:00 np0005625204.localdomain sudo[69423]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vligelffpcmguoyydegfxqfftjikmepe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:00 np0005625204.localdomain sudo[69423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:00 np0005625204.localdomain python3[69425]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:10:00 np0005625204.localdomain sudo[69423]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:00 np0005625204.localdomain sudo[69441]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eexzkxydeufivexcsvykawltjwnuubpg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:00 np0005625204.localdomain sudo[69441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:00 np0005625204.localdomain sshd[69308]: Disconnecting invalid user sybase 185.246.128.171 port 17120: Change of username or service not allowed: (sybase,ssh-connection) -> (dan,ssh-connection) [preauth]
Feb 20 08:10:00 np0005625204.localdomain python3[69443]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:00 np0005625204.localdomain sudo[69441]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:01 np0005625204.localdomain sudo[69458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:10:01 np0005625204.localdomain sudo[69458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:01 np0005625204.localdomain sudo[69458]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:01 np0005625204.localdomain sudo[69473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 08:10:01 np0005625204.localdomain sudo[69473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:01 np0005625204.localdomain sudo[69533]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loqjqxmhqoenkkvhismuibrzcxfapdmf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:01 np0005625204.localdomain sudo[69533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:01 np0005625204.localdomain python3[69535]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:10:01 np0005625204.localdomain sudo[69533]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:01 np0005625204.localdomain sudo[69572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfiyppmfgernqzgznwhischyrtaeklee ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:01 np0005625204.localdomain sudo[69572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:01 np0005625204.localdomain sudo[69473]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:01 np0005625204.localdomain sudo[69575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:10:01 np0005625204.localdomain sudo[69575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:01 np0005625204.localdomain sudo[69575]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:01 np0005625204.localdomain python3[69574]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:01 np0005625204.localdomain sudo[69572]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:01 np0005625204.localdomain sudo[69590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:10:01 np0005625204.localdomain sudo[69590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:02 np0005625204.localdomain sudo[69632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-haytupegdoloksuksdhcuditgrohpwrc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:02 np0005625204.localdomain sudo[69632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:02 np0005625204.localdomain sshd[69647]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:02 np0005625204.localdomain python3[69634]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:02 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:02 np0005625204.localdomain systemd-sysv-generator[69688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:02 np0005625204.localdomain systemd-rc-local-generator[69684]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:02 np0005625204.localdomain sudo[69590]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:02 np0005625204.localdomain sudo[69705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:10:02 np0005625204.localdomain sudo[69705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:02 np0005625204.localdomain sudo[69705]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:02 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 08:10:02 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 08:10:02 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 08:10:02 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 08:10:02 np0005625204.localdomain sudo[69632]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:02 np0005625204.localdomain sudo[69725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- inventory --format=json-pretty --filter-for-batch
Feb 20 08:10:02 np0005625204.localdomain sudo[69725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:02 np0005625204.localdomain sudo[69753]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fakcacqrepcomeqqsswonqzsyfwkcgdc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:02 np0005625204.localdomain sudo[69753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:03 np0005625204.localdomain python3[69755]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 20 08:10:03 np0005625204.localdomain sudo[69753]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 
Feb 20 08:10:03 np0005625204.localdomain sshd[69647]: Invalid user dan from 185.246.128.171 port 49927
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 2026-02-20 08:10:03.346467066 +0000 UTC m=+0.076107858 container create 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.buildah.version=1.42.2, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: Started libpod-conmon-8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6.scope.
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 2026-02-20 08:10:03.314427428 +0000 UTC m=+0.044068260 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 2026-02-20 08:10:03.418945711 +0000 UTC m=+0.148586553 container init 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, ceph=True, io.openshift.tags=rhceph ceph, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 2026-02-20 08:10:03.429256789 +0000 UTC m=+0.158897581 container start 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, name=rhceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=)
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 2026-02-20 08:10:03.429522538 +0000 UTC m=+0.159163370 container attach 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Feb 20 08:10:03 np0005625204.localdomain funny_pasteur[69809]: 167 167
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: libpod-8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6.scope: Deactivated successfully.
Feb 20 08:10:03 np0005625204.localdomain podman[69796]: 2026-02-20 08:10:03.433344655 +0000 UTC m=+0.162985447 container died 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, name=rhceph, ceph=True, architecture=x86_64, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Feb 20 08:10:03 np0005625204.localdomain sshd[69813]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:03 np0005625204.localdomain podman[69816]: 2026-02-20 08:10:03.517084747 +0000 UTC m=+0.075528600 container remove 8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_pasteur, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, release=1770267347, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: libpod-conmon-8755785fe99af4e30d7e237fef0288c3cfab31aeb4917835fb97962f3f2b7bd6.scope: Deactivated successfully.
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-affb0363b6af858faf35f8b44ac482767e10653be43d1f4616d5950413a5bebc-merged.mount: Deactivated successfully.
Feb 20 08:10:03 np0005625204.localdomain podman[69838]: 
Feb 20 08:10:03 np0005625204.localdomain podman[69838]: 2026-02-20 08:10:03.739042553 +0000 UTC m=+0.076520862 container create a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., distribution-scope=public, RELEASE=main, release=1770267347, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 08:10:03 np0005625204.localdomain sudo[69863]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmusrwumtxhvvbnvutuvdgdxmskkyfrw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:03 np0005625204.localdomain sudo[69863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: Started libpod-conmon-a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d.scope.
Feb 20 08:10:03 np0005625204.localdomain podman[69838]: 2026-02-20 08:10:03.708002945 +0000 UTC m=+0.045481324 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 08:10:03 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:03 np0005625204.localdomain podman[69838]: 2026-02-20 08:10:03.826683775 +0000 UTC m=+0.164162134 container init a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, io.buildah.version=1.42.2, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Feb 20 08:10:03 np0005625204.localdomain podman[69838]: 2026-02-20 08:10:03.839144879 +0000 UTC m=+0.176623188 container start a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, release=1770267347, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.42.2, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 08:10:03 np0005625204.localdomain podman[69838]: 2026-02-20 08:10:03.842386099 +0000 UTC m=+0.179864418 container attach a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, name=rhceph, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.expose-services=)
Feb 20 08:10:03 np0005625204.localdomain sshd[69647]: Disconnecting invalid user dan 185.246.128.171 port 49927: Change of username or service not allowed: (dan,ssh-connection) -> (sample,ssh-connection) [preauth]
Feb 20 08:10:04 np0005625204.localdomain sudo[69863]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]: [
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:     {
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "available": false,
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "ceph_device": false,
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "lsm_data": {},
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "lvs": [],
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "path": "/dev/sr0",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "rejected_reasons": [
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "Insufficient space (<5GB)",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "Has a FileSystem"
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         ],
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         "sys_api": {
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "actuators": null,
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "device_nodes": "sr0",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "human_readable_size": "482.00 KB",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "id_bus": "ata",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "model": "QEMU DVD-ROM",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "nr_requests": "2",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "partitions": {},
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "path": "/dev/sr0",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "removable": "1",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "rev": "2.5+",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "ro": "0",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "rotational": "1",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "sas_address": "",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "sas_device_handle": "",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "scheduler_mode": "mq-deadline",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "sectors": 0,
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "sectorsize": "2048",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "size": 493568.0,
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "support_discard": "0",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "type": "disk",
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:             "vendor": "QEMU"
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:         }
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]:     }
Feb 20 08:10:04 np0005625204.localdomain confident_gauss[69869]: ]
Feb 20 08:10:04 np0005625204.localdomain systemd[1]: libpod-a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d.scope: Deactivated successfully.
Feb 20 08:10:04 np0005625204.localdomain podman[69838]: 2026-02-20 08:10:04.805088646 +0000 UTC m=+1.142566965 container died a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.buildah.version=1.42.2, vcs-type=git)
Feb 20 08:10:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:10:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bf0e632f4d0b0113d16093d3f935c068270a53a091dba1bc45a4d4f299f29502-merged.mount: Deactivated successfully.
Feb 20 08:10:04 np0005625204.localdomain podman[71720]: 2026-02-20 08:10:04.875253219 +0000 UTC m=+0.064226141 container remove a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=confident_gauss, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 08:10:04 np0005625204.localdomain systemd[1]: libpod-conmon-a27262dbb66aead38ca38167a0cc4e8ab3133e39b037f7e9d316bdd91899ab9d.scope: Deactivated successfully.
Feb 20 08:10:04 np0005625204.localdomain sudo[69725]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:04 np0005625204.localdomain podman[71727]: 2026-02-20 08:10:04.928612695 +0000 UTC m=+0.094885857 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:10:04 np0005625204.localdomain podman[71727]: 2026-02-20 08:10:04.940746839 +0000 UTC m=+0.107020001 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:10:04 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:10:05 np0005625204.localdomain sudo[71770]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpozwbhnkewmjlaxdexsrqcpdvmethlo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:05 np0005625204.localdomain sudo[71770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:05 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 20 08:10:05 np0005625204.localdomain sshd[71878]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:05 np0005625204.localdomain podman[71934]: 2026-02-20 08:10:05.467506333 +0000 UTC m=+0.060081264 container create eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libpod-conmon-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3.scope.
Feb 20 08:10:05 np0005625204.localdomain podman[71953]: 2026-02-20 08:10:05.50470875 +0000 UTC m=+0.074664534 container create 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:05 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:05 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:05 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:05 np0005625204.localdomain podman[71934]: 2026-02-20 08:10:05.526320657 +0000 UTC m=+0.118895588 container init eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libpod-conmon-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.scope.
Feb 20 08:10:05 np0005625204.localdomain podman[71934]: 2026-02-20 08:10:05.438811808 +0000 UTC m=+0.031386759 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 20 08:10:05 np0005625204.localdomain podman[71934]: 2026-02-20 08:10:05.538328427 +0000 UTC m=+0.130903368 container start eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, managed_by=tripleo_ansible)
Feb 20 08:10:05 np0005625204.localdomain podman[71934]: 2026-02-20 08:10:05.538611785 +0000 UTC m=+0.131186737 container attach eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_libvirt_init_secret, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:05 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0b03ed83be81af8ca31d355d34bc84741adbeedeb0b33580fe27349115e799d7/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:05 np0005625204.localdomain podman[71935]: 2026-02-20 08:10:05.461337582 +0000 UTC m=+0.046658459 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 20 08:10:05 np0005625204.localdomain podman[71949]: 2026-02-20 08:10:05.46546459 +0000 UTC m=+0.040042486 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 08:10:05 np0005625204.localdomain podman[71953]: 2026-02-20 08:10:05.466151051 +0000 UTC m=+0.036106845 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 20 08:10:05 np0005625204.localdomain podman[71949]: 2026-02-20 08:10:05.565779154 +0000 UTC m=+0.140357050 container create 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, container_name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, release=1766032510)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:10:05 np0005625204.localdomain podman[71953]: 2026-02-20 08:10:05.572586373 +0000 UTC m=+0.142542197 container init 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, architecture=x86_64)
Feb 20 08:10:05 np0005625204.localdomain podman[71954]: 2026-02-20 08:10:05.475415876 +0000 UTC m=+0.040463488 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 20 08:10:05 np0005625204.localdomain podman[71935]: 2026-02-20 08:10:05.576526434 +0000 UTC m=+0.161847341 container create cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, container_name=ceilometer_agent_compute)
Feb 20 08:10:05 np0005625204.localdomain sudo[72023]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:10:05 np0005625204.localdomain sudo[72023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:10:05 np0005625204.localdomain podman[71954]: 2026-02-20 08:10:05.589819154 +0000 UTC m=+0.154866756 container create 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libpod-conmon-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39.scope.
Feb 20 08:10:05 np0005625204.localdomain podman[71953]: 2026-02-20 08:10:05.603092474 +0000 UTC m=+0.173048268 container start 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, version=17.1.13, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libpod-conmon-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope.
Feb 20 08:10:05 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libpod-conmon-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope.
Feb 20 08:10:05 np0005625204.localdomain sudo[72023]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:05 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e1c07b1bd08758bd14fb80cc901f6da6a3ccc5e5eba94f04ead08e95db5f3037/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:05 np0005625204.localdomain crond[72022]: (CRON) STARTUP (1.5.7)
Feb 20 08:10:05 np0005625204.localdomain podman[71949]: 2026-02-20 08:10:05.640355083 +0000 UTC m=+0.214932979 container init 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=configure_cms_options, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com)
Feb 20 08:10:05 np0005625204.localdomain crond[72022]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 30% if used.)
Feb 20 08:10:05 np0005625204.localdomain crond[72022]: (CRON) INFO (running with inotify support)
Feb 20 08:10:05 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/271fbe47d50a90f03735a26a1ff5b20e2027c13cb6e9d5c8a6a9112793cd7c92/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:05 np0005625204.localdomain podman[71949]: 2026-02-20 08:10:05.647085871 +0000 UTC m=+0.221663757 container start 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=configure_cms_options, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']})
Feb 20 08:10:05 np0005625204.localdomain podman[71949]: 2026-02-20 08:10:05.647286137 +0000 UTC m=+0.221864033 container attach 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, container_name=configure_cms_options, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, build-date=2026-01-12T22:36:40Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-type=git)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: libpod-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3.scope: Deactivated successfully.
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:10:05 np0005625204.localdomain podman[71954]: 2026-02-20 08:10:05.668862452 +0000 UTC m=+0.233910104 container init 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:10:05 np0005625204.localdomain sudo[72079]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:10:05 np0005625204.localdomain sudo[72079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 20 08:10:05 np0005625204.localdomain podman[71954]: 2026-02-20 08:10:05.683145652 +0000 UTC m=+0.248193264 container start 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1)
Feb 20 08:10:05 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ed809cd151e1fa8da7409fe229c809b7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 20 08:10:05 np0005625204.localdomain podman[71934]: 2026-02-20 08:10:05.70347175 +0000 UTC m=+0.296046711 container died eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, container_name=nova_libvirt_init_secret, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:10:05 np0005625204.localdomain podman[71935]: 2026-02-20 08:10:05.713436106 +0000 UTC m=+0.298756983 container init cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:10:05 np0005625204.localdomain sudo[72118]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:10:05 np0005625204.localdomain sudo[72118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:10:05 np0005625204.localdomain sudo[72079]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:10:05 np0005625204.localdomain podman[71935]: 2026-02-20 08:10:05.739887752 +0000 UTC m=+0.325208619 container start cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z)
Feb 20 08:10:05 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=ed809cd151e1fa8da7409fe229c809b7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 20 08:10:05 np0005625204.localdomain ovs-vsctl[72155]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Feb 20 08:10:05 np0005625204.localdomain sudo[72118]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:05 np0005625204.localdomain podman[72030]: 2026-02-20 08:10:05.824370467 +0000 UTC m=+0.217905320 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: libpod-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39.scope: Deactivated successfully.
Feb 20 08:10:05 np0005625204.localdomain podman[72030]: 2026-02-20 08:10:05.833141738 +0000 UTC m=+0.226676601 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:10:05 np0005625204.localdomain podman[71949]: 2026-02-20 08:10:05.884391138 +0000 UTC m=+0.458969024 container died 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, container_name=configure_cms_options, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 20 08:10:05 np0005625204.localdomain podman[72121]: 2026-02-20 08:10:05.8977512 +0000 UTC m=+0.158599902 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, config_id=tripleo_step4, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:10:05 np0005625204.localdomain podman[72174]: 2026-02-20 08:10:05.953246922 +0000 UTC m=+0.109311882 container cleanup 1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=configure_cms_options, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public)
Feb 20 08:10:05 np0005625204.localdomain systemd[1]: libpod-conmon-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39.scope: Deactivated successfully.
Feb 20 08:10:05 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Feb 20 08:10:05 np0005625204.localdomain podman[72066]: 2026-02-20 08:10:05.989875441 +0000 UTC m=+0.328168721 container cleanup eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z)
Feb 20 08:10:05 np0005625204.localdomain podman[72080]: 2026-02-20 08:10:05.99276393 +0000 UTC m=+0.307523594 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: libpod-conmon-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3.scope: Deactivated successfully.
Feb 20 08:10:06 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Feb 20 08:10:06 np0005625204.localdomain podman[72121]: 2026-02-20 08:10:06.03068843 +0000 UTC m=+0.291537152 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true)
Feb 20 08:10:06 np0005625204.localdomain podman[72121]: unhealthy
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed with result 'exit-code'.
Feb 20 08:10:06 np0005625204.localdomain podman[72080]: 2026-02-20 08:10:06.05569806 +0000 UTC m=+0.370457734 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:10:06 np0005625204.localdomain podman[72080]: unhealthy
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed with result 'exit-code'.
Feb 20 08:10:06 np0005625204.localdomain sshd[69813]: Invalid user x from 103.157.25.4 port 60482
Feb 20 08:10:06 np0005625204.localdomain podman[72230]: 2026-02-20 08:10:05.970031799 +0000 UTC m=+0.032033839 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:10:06 np0005625204.localdomain podman[72230]: 2026-02-20 08:10:06.110501301 +0000 UTC m=+0.172503341 container create b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 20 08:10:06 np0005625204.localdomain podman[72305]: 2026-02-20 08:10:06.135410819 +0000 UTC m=+0.055190663 container create 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=setup_ovs_manager, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4)
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: Started libpod-conmon-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope.
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: Started libpod-conmon-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope.
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:06 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:06 np0005625204.localdomain podman[72305]: 2026-02-20 08:10:06.191387935 +0000 UTC m=+0.111167779 container init 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, container_name=setup_ovs_manager, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:10:06 np0005625204.localdomain podman[72230]: 2026-02-20 08:10:06.195268875 +0000 UTC m=+0.257270935 container init b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target)
Feb 20 08:10:06 np0005625204.localdomain podman[72305]: 2026-02-20 08:10:06.201528318 +0000 UTC m=+0.121308132 container start 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=setup_ovs_manager, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 20 08:10:06 np0005625204.localdomain podman[72305]: 2026-02-20 08:10:06.201764856 +0000 UTC m=+0.121544720 container attach 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=setup_ovs_manager, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true)
Feb 20 08:10:06 np0005625204.localdomain podman[72305]: 2026-02-20 08:10:06.103547667 +0000 UTC m=+0.023327471 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 08:10:06 np0005625204.localdomain sudo[72345]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:10:06 np0005625204.localdomain sudo[72345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:10:06 np0005625204.localdomain podman[72230]: 2026-02-20 08:10:06.235608439 +0000 UTC m=+0.297610479 container start b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:10:06 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:10:06 np0005625204.localdomain sudo[72345]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:06 np0005625204.localdomain sshd[72368]: Server listening on 0.0.0.0 port 2022.
Feb 20 08:10:06 np0005625204.localdomain sshd[72368]: Server listening on :: port 2022.
Feb 20 08:10:06 np0005625204.localdomain podman[72348]: 2026-02-20 08:10:06.298853689 +0000 UTC m=+0.056854104 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:10:06 np0005625204.localdomain sshd[69813]: Received disconnect from 103.157.25.4 port 60482:11: Bye Bye [preauth]
Feb 20 08:10:06 np0005625204.localdomain sshd[69813]: Disconnected from invalid user x 103.157.25.4 port 60482 [preauth]
Feb 20 08:10:06 np0005625204.localdomain sudo[72391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:10:06 np0005625204.localdomain sudo[72391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:10:06 np0005625204.localdomain sudo[72391]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:06 np0005625204.localdomain sudo[72410]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmprbqrn23f/privsep.sock
Feb 20 08:10:06 np0005625204.localdomain sudo[72410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-33265afbb0ab1192cc35fd8be9e517c4969c8f23f7a1676738a90556ed12fe7c-merged.mount: Deactivated successfully.
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f271036dfcaa570d9f01e81828917d2c05e48286b23c86b2178860feee3ae39-userdata-shm.mount: Deactivated successfully.
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207-merged.mount: Deactivated successfully.
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eba018fa9b16677c69e1ea1738d9ee982706ef5501e9d0dc536de2ffcc7970b3-userdata-shm.mount: Deactivated successfully.
Feb 20 08:10:06 np0005625204.localdomain podman[72348]: 2026-02-20 08:10:06.653303809 +0000 UTC m=+0.411304284 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4)
Feb 20 08:10:06 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:10:07 np0005625204.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 20 08:10:07 np0005625204.localdomain sudo[72410]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:07 np0005625204.localdomain sshd[71878]: Invalid user sample from 185.246.128.171 port 21030
Feb 20 08:10:07 np0005625204.localdomain sshd[71878]: Disconnecting invalid user sample 185.246.128.171 port 21030: Change of username or service not allowed: (sample,ssh-connection) -> (office,ssh-connection) [preauth]
Feb 20 08:10:08 np0005625204.localdomain sshd[72528]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:08 np0005625204.localdomain sshd[72528]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:10:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:10:08 np0005625204.localdomain sshd[72546]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:08 np0005625204.localdomain ovs-vsctl[72547]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 20 08:10:08 np0005625204.localdomain systemd[1]: tmp-crun.H0m1wQ.mount: Deactivated successfully.
Feb 20 08:10:08 np0005625204.localdomain podman[72534]: 2026-02-20 08:10:08.949563449 +0000 UTC m=+0.131473535 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Feb 20 08:10:08 np0005625204.localdomain podman[72534]: 2026-02-20 08:10:08.987960253 +0000 UTC m=+0.169870309 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, container_name=iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:10:08 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: libpod-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope: Deactivated successfully.
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: libpod-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope: Consumed 2.827s CPU time.
Feb 20 08:10:09 np0005625204.localdomain podman[72559]: 2026-02-20 08:10:09.173525115 +0000 UTC m=+0.058391051 container died 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., container_name=setup_ovs_manager, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a-userdata-shm.mount: Deactivated successfully.
Feb 20 08:10:09 np0005625204.localdomain podman[72559]: 2026-02-20 08:10:09.209087582 +0000 UTC m=+0.093953468 container cleanup 658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', 
'/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=setup_ovs_manager, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: libpod-conmon-658ec508f9cf00f18f8882324103582314a172c3695c5879005ed1554cbd5d4a.scope: Deactivated successfully.
Feb 20 08:10:09 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1771573229 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1771573229'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Feb 20 08:10:09 np0005625204.localdomain podman[72672]: 2026-02-20 08:10:09.684043659 +0000 UTC m=+0.073002692 container create 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, 
build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started libpod-conmon-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope.
Feb 20 08:10:09 np0005625204.localdomain podman[72678]: 2026-02-20 08:10:09.728168539 +0000 UTC m=+0.103698499 container create 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:09 np0005625204.localdomain podman[72672]: 2026-02-20 08:10:09.645690106 +0000 UTC m=+0.034649199 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 08:10:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started libpod-conmon-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope.
Feb 20 08:10:09 np0005625204.localdomain podman[72678]: 2026-02-20 08:10:09.678057034 +0000 UTC m=+0.053587044 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:10:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:10:09 np0005625204.localdomain podman[72672]: 2026-02-20 08:10:09.788112138 +0000 UTC m=+0.177071211 container init 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:10:09 np0005625204.localdomain podman[72672]: 2026-02-20 08:10:09.832582659 +0000 UTC m=+0.221541712 container start 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, distribution-scope=public, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 20 08:10:09 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 20 08:10:09 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:10:09 np0005625204.localdomain podman[72678]: 2026-02-20 08:10:09.84007436 +0000 UTC m=+0.215604360 container init 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 20 08:10:09 np0005625204.localdomain sudo[72728]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:10:09 np0005625204.localdomain podman[72678]: 2026-02-20 08:10:09.88124944 +0000 UTC m=+0.256779430 container start 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Feb 20 08:10:09 np0005625204.localdomain sudo[72728]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 20 08:10:09 np0005625204.localdomain python3[71772]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=684ebb6e94768a0a31a4d8592f0686b3 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 08:10:09 np0005625204.localdomain systemd[72731]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ef9cc1375c4a3e979779fde9a22d44caa1f8d54d9be8e432ea85c98c54294ad4-merged.mount: Deactivated successfully.
Feb 20 08:10:09 np0005625204.localdomain podman[72711]: 2026-02-20 08:10:09.945286854 +0000 UTC m=+0.104797292 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:10:09 np0005625204.localdomain podman[72711]: 2026-02-20 08:10:09.962049601 +0000 UTC m=+0.121560029 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 20 08:10:09 np0005625204.localdomain sudo[72728]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:09 np0005625204.localdomain podman[72711]: unhealthy
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:10:09 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:10:10 np0005625204.localdomain podman[72729]: 2026-02-20 08:10:10.04242155 +0000 UTC m=+0.156557619 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5)
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Queued start job for default target Main User Target.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Created slice User Application Slice.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Reached target Paths.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Reached target Timers.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Starting D-Bus User Message Bus Socket...
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Starting Create User's Volatile Files and Directories...
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Finished Create User's Volatile Files and Directories.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Listening on D-Bus User Message Bus Socket.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Reached target Sockets.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Reached target Basic System.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Reached target Main User Target.
Feb 20 08:10:10 np0005625204.localdomain systemd[72731]: Startup finished in 137ms.
Feb 20 08:10:10 np0005625204.localdomain systemd[1]: Started User Manager for UID 0.
Feb 20 08:10:10 np0005625204.localdomain systemd[1]: Started Session c9 of User root.
Feb 20 08:10:10 np0005625204.localdomain podman[72729]: 2026-02-20 08:10:10.108891589 +0000 UTC m=+0.223027678 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 20 08:10:10 np0005625204.localdomain podman[72729]: unhealthy
Feb 20 08:10:10 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:10:10 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:10:10 np0005625204.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Feb 20 08:10:10 np0005625204.localdomain kernel: device br-int entered promiscuous mode
Feb 20 08:10:10 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575010.2048] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Feb 20 08:10:10 np0005625204.localdomain systemd-udevd[72825]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 08:10:10 np0005625204.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Feb 20 08:10:10 np0005625204.localdomain systemd-udevd[72830]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 08:10:10 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575010.2400] device (genev_sys_6081): carrier: link connected
Feb 20 08:10:10 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575010.2403] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Feb 20 08:10:10 np0005625204.localdomain sudo[71770]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:10 np0005625204.localdomain sudo[72847]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckxzfwskueqcaokamwxbxuawsocsvgxd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:10 np0005625204.localdomain sudo[72847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:10 np0005625204.localdomain python3[72849]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:10 np0005625204.localdomain sudo[72847]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:10 np0005625204.localdomain sudo[72863]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkqrjsjxazhufsbjkknywlolkrpevpyl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:10 np0005625204.localdomain sudo[72863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:10 np0005625204.localdomain python3[72865]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:10 np0005625204.localdomain sudo[72863]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:11 np0005625204.localdomain sudo[72879]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uctlqoyhcvonwzryjppicfqyaazhtwwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:11 np0005625204.localdomain sudo[72879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:11 np0005625204.localdomain sshd[72546]: Invalid user office from 185.246.128.171 port 59513
Feb 20 08:10:11 np0005625204.localdomain python3[72881]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:11 np0005625204.localdomain sudo[72879]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:11 np0005625204.localdomain sudo[72895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgyevmdnmfyxxaauuzumuwgrsjvfhzkd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:11 np0005625204.localdomain sudo[72895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:11 np0005625204.localdomain python3[72897]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:11 np0005625204.localdomain sudo[72895]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:11 np0005625204.localdomain sudo[72911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piqufsaymvuvnaymktyqtsvqekgkrlzu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:11 np0005625204.localdomain sudo[72911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:11 np0005625204.localdomain sudo[72915]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmp8pqogdy8/privsep.sock
Feb 20 08:10:11 np0005625204.localdomain sudo[72915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 20 08:10:11 np0005625204.localdomain python3[72913]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:11 np0005625204.localdomain sudo[72911]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:11 np0005625204.localdomain sudo[72931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huhpkfwbwcqppavvkuwhjbekbcahuihi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:11 np0005625204.localdomain sudo[72931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:11 np0005625204.localdomain sshd[72546]: Disconnecting invalid user office 185.246.128.171 port 59513: Change of username or service not allowed: (office,ssh-connection) -> (sol,ssh-connection) [preauth]
Feb 20 08:10:11 np0005625204.localdomain python3[72933]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:11 np0005625204.localdomain sudo[72931]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:12 np0005625204.localdomain sudo[72947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umkmvxmyfanywsrkiphfkhjbzltkewql ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:12 np0005625204.localdomain sudo[72947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:12 np0005625204.localdomain python3[72949]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:10:12 np0005625204.localdomain sudo[72947]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:12 np0005625204.localdomain sudo[72915]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:12 np0005625204.localdomain sudo[72964]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lotgeepslarxtlrcurprdkskgppjnrrk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:12 np0005625204.localdomain sudo[72964]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:12 np0005625204.localdomain python3[72967]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:10:12 np0005625204.localdomain sudo[72964]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:12 np0005625204.localdomain sudo[72981]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpbnljfscjbjgbnszptqgmuyqfvcgbwg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:12 np0005625204.localdomain sudo[72981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:12 np0005625204.localdomain python3[72985]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:10:12 np0005625204.localdomain sudo[72981]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:12 np0005625204.localdomain sudo[72999]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-regazdoayanbejkcpqrimwxoykgkhuli ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:12 np0005625204.localdomain sudo[72999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:12 np0005625204.localdomain python3[73001]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:10:12 np0005625204.localdomain sudo[72999]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:13 np0005625204.localdomain sudo[73015]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kftmizacemhlgwknnpbuehjfbmbmxdoi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:13 np0005625204.localdomain sudo[73015]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:13 np0005625204.localdomain python3[73017]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:10:13 np0005625204.localdomain sudo[73015]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:13 np0005625204.localdomain sudo[73031]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxrboqswoewjvsokwcugqteemvxzsqna ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:13 np0005625204.localdomain sudo[73031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:13 np0005625204.localdomain python3[73033]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:10:13 np0005625204.localdomain sudo[73031]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:13 np0005625204.localdomain sudo[73092]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpdciyqhqkfavwgmrhvaqvxodiaocfvw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:13 np0005625204.localdomain sudo[73092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:14 np0005625204.localdomain python3[73094]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:14 np0005625204.localdomain sudo[73092]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:14 np0005625204.localdomain sshd[73095]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:14 np0005625204.localdomain sudo[73123]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qldquksscfkjpnnptndzwylberwcnptl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:14 np0005625204.localdomain sudo[73123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:14 np0005625204.localdomain python3[73125]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:14 np0005625204.localdomain sudo[73123]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:14 np0005625204.localdomain sudo[73153]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sruenxmbbgtwffxdwuuivyrlyeuyujhk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:14 np0005625204.localdomain sudo[73153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:15 np0005625204.localdomain python3[73155]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:15 np0005625204.localdomain sudo[73153]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:15 np0005625204.localdomain sudo[73182]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxhsjtkydylgdxvdmapynjdkbtmnadng ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:15 np0005625204.localdomain sudo[73182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:15 np0005625204.localdomain python3[73184]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:15 np0005625204.localdomain sudo[73182]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:16 np0005625204.localdomain sudo[73211]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpanxyuelhllqyafmfeltdtfhypppxxr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:16 np0005625204.localdomain sudo[73211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:16 np0005625204.localdomain python3[73213]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:16 np0005625204.localdomain sudo[73211]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:16 np0005625204.localdomain sudo[73240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zpdzkgwyitqhxxdjdbdurjfofpmggsec ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:16 np0005625204.localdomain sudo[73240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:16 np0005625204.localdomain python3[73242]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575013.5437937-109710-275037097296287/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:16 np0005625204.localdomain sudo[73240]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:16 np0005625204.localdomain sudo[73256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwxuryxwfawuvlyojjoohpspqplvmjng ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:16 np0005625204.localdomain sudo[73256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:17 np0005625204.localdomain python3[73258]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 08:10:17 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:17 np0005625204.localdomain systemd-rc-local-generator[73279]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:17 np0005625204.localdomain systemd-sysv-generator[73284]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:17 np0005625204.localdomain sshd[73095]: Invalid user sol from 185.246.128.171 port 49469
Feb 20 08:10:17 np0005625204.localdomain sudo[73256]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:17 np0005625204.localdomain sudo[73308]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdujnvlhuslcecpvmwhrxajtagcbybup ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:17 np0005625204.localdomain sudo[73308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:18 np0005625204.localdomain python3[73310]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:19 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:19 np0005625204.localdomain sshd[73095]: Disconnecting invalid user sol 185.246.128.171 port 49469: Change of username or service not allowed: (sol,ssh-connection) -> (alan,ssh-connection) [preauth]
Feb 20 08:10:19 np0005625204.localdomain systemd-rc-local-generator[73336]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:19 np0005625204.localdomain systemd-sysv-generator[73339]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:19 np0005625204.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Feb 20 08:10:19 np0005625204.localdomain tripleo-start-podman-container[73350]: Creating additional drop-in dependency for "ceilometer_agent_compute" (cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a)
Feb 20 08:10:19 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:19 np0005625204.localdomain systemd-rc-local-generator[73407]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:19 np0005625204.localdomain systemd-sysv-generator[73411]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:19 np0005625204.localdomain systemd[1]: Started ceilometer_agent_compute container.
Feb 20 08:10:19 np0005625204.localdomain sudo[73308]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:20 np0005625204.localdomain sudo[73432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztirkccnzlvpakzyrcemxcqnlocfsrpk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:20 np0005625204.localdomain sudo[73432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Activating special unit Exit the Session...
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped target Main User Target.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped target Basic System.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped target Paths.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped target Sockets.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped target Timers.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Closed D-Bus User Message Bus Socket.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Stopped Create User's Volatile Files and Directories.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Removed slice User Application Slice.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Reached target Shutdown.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Finished Exit the Session.
Feb 20 08:10:20 np0005625204.localdomain systemd[72731]: Reached target Exit the Session.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 20 08:10:20 np0005625204.localdomain python3[73434]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:20 np0005625204.localdomain systemd-rc-local-generator[73459]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:20 np0005625204.localdomain systemd-sysv-generator[73466]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:20 np0005625204.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Feb 20 08:10:21 np0005625204.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Feb 20 08:10:21 np0005625204.localdomain sudo[73432]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:21 np0005625204.localdomain sshd[73489]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:21 np0005625204.localdomain sudo[73504]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adjbqrvuyiktcqfjzvbbzuoxmxxvyrju ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:21 np0005625204.localdomain sudo[73504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:21 np0005625204.localdomain python3[73506]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:21 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:22 np0005625204.localdomain systemd-sysv-generator[73537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:22 np0005625204.localdomain systemd-rc-local-generator[73531]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:22 np0005625204.localdomain systemd[1]: Starting logrotate_crond container...
Feb 20 08:10:22 np0005625204.localdomain systemd[1]: Started logrotate_crond container.
Feb 20 08:10:22 np0005625204.localdomain sudo[73504]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:22 np0005625204.localdomain sudo[73570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnnbcxsyvleetvrvmybgmhzccqengbvf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:22 np0005625204.localdomain sudo[73570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:22 np0005625204.localdomain python3[73572]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:23 np0005625204.localdomain systemd-rc-local-generator[73600]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:23 np0005625204.localdomain systemd-sysv-generator[73606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:23 np0005625204.localdomain sshd[73489]: Invalid user alan from 185.246.128.171 port 62506
Feb 20 08:10:23 np0005625204.localdomain systemd[1]: Starting nova_migration_target container...
Feb 20 08:10:23 np0005625204.localdomain systemd[1]: Started nova_migration_target container.
Feb 20 08:10:23 np0005625204.localdomain sudo[73570]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:23 np0005625204.localdomain sudo[73638]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbgqmxsdzaikeapvjjaohgabpyjuakam ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:23 np0005625204.localdomain sudo[73638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:24 np0005625204.localdomain sshd[73489]: Disconnecting invalid user alan 185.246.128.171 port 62506: Change of username or service not allowed: (alan,ssh-connection) -> (dbadmin,ssh-connection) [preauth]
Feb 20 08:10:24 np0005625204.localdomain python3[73640]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:24 np0005625204.localdomain systemd-sysv-generator[73671]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:24 np0005625204.localdomain systemd-rc-local-generator[73666]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:24 np0005625204.localdomain systemd[1]: Starting ovn_controller container...
Feb 20 08:10:24 np0005625204.localdomain tripleo-start-podman-container[73680]: Creating additional drop-in dependency for "ovn_controller" (0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850)
Feb 20 08:10:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:24 np0005625204.localdomain systemd-rc-local-generator[73735]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:24 np0005625204.localdomain systemd-sysv-generator[73739]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:24 np0005625204.localdomain systemd[1]: Started ovn_controller container.
Feb 20 08:10:24 np0005625204.localdomain sudo[73638]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:25 np0005625204.localdomain sudo[73762]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psmckrmongjsonjaqoqfwmrpmryuxysp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:10:25 np0005625204.localdomain sudo[73762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:25 np0005625204.localdomain python3[73764]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:10:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:10:25 np0005625204.localdomain systemd-rc-local-generator[73790]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:10:25 np0005625204.localdomain systemd-sysv-generator[73793]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:10:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:10:25 np0005625204.localdomain systemd[1]: Starting ovn_metadata_agent container...
Feb 20 08:10:26 np0005625204.localdomain systemd[1]: Started ovn_metadata_agent container.
Feb 20 08:10:26 np0005625204.localdomain sshd[73816]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:26 np0005625204.localdomain sudo[73762]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:26 np0005625204.localdomain sudo[73845]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-erafmlhngzavplnrzircexodcfqlhprf ; /usr/bin/python3
Feb 20 08:10:26 np0005625204.localdomain sudo[73845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:10:26 np0005625204.localdomain podman[73848]: 2026-02-20 08:10:26.493527701 +0000 UTC m=+0.086128388 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, url=https://www.redhat.com, release=1766032510)
Feb 20 08:10:26 np0005625204.localdomain python3[73847]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:26 np0005625204.localdomain sudo[73845]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:26 np0005625204.localdomain podman[73848]: 2026-02-20 08:10:26.691002431 +0000 UTC m=+0.283603118 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:10:26 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:10:27 np0005625204.localdomain sudo[73923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zorfdkfoktcxfejmvddkfpjpoirqrerj ; /usr/bin/python3
Feb 20 08:10:27 np0005625204.localdomain sudo[73923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:27 np0005625204.localdomain sudo[73923]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:27 np0005625204.localdomain sudo[73966]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxcweksefrafmigofjasyrovhlvglluf ; /usr/bin/python3
Feb 20 08:10:27 np0005625204.localdomain sudo[73966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:27 np0005625204.localdomain sudo[73966]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:27 np0005625204.localdomain sudo[73996]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbxcuhjemdojwbzszkqbzhfzeieosiow ; /usr/bin/python3
Feb 20 08:10:27 np0005625204.localdomain sudo[73996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:28 np0005625204.localdomain python3[73998]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005625204 step=4 update_config_hash_only=False
Feb 20 08:10:28 np0005625204.localdomain sudo[73996]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:28 np0005625204.localdomain sshd[73816]: Invalid user dbadmin from 185.246.128.171 port 51443
Feb 20 08:10:28 np0005625204.localdomain sudo[74013]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdyqyszqszngkbgyzbltgphumoixyqvr ; /usr/bin/python3
Feb 20 08:10:28 np0005625204.localdomain sudo[74013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:28 np0005625204.localdomain python3[74015]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:10:28 np0005625204.localdomain sudo[74013]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:28 np0005625204.localdomain sudo[74029]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkzejerysbrxulrladakjxlzfbajjtmd ; /usr/bin/python3
Feb 20 08:10:28 np0005625204.localdomain sudo[74029]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:10:28 np0005625204.localdomain python3[74031]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 20 08:10:28 np0005625204.localdomain sudo[74029]: pam_unix(sudo:session): session closed for user root
Feb 20 08:10:30 np0005625204.localdomain sshd[74032]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:30 np0005625204.localdomain sshd[73816]: Disconnecting invalid user dbadmin 185.246.128.171 port 51443: Change of username or service not allowed: (dbadmin,ssh-connection) -> (dqi,ssh-connection) [preauth]
Feb 20 08:10:32 np0005625204.localdomain sshd[74033]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:10:35 np0005625204.localdomain podman[74036]: 2026-02-20 08:10:35.167883002 +0000 UTC m=+0.099789859 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:10:35 np0005625204.localdomain podman[74036]: 2026-02-20 08:10:35.184203484 +0000 UTC m=+0.116110351 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 20 08:10:35 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: tmp-crun.bf5KdM.mount: Deactivated successfully.
Feb 20 08:10:36 np0005625204.localdomain podman[74057]: 2026-02-20 08:10:36.137849723 +0000 UTC m=+0.072198198 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:10:36 np0005625204.localdomain sshd[74033]: Invalid user dqi from 185.246.128.171 port 55691
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:10:36 np0005625204.localdomain podman[74057]: 2026-02-20 08:10:36.212438292 +0000 UTC m=+0.146786747 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:10:36 np0005625204.localdomain podman[74091]: 2026-02-20 08:10:36.226180816 +0000 UTC m=+0.066246743 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true)
Feb 20 08:10:36 np0005625204.localdomain podman[74056]: 2026-02-20 08:10:36.194970044 +0000 UTC m=+0.133749986 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, architecture=x86_64, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:10:36 np0005625204.localdomain podman[74056]: 2026-02-20 08:10:36.274075593 +0000 UTC m=+0.212855555 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:10:36 np0005625204.localdomain podman[74091]: 2026-02-20 08:10:36.325754647 +0000 UTC m=+0.165820574 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=)
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:10:36 np0005625204.localdomain sshd[74033]: Disconnecting invalid user dqi 185.246.128.171 port 55691: Change of username or service not allowed: (dqi,ssh-connection) -> (azure,ssh-connection) [preauth]
Feb 20 08:10:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:10:36 np0005625204.localdomain podman[74128]: 2026-02-20 08:10:36.883007511 +0000 UTC m=+0.089537442 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:10:37 np0005625204.localdomain podman[74128]: 2026-02-20 08:10:37.223713717 +0000 UTC m=+0.430243638 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, 
com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Feb 20 08:10:37 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:10:37 np0005625204.localdomain sshd[74150]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:10:39 np0005625204.localdomain podman[74152]: 2026-02-20 08:10:39.150492683 +0000 UTC m=+0.086650233 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:10:39 np0005625204.localdomain podman[74152]: 2026-02-20 08:10:39.163568467 +0000 UTC m=+0.099726027 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git)
Feb 20 08:10:39 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:10:39 np0005625204.localdomain sshd[74150]: Invalid user azure from 185.246.128.171 port 45955
Feb 20 08:10:39 np0005625204.localdomain sshd[74150]: Disconnecting invalid user azure 185.246.128.171 port 45955: Change of username or service not allowed: (azure,ssh-connection) -> (alexandra,ssh-connection) [preauth]
Feb 20 08:10:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:10:40 np0005625204.localdomain podman[74172]: 2026-02-20 08:10:40.130724291 +0000 UTC m=+0.067286536 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:36:40Z, release=1766032510)
Feb 20 08:10:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:10:40 np0005625204.localdomain podman[74172]: 2026-02-20 08:10:40.183318693 +0000 UTC m=+0.119880928 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, 
container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:10:40 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:10:40 np0005625204.localdomain podman[74192]: 2026-02-20 08:10:40.239543047 +0000 UTC m=+0.085429896 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z)
Feb 20 08:10:40 np0005625204.localdomain podman[74192]: 2026-02-20 08:10:40.288291339 +0000 UTC m=+0.134178198 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:10:40 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:10:40 np0005625204.localdomain sshd[74032]: error: kex_exchange_identification: read: Connection timed out
Feb 20 08:10:40 np0005625204.localdomain sshd[74032]: banner exchange: Connection from 115.190.172.63 port 60208: Connection timed out
Feb 20 08:10:41 np0005625204.localdomain sshd[74218]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:43 np0005625204.localdomain sshd[74218]: Invalid user alexandra from 185.246.128.171 port 18856
Feb 20 08:10:43 np0005625204.localdomain sshd[74218]: Disconnecting invalid user alexandra 185.246.128.171 port 18856: Change of username or service not allowed: (alexandra,ssh-connection) -> (USER3,ssh-connection) [preauth]
Feb 20 08:10:46 np0005625204.localdomain sshd[74220]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 08:10:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 08:10:49 np0005625204.localdomain sshd[74220]: Invalid user USER3 from 185.246.128.171 port 6353
Feb 20 08:10:50 np0005625204.localdomain sshd[74220]: Disconnecting invalid user USER3 185.246.128.171 port 6353: Change of username or service not allowed: (USER3,ssh-connection) -> (user11,ssh-connection) [preauth]
Feb 20 08:10:51 np0005625204.localdomain sshd[74222]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:54 np0005625204.localdomain sshd[74224]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:10:54 np0005625204.localdomain sshd[74224]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:10:55 np0005625204.localdomain sshd[74222]: Invalid user user11 from 185.246.128.171 port 62795
Feb 20 08:10:55 np0005625204.localdomain sshd[74222]: Disconnecting invalid user user11 185.246.128.171 port 62795: Change of username or service not allowed: (user11,ssh-connection) -> (syncthing,ssh-connection) [preauth]
Feb 20 08:10:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:10:57 np0005625204.localdomain podman[74226]: 2026-02-20 08:10:57.146133586 +0000 UTC m=+0.079565865 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vendor=Red Hat, Inc., container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:10:57 np0005625204.localdomain podman[74226]: 2026-02-20 08:10:57.365277733 +0000 UTC m=+0.298710032 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=metrics_qdr, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:10:57 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:10:57 np0005625204.localdomain sshd[74256]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:04 np0005625204.localdomain sshd[74256]: Invalid user syncthing from 185.246.128.171 port 1376
Feb 20 08:11:05 np0005625204.localdomain sshd[74256]: Disconnecting invalid user syncthing 185.246.128.171 port 1376: Change of username or service not allowed: (syncthing,ssh-connection) -> (uploader,ssh-connection) [preauth]
Feb 20 08:11:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:11:05 np0005625204.localdomain podman[74258]: 2026-02-20 08:11:05.73431542 +0000 UTC m=+0.076618734 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z)
Feb 20 08:11:05 np0005625204.localdomain podman[74258]: 2026-02-20 08:11:05.741834211 +0000 UTC m=+0.084137525 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, distribution-scope=public, container_name=collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:11:05 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:11:06 np0005625204.localdomain sudo[74280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:11:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:11:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:11:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:11:06 np0005625204.localdomain sudo[74280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:11:06 np0005625204.localdomain sudo[74280]: pam_unix(sudo:session): session closed for user root
Feb 20 08:11:06 np0005625204.localdomain sudo[74313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:11:06 np0005625204.localdomain sudo[74313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:11:06 np0005625204.localdomain podman[74296]: 2026-02-20 08:11:06.810746974 +0000 UTC m=+0.095744503 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64)
Feb 20 08:11:06 np0005625204.localdomain podman[74295]: 2026-02-20 08:11:06.86866277 +0000 UTC m=+0.153808944 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:11:06 np0005625204.localdomain podman[74296]: 2026-02-20 08:11:06.895141476 +0000 UTC m=+0.180138985 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, release=1766032510, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:11:06 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:11:06 np0005625204.localdomain podman[74295]: 2026-02-20 08:11:06.914146223 +0000 UTC m=+0.199292397 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:11:06 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:11:06 np0005625204.localdomain podman[74294]: 2026-02-20 08:11:06.929229647 +0000 UTC m=+0.214732012 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 20 08:11:06 np0005625204.localdomain podman[74294]: 2026-02-20 08:11:06.958018775 +0000 UTC m=+0.243521180 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4)
Feb 20 08:11:06 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:11:07 np0005625204.localdomain sudo[74313]: pam_unix(sudo:session): session closed for user root
Feb 20 08:11:07 np0005625204.localdomain sshd[74414]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:11:08 np0005625204.localdomain systemd[1]: tmp-crun.VDjnuj.mount: Deactivated successfully.
Feb 20 08:11:08 np0005625204.localdomain podman[74415]: 2026-02-20 08:11:08.16288251 +0000 UTC m=+0.098263672 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:11:08 np0005625204.localdomain sudo[74427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:11:08 np0005625204.localdomain sudo[74427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:11:08 np0005625204.localdomain sudo[74427]: pam_unix(sudo:session): session closed for user root
Feb 20 08:11:08 np0005625204.localdomain podman[74415]: 2026-02-20 08:11:08.530215448 +0000 UTC m=+0.465596550 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:11:08 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:11:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:11:10 np0005625204.localdomain podman[74453]: 2026-02-20 08:11:10.132693554 +0000 UTC m=+0.075949633 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git)
Feb 20 08:11:10 np0005625204.localdomain podman[74453]: 2026-02-20 08:11:10.143617821 +0000 UTC m=+0.086873880 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, url=https://www.redhat.com, 
tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:11:10 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:11:10 np0005625204.localdomain sshd[74414]: Invalid user uploader from 185.246.128.171 port 44235
Feb 20 08:11:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:11:10 np0005625204.localdomain podman[74473]: 2026-02-20 08:11:10.315228222 +0000 UTC m=+0.070312499 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, architecture=x86_64, 
url=https://www.redhat.com, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ovn_controller, batch=17.1_20260112.1)
Feb 20 08:11:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:11:10 np0005625204.localdomain podman[74494]: 2026-02-20 08:11:10.405855487 +0000 UTC m=+0.072593500 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:11:10 np0005625204.localdomain podman[74473]: 2026-02-20 08:11:10.4163222 +0000 UTC m=+0.171406437 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 20 08:11:10 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:11:10 np0005625204.localdomain podman[74494]: 2026-02-20 08:11:10.453125774 +0000 UTC m=+0.119863797 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public)
Feb 20 08:11:10 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:11:11 np0005625204.localdomain sshd[74414]: Disconnecting invalid user uploader 185.246.128.171 port 44235: Change of username or service not allowed: (uploader,ssh-connection) -> (teste,ssh-connection) [preauth]
Feb 20 08:11:13 np0005625204.localdomain sshd[74522]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:15 np0005625204.localdomain sshd[74522]: Invalid user teste from 185.246.128.171 port 38146
Feb 20 08:11:16 np0005625204.localdomain sshd[74522]: Disconnecting invalid user teste 185.246.128.171 port 38146: Change of username or service not allowed: (teste,ssh-connection) -> (Nova,ssh-connection) [preauth]
Feb 20 08:11:19 np0005625204.localdomain sshd[74524]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:21 np0005625204.localdomain sshd[74526]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:22 np0005625204.localdomain sshd[74524]: Invalid user Nova from 185.246.128.171 port 48988
Feb 20 08:11:22 np0005625204.localdomain sshd[74524]: Disconnecting invalid user Nova 185.246.128.171 port 48988: Change of username or service not allowed: (Nova,ssh-connection) -> (fran,ssh-connection) [preauth]
Feb 20 08:11:22 np0005625204.localdomain sshd[74526]: Received disconnect from 101.36.109.176 port 58624:11: Bye Bye [preauth]
Feb 20 08:11:22 np0005625204.localdomain sshd[74526]: Disconnected from authenticating user root 101.36.109.176 port 58624 [preauth]
Feb 20 08:11:23 np0005625204.localdomain sshd[74528]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:26 np0005625204.localdomain sshd[74528]: Invalid user fran from 185.246.128.171 port 23931
Feb 20 08:11:26 np0005625204.localdomain sshd[74528]: Disconnecting invalid user fran 185.246.128.171 port 23931: Change of username or service not allowed: (fran,ssh-connection) -> (vendas,ssh-connection) [preauth]
Feb 20 08:11:27 np0005625204.localdomain sshd[74530]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:11:28 np0005625204.localdomain podman[74531]: 2026-02-20 08:11:28.159384502 +0000 UTC m=+0.093876727 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:11:28 np0005625204.localdomain podman[74531]: 2026-02-20 08:11:28.392333825 +0000 UTC m=+0.326826080 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, 
com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:11:28 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:11:30 np0005625204.localdomain sshd[74530]: Invalid user vendas from 185.246.128.171 port 6270
Feb 20 08:11:30 np0005625204.localdomain sshd[74530]: Disconnecting invalid user vendas 185.246.128.171 port 6270: Change of username or service not allowed: (vendas,ssh-connection) -> (lucy,ssh-connection) [preauth]
Feb 20 08:11:32 np0005625204.localdomain sshd[74562]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:11:36 np0005625204.localdomain podman[74564]: 2026-02-20 08:11:36.144505808 +0000 UTC m=+0.084074633 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 20 08:11:36 np0005625204.localdomain podman[74564]: 2026-02-20 08:11:36.153413233 +0000 UTC m=+0.092982068 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, 
release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:11:36 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:11:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:11:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:11:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:11:37 np0005625204.localdomain podman[74587]: 2026-02-20 08:11:37.150565722 +0000 UTC m=+0.078355527 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:11:37 np0005625204.localdomain sshd[74562]: Invalid user lucy from 185.246.128.171 port 61767
Feb 20 08:11:37 np0005625204.localdomain podman[74585]: 2026-02-20 08:11:37.205761464 +0000 UTC m=+0.140582846 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T23:07:30Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 20 08:11:37 np0005625204.localdomain podman[74586]: 2026-02-20 08:11:37.26171497 +0000 UTC m=+0.191739854 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container)
Feb 20 08:11:37 np0005625204.localdomain podman[74585]: 2026-02-20 08:11:37.262496964 +0000 UTC m=+0.197318366 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, 
release=1766032510, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 20 08:11:37 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:11:37 np0005625204.localdomain podman[74587]: 2026-02-20 08:11:37.28316029 +0000 UTC m=+0.210950165 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:11:37 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:11:37 np0005625204.localdomain podman[74586]: 2026-02-20 08:11:37.341153149 +0000 UTC m=+0.271178033 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, 
cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron)
Feb 20 08:11:37 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:11:38 np0005625204.localdomain systemd[1]: tmp-crun.VHnWWz.mount: Deactivated successfully.
Feb 20 08:11:38 np0005625204.localdomain sshd[74562]: Disconnecting invalid user lucy 185.246.128.171 port 61767: Change of username or service not allowed: (lucy,ssh-connection) -> (cq,ssh-connection) [preauth]
Feb 20 08:11:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:11:39 np0005625204.localdomain podman[74656]: 2026-02-20 08:11:39.135553103 +0000 UTC m=+0.074944162 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack 
TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5)
Feb 20 08:11:39 np0005625204.localdomain sshd[74679]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:39 np0005625204.localdomain podman[74656]: 2026-02-20 08:11:39.511349362 +0000 UTC m=+0.450740421 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:11:39 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:11:39 np0005625204.localdomain sshd[74679]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:11:39 np0005625204.localdomain sshd[74681]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:11:41 np0005625204.localdomain podman[74685]: 2026-02-20 08:11:41.133669279 +0000 UTC m=+0.073455166 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible)
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: tmp-crun.oe8bfj.mount: Deactivated successfully.
Feb 20 08:11:41 np0005625204.localdomain podman[74685]: 2026-02-20 08:11:41.183984231 +0000 UTC m=+0.123770078 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:11:41 np0005625204.localdomain podman[74684]: 2026-02-20 08:11:41.234735876 +0000 UTC m=+0.171678155 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64)
Feb 20 08:11:41 np0005625204.localdomain podman[74683]: 2026-02-20 08:11:41.18623322 +0000 UTC m=+0.128256186 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:11:41 np0005625204.localdomain podman[74684]: 2026-02-20 08:11:41.243951559 +0000 UTC m=+0.180893868 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1)
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:11:41 np0005625204.localdomain podman[74683]: 2026-02-20 08:11:41.264296998 +0000 UTC m=+0.206319964 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:11:41 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:11:43 np0005625204.localdomain sshd[74681]: Invalid user cq from 185.246.128.171 port 8908
Feb 20 08:11:43 np0005625204.localdomain sshd[74681]: Disconnecting invalid user cq 185.246.128.171 port 8908: Change of username or service not allowed: (cq,ssh-connection) -> (odoo,ssh-connection) [preauth]
Feb 20 08:11:45 np0005625204.localdomain sshd[74743]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:47 np0005625204.localdomain sshd[74743]: Invalid user odoo from 185.246.128.171 port 3405
Feb 20 08:11:48 np0005625204.localdomain sshd[74743]: Disconnecting invalid user odoo 185.246.128.171 port 3405: Change of username or service not allowed: (odoo,ssh-connection) -> (osmc,ssh-connection) [preauth]
Feb 20 08:11:49 np0005625204.localdomain sshd[74745]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:52 np0005625204.localdomain sshd[74745]: Invalid user osmc from 185.246.128.171 port 54264
Feb 20 08:11:53 np0005625204.localdomain sshd[74745]: Disconnecting invalid user osmc 185.246.128.171 port 54264: Change of username or service not allowed: (osmc,ssh-connection) -> (postgres,ssh-connection) [preauth]
Feb 20 08:11:54 np0005625204.localdomain sshd[74747]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:55 np0005625204.localdomain sshd[74747]: Invalid user postgres from 185.246.128.171 port 38649
Feb 20 08:11:58 np0005625204.localdomain sshd[74747]: error: maximum authentication attempts exceeded for invalid user postgres from 185.246.128.171 port 38649 ssh2 [preauth]
Feb 20 08:11:58 np0005625204.localdomain sshd[74747]: Disconnecting invalid user postgres 185.246.128.171 port 38649: Too many authentication failures [preauth]
Feb 20 08:11:58 np0005625204.localdomain sshd[74749]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:58 np0005625204.localdomain sshd[74751]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:11:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:11:59 np0005625204.localdomain systemd[1]: tmp-crun.kwtMEV.mount: Deactivated successfully.
Feb 20 08:11:59 np0005625204.localdomain podman[74752]: 2026-02-20 08:11:59.15370739 +0000 UTC m=+0.092399109 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 20 08:11:59 np0005625204.localdomain podman[74752]: 2026-02-20 08:11:59.373131115 +0000 UTC m=+0.311822804 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, batch=17.1_20260112.1)
Feb 20 08:11:59 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:11:59 np0005625204.localdomain sshd[74749]: Received disconnect from 178.217.173.50 port 34274:11: Bye Bye [preauth]
Feb 20 08:11:59 np0005625204.localdomain sshd[74749]: Disconnected from authenticating user root 178.217.173.50 port 34274 [preauth]
Feb 20 08:12:00 np0005625204.localdomain sshd[74751]: Invalid user postgres from 185.246.128.171 port 26400
Feb 20 08:12:01 np0005625204.localdomain sshd[74751]: Disconnecting invalid user postgres 185.246.128.171 port 26400: Change of username or service not allowed: (postgres,ssh-connection) -> (sipv,ssh-connection) [preauth]
Feb 20 08:12:02 np0005625204.localdomain sshd[74782]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:12:07 np0005625204.localdomain podman[74784]: 2026-02-20 08:12:07.157343387 +0000 UTC m=+0.090371208 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:15Z, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:12:07 np0005625204.localdomain podman[74784]: 2026-02-20 08:12:07.17315224 +0000 UTC m=+0.106179971 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:12:07 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:12:07 np0005625204.localdomain sshd[74782]: Invalid user sipv from 185.246.128.171 port 9191
Feb 20 08:12:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:12:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:12:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:12:07 np0005625204.localdomain podman[74805]: 2026-02-20 08:12:07.978539961 +0000 UTC m=+0.082179757 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:12:07 np0005625204.localdomain podman[74805]: 2026-02-20 08:12:07.988235267 +0000 UTC m=+0.091875073 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack 
Platform 17.1 cron, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:12:08 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:12:08 np0005625204.localdomain podman[74804]: 2026-02-20 08:12:08.036416432 +0000 UTC m=+0.142176122 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:12:08 np0005625204.localdomain podman[74804]: 2026-02-20 08:12:08.090096805 +0000 UTC m=+0.195856475 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T23:07:30Z, release=1766032510)
Feb 20 08:12:08 np0005625204.localdomain podman[74806]: 2026-02-20 08:12:08.099307847 +0000 UTC m=+0.193224205 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:12:08 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:12:08 np0005625204.localdomain podman[74806]: 2026-02-20 08:12:08.151202015 +0000 UTC m=+0.245118363 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-type=git, 
vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:12:08 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:12:08 np0005625204.localdomain sudo[74874]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:12:08 np0005625204.localdomain sudo[74874]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:12:08 np0005625204.localdomain sudo[74874]: pam_unix(sudo:session): session closed for user root
Feb 20 08:12:08 np0005625204.localdomain sudo[74889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 08:12:08 np0005625204.localdomain sudo[74889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:12:09 np0005625204.localdomain sshd[74782]: Disconnecting invalid user sipv 185.246.128.171 port 9191: Change of username or service not allowed: (sipv,ssh-connection) -> (healthnet,ssh-connection) [preauth]
Feb 20 08:12:09 np0005625204.localdomain podman[74976]: 2026-02-20 08:12:09.139333009 +0000 UTC m=+0.086484988 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True)
Feb 20 08:12:09 np0005625204.localdomain podman[74976]: 2026-02-20 08:12:09.281354866 +0000 UTC m=+0.228506885 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, vcs-type=git, architecture=x86_64, ceph=True, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7)
Feb 20 08:12:09 np0005625204.localdomain sudo[74889]: pam_unix(sudo:session): session closed for user root
Feb 20 08:12:09 np0005625204.localdomain sudo[75041]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:12:09 np0005625204.localdomain sudo[75041]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:12:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:12:09 np0005625204.localdomain sudo[75041]: pam_unix(sudo:session): session closed for user root
Feb 20 08:12:09 np0005625204.localdomain sudo[75057]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:12:09 np0005625204.localdomain sudo[75057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:12:09 np0005625204.localdomain podman[75056]: 2026-02-20 08:12:09.772356474 +0000 UTC m=+0.092525883 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 20 08:12:10 np0005625204.localdomain podman[75056]: 2026-02-20 08:12:10.146123433 +0000 UTC m=+0.466292812 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:12:10 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:12:10 np0005625204.localdomain sudo[75057]: pam_unix(sudo:session): session closed for user root
Feb 20 08:12:10 np0005625204.localdomain sshd[75125]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:11 np0005625204.localdomain sudo[75126]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:12:11 np0005625204.localdomain sudo[75126]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:12:11 np0005625204.localdomain sudo[75126]: pam_unix(sudo:session): session closed for user root
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: tmp-crun.Hh4NOE.mount: Deactivated successfully.
Feb 20 08:12:12 np0005625204.localdomain podman[75142]: 2026-02-20 08:12:12.150708098 +0000 UTC m=+0.091343707 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:12:12 np0005625204.localdomain podman[75142]: 2026-02-20 08:12:12.181997695 +0000 UTC m=+0.122633384 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=)
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:12:12 np0005625204.localdomain podman[75143]: 2026-02-20 08:12:12.202212194 +0000 UTC m=+0.143238525 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public)
Feb 20 08:12:12 np0005625204.localdomain podman[75144]: 2026-02-20 08:12:12.248518942 +0000 UTC m=+0.185160698 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 20 08:12:12 np0005625204.localdomain podman[75143]: 2026-02-20 08:12:12.253061251 +0000 UTC m=+0.194087572 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:12:12 np0005625204.localdomain podman[75144]: 2026-02-20 08:12:12.293950542 +0000 UTC m=+0.230592208 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, release=1766032510, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:12:12 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:12:13 np0005625204.localdomain sshd[75125]: Invalid user healthnet from 185.246.128.171 port 29161
Feb 20 08:12:14 np0005625204.localdomain sshd[75125]: Disconnecting invalid user healthnet 185.246.128.171 port 29161: Change of username or service not allowed: (healthnet,ssh-connection) -> (Samuel,ssh-connection) [preauth]
Feb 20 08:12:15 np0005625204.localdomain sshd[75208]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:17 np0005625204.localdomain sshd[75208]: Invalid user Samuel from 185.246.128.171 port 21481
Feb 20 08:12:18 np0005625204.localdomain sshd[75208]: Disconnecting invalid user Samuel 185.246.128.171 port 21481: Change of username or service not allowed: (Samuel,ssh-connection) -> (anon,ssh-connection) [preauth]
Feb 20 08:12:19 np0005625204.localdomain sshd[75210]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:21 np0005625204.localdomain sshd[75210]: Invalid user anon from 185.246.128.171 port 1664
Feb 20 08:12:22 np0005625204.localdomain sshd[75210]: Disconnecting invalid user anon 185.246.128.171 port 1664: Change of username or service not allowed: (anon,ssh-connection) -> (system,ssh-connection) [preauth]
Feb 20 08:12:24 np0005625204.localdomain sshd[75212]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:25 np0005625204.localdomain sshd[75214]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:25 np0005625204.localdomain sshd[75214]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:12:25 np0005625204.localdomain sshd[75212]: Invalid user system from 185.246.128.171 port 2402
Feb 20 08:12:27 np0005625204.localdomain sshd[75212]: error: maximum authentication attempts exceeded for invalid user system from 185.246.128.171 port 2402 ssh2 [preauth]
Feb 20 08:12:27 np0005625204.localdomain sshd[75212]: Disconnecting invalid user system 185.246.128.171 port 2402: Too many authentication failures [preauth]
Feb 20 08:12:27 np0005625204.localdomain sshd[75216]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:12:30 np0005625204.localdomain podman[75218]: 2026-02-20 08:12:30.120358624 +0000 UTC m=+0.068507028 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, distribution-scope=public, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr)
Feb 20 08:12:30 np0005625204.localdomain podman[75218]: 2026-02-20 08:12:30.349458556 +0000 UTC m=+0.297607020 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 20 08:12:30 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:12:31 np0005625204.localdomain sshd[75216]: Invalid user system from 185.246.128.171 port 42873
Feb 20 08:12:31 np0005625204.localdomain sshd[75216]: Disconnecting invalid user system 185.246.128.171 port 42873: Change of username or service not allowed: (system,ssh-connection) -> (denis,ssh-connection) [preauth]
Feb 20 08:12:32 np0005625204.localdomain sshd[75249]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:34 np0005625204.localdomain sshd[75249]: Invalid user denis from 185.246.128.171 port 39559
Feb 20 08:12:34 np0005625204.localdomain sshd[75249]: Disconnecting invalid user denis 185.246.128.171 port 39559: Change of username or service not allowed: (denis,ssh-connection) -> (giovanni,ssh-connection) [preauth]
Feb 20 08:12:34 np0005625204.localdomain sshd[75251]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:12:38 np0005625204.localdomain podman[75253]: 2026-02-20 08:12:38.139021411 +0000 UTC m=+0.080779134 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, 
url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:12:38 np0005625204.localdomain podman[75253]: 2026-02-20 08:12:38.158139196 +0000 UTC m=+0.099896959 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, release=1766032510)
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:12:38 np0005625204.localdomain podman[75292]: 2026-02-20 08:12:38.292738116 +0000 UTC m=+0.092909075 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:12:38 np0005625204.localdomain podman[75254]: 2026-02-20 08:12:38.250899195 +0000 UTC m=+0.188094618 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:12:38 np0005625204.localdomain podman[75282]: 2026-02-20 08:12:38.311292883 +0000 UTC m=+0.146443453 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:12:38 np0005625204.localdomain podman[75292]: 2026-02-20 08:12:38.322038662 +0000 UTC m=+0.122209611 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:12:38 np0005625204.localdomain podman[75254]: 2026-02-20 08:12:38.331030257 +0000 UTC m=+0.268225710 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, release=1766032510, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:12:38 np0005625204.localdomain podman[75282]: 2026-02-20 08:12:38.373174607 +0000 UTC m=+0.208325127 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1)
Feb 20 08:12:38 np0005625204.localdomain sshd[75251]: Invalid user giovanni from 185.246.128.171 port 63465
Feb 20 08:12:38 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:12:40 np0005625204.localdomain sshd[75251]: Disconnecting invalid user giovanni 185.246.128.171 port 63465: Change of username or service not allowed: (giovanni,ssh-connection) -> (Admin,ssh-connection) [preauth]
Feb 20 08:12:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:12:41 np0005625204.localdomain podman[75345]: 2026-02-20 08:12:41.15172971 +0000 UTC m=+0.088859240 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, 
maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4)
Feb 20 08:12:41 np0005625204.localdomain podman[75345]: 2026-02-20 08:12:41.548996109 +0000 UTC m=+0.486125619 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Feb 20 08:12:41 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:12:41 np0005625204.localdomain sshd[75368]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:12:43 np0005625204.localdomain podman[75370]: 2026-02-20 08:12:43.139084767 +0000 UTC m=+0.076272746 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, version=17.1.13, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, 
build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc.)
Feb 20 08:12:43 np0005625204.localdomain podman[75370]: 2026-02-20 08:12:43.166193746 +0000 UTC m=+0.103381715 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:12:43 np0005625204.localdomain podman[75372]: 2026-02-20 08:12:43.258819121 +0000 UTC m=+0.185603962 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: tmp-crun.CA4bBr.mount: Deactivated successfully.
Feb 20 08:12:43 np0005625204.localdomain podman[75371]: 2026-02-20 08:12:43.310239935 +0000 UTC m=+0.241180793 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:12:43 np0005625204.localdomain podman[75371]: 2026-02-20 08:12:43.320821299 +0000 UTC m=+0.251762107 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:12:43 np0005625204.localdomain podman[75372]: 2026-02-20 08:12:43.361622967 +0000 UTC m=+0.288407808 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:12:43 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:12:44 np0005625204.localdomain sshd[75368]: Invalid user Admin from 185.246.128.171 port 24533
Feb 20 08:12:46 np0005625204.localdomain sshd[75368]: error: maximum authentication attempts exceeded for invalid user Admin from 185.246.128.171 port 24533 ssh2 [preauth]
Feb 20 08:12:46 np0005625204.localdomain sshd[75368]: Disconnecting invalid user Admin 185.246.128.171 port 24533: Too many authentication failures [preauth]
Feb 20 08:12:49 np0005625204.localdomain sshd[75436]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:52 np0005625204.localdomain sshd[75436]: Invalid user Admin from 185.246.128.171 port 12699
Feb 20 08:12:54 np0005625204.localdomain sshd[75436]: Disconnecting invalid user Admin 185.246.128.171 port 12699: Change of username or service not allowed: (Admin,ssh-connection) -> (gitlab-runner,ssh-connection) [preauth]
Feb 20 08:12:56 np0005625204.localdomain sshd[75438]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:12:57 np0005625204.localdomain sshd[75438]: Invalid user gitlab-runner from 185.246.128.171 port 53314
Feb 20 08:12:59 np0005625204.localdomain sshd[75438]: Disconnecting invalid user gitlab-runner 185.246.128.171 port 53314: Change of username or service not allowed: (gitlab-runner,ssh-connection) -> (cloudera,ssh-connectio [preauth]
Feb 20 08:13:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:13:01 np0005625204.localdomain podman[75440]: 2026-02-20 08:13:01.154353108 +0000 UTC m=+0.087194481 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, 
io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step1, distribution-scope=public)
Feb 20 08:13:01 np0005625204.localdomain sshd[75470]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:01 np0005625204.localdomain podman[75440]: 2026-02-20 08:13:01.316246892 +0000 UTC m=+0.249088265 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:13:01 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:13:03 np0005625204.localdomain sshd[75470]: Invalid user cloudera from 185.246.128.171 port 56423
Feb 20 08:13:04 np0005625204.localdomain sshd[75470]: Disconnecting invalid user cloudera 185.246.128.171 port 56423: Change of username or service not allowed: (cloudera,ssh-connection) -> (123456,ssh-connection) [preauth]
Feb 20 08:13:05 np0005625204.localdomain sshd[75472]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:08 np0005625204.localdomain sshd[75472]: Invalid user 123456 from 185.246.128.171 port 2012
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: tmp-crun.EWKxml.mount: Deactivated successfully.
Feb 20 08:13:09 np0005625204.localdomain podman[75474]: 2026-02-20 08:13:09.127552973 +0000 UTC m=+0.068829438 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 20 08:13:09 np0005625204.localdomain podman[75476]: 2026-02-20 08:13:09.142873192 +0000 UTC m=+0.075324447 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 20 08:13:09 np0005625204.localdomain podman[75474]: 2026-02-20 08:13:09.148930857 +0000 UTC m=+0.090207322 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:13:09 np0005625204.localdomain podman[75482]: 2026-02-20 08:13:09.190751307 +0000 UTC m=+0.121026325 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc.)
Feb 20 08:13:09 np0005625204.localdomain podman[75476]: 2026-02-20 08:13:09.200067473 +0000 UTC m=+0.132518748 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:10:15Z, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_id=tripleo_step3, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:13:09 np0005625204.localdomain podman[75482]: 2026-02-20 08:13:09.218010632 +0000 UTC m=+0.148285660 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:13:09 np0005625204.localdomain podman[75475]: 2026-02-20 08:13:09.171275661 +0000 UTC m=+0.109072069 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc.)
Feb 20 08:13:09 np0005625204.localdomain podman[75475]: 2026-02-20 08:13:09.304990624 +0000 UTC m=+0.242787042 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.13, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=)
Feb 20 08:13:09 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:13:10 np0005625204.localdomain sshd[75472]: Disconnecting invalid user 123456 185.246.128.171 port 2012: Change of username or service not allowed: (123456,ssh-connection) -> (adsl,ssh-connection) [preauth]
Feb 20 08:13:11 np0005625204.localdomain sudo[75565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:13:11 np0005625204.localdomain sudo[75565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:13:11 np0005625204.localdomain sudo[75565]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:11 np0005625204.localdomain sudo[75580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:13:11 np0005625204.localdomain sudo[75580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:13:11 np0005625204.localdomain sudo[75580]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:13:12 np0005625204.localdomain podman[75627]: 2026-02-20 08:13:12.144581775 +0000 UTC m=+0.082170456 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, release=1766032510)
Feb 20 08:13:12 np0005625204.localdomain sshd[75647]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:12 np0005625204.localdomain sshd[75649]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:12 np0005625204.localdomain sshd[75647]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:13:12 np0005625204.localdomain podman[75627]: 2026-02-20 08:13:12.535451668 +0000 UTC m=+0.473040309 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team)
Feb 20 08:13:12 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:13:12 np0005625204.localdomain sudo[75652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:13:12 np0005625204.localdomain sudo[75652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:13:12 np0005625204.localdomain sudo[75652]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:14 np0005625204.localdomain sshd[75649]: Invalid user adsl from 185.246.128.171 port 39804
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: tmp-crun.X29CHf.mount: Deactivated successfully.
Feb 20 08:13:14 np0005625204.localdomain podman[75668]: 2026-02-20 08:13:14.15245425 +0000 UTC m=+0.096865276 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: tmp-crun.4eQ0EK.mount: Deactivated successfully.
Feb 20 08:13:14 np0005625204.localdomain podman[75669]: 2026-02-20 08:13:14.18805948 +0000 UTC m=+0.132860547 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid)
Feb 20 08:13:14 np0005625204.localdomain podman[75668]: 2026-02-20 08:13:14.199120848 +0000 UTC m=+0.143531854 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:13:14 np0005625204.localdomain podman[75670]: 2026-02-20 08:13:14.245918991 +0000 UTC m=+0.184194539 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:13:14 np0005625204.localdomain podman[75669]: 2026-02-20 08:13:14.271918037 +0000 UTC m=+0.216719134 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:13:14 np0005625204.localdomain sshd[75649]: Disconnecting invalid user adsl 185.246.128.171 port 39804: Change of username or service not allowed: (adsl,ssh-connection) -> (fusion,ssh-connection) [preauth]
Feb 20 08:13:14 np0005625204.localdomain podman[75670]: 2026-02-20 08:13:14.31614356 +0000 UTC m=+0.254419088 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, release=1766032510)
Feb 20 08:13:14 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:13:15 np0005625204.localdomain sshd[75732]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:16 np0005625204.localdomain sudo[75779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmblrookejrbzebrsyslmolwdtjloxga ; /usr/bin/python3
Feb 20 08:13:16 np0005625204.localdomain sudo[75779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:16 np0005625204.localdomain python3[75781]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:16 np0005625204.localdomain sudo[75779]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:16 np0005625204.localdomain sudo[75824]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cthzbupjwtgddcwiwgsjtmuzczjrtfqj ; /usr/bin/python3
Feb 20 08:13:16 np0005625204.localdomain sudo[75824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:16 np0005625204.localdomain python3[75826]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575196.2527015-114047-222902698684906/source _original_basename=tmptlhqj4sy follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:16 np0005625204.localdomain sudo[75824]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:17 np0005625204.localdomain sudo[75854]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svnmqiukrpnfucihauqohcywmxfrszey ; /usr/bin/python3
Feb 20 08:13:17 np0005625204.localdomain sudo[75854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:17 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:13:17 np0005625204.localdomain recover_tripleo_nova_virtqemud[75858]: 63005
Feb 20 08:13:17 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:13:17 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:13:17 np0005625204.localdomain python3[75856]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:13:17 np0005625204.localdomain sudo[75854]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:18 np0005625204.localdomain sudo[75906]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otahcakydfgsxdpzngzckhrfydapcyme ; /usr/bin/python3
Feb 20 08:13:18 np0005625204.localdomain sudo[75906]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:18 np0005625204.localdomain sudo[75906]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:18 np0005625204.localdomain sudo[75924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xnzlikktozvlghaegageokenvriidxpq ; /usr/bin/python3
Feb 20 08:13:18 np0005625204.localdomain sudo[75924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:18 np0005625204.localdomain sudo[75924]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:18 np0005625204.localdomain sshd[75732]: Invalid user fusion from 185.246.128.171 port 27797
Feb 20 08:13:19 np0005625204.localdomain sudo[76028]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbeorxksfyerbhbhwlkgyotmncbbweil ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575199.0307937-114221-39317160651077/async_wrapper.py 129177955356 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575199.0307937-114221-39317160651077/AnsiballZ_command.py _
Feb 20 08:13:19 np0005625204.localdomain sudo[76028]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 08:13:19 np0005625204.localdomain sshd[75732]: Disconnecting invalid user fusion 185.246.128.171 port 27797: Change of username or service not allowed: (fusion,ssh-connection) -> (centos,ssh-connection) [preauth]
Feb 20 08:13:19 np0005625204.localdomain ansible-async_wrapper.py[76030]: Invoked with 129177955356 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575199.0307937-114221-39317160651077/AnsiballZ_command.py _
Feb 20 08:13:19 np0005625204.localdomain ansible-async_wrapper.py[76033]: Starting module and watcher
Feb 20 08:13:19 np0005625204.localdomain ansible-async_wrapper.py[76033]: Start watching 76034 (3600)
Feb 20 08:13:19 np0005625204.localdomain ansible-async_wrapper.py[76034]: Start module (76034)
Feb 20 08:13:19 np0005625204.localdomain ansible-async_wrapper.py[76030]: Return async_wrapper task started.
Feb 20 08:13:19 np0005625204.localdomain sudo[76028]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:19 np0005625204.localdomain sudo[76052]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxkgaspfzwqznnofjowpbgvgownzhkct ; /usr/bin/python3
Feb 20 08:13:19 np0005625204.localdomain sudo[76052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:19 np0005625204.localdomain python3[76054]: ansible-ansible.legacy.async_status Invoked with jid=129177955356.76030 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:13:19 np0005625204.localdomain sudo[76052]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:21 np0005625204.localdomain sshd[76074]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (file: /etc/puppet/hiera.yaml)
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: Undefined variable '::deploy_config_name';
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (file & line not available)
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (file & line not available)
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 20 08:13:23 np0005625204.localdomain sshd[76074]: Invalid user centos from 185.246.128.171 port 58153
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 20 08:13:23 np0005625204.localdomain puppet-user[76051]: Notice: Compiled catalog for np0005625204.localdomain in environment production in 0.26 seconds
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Notice: Applied catalog in 0.28 seconds
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Application:
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:    Initial environment: production
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:    Converged environment: production
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:          Run mode: user
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Changes:
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Events:
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Resources:
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:             Total: 19
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Time:
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:          Schedule: 0.00
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:           Package: 0.00
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:              Exec: 0.01
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:            Augeas: 0.01
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:              File: 0.03
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:           Service: 0.08
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:    Transaction evaluation: 0.27
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:    Catalog application: 0.28
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:    Config retrieval: 0.33
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:          Last run: 1771575204
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:        Filebucket: 0.00
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:             Total: 0.28
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]: Version:
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:            Config: 1771575203
Feb 20 08:13:24 np0005625204.localdomain puppet-user[76051]:            Puppet: 7.10.0
Feb 20 08:13:24 np0005625204.localdomain ansible-async_wrapper.py[76034]: Module complete (76034)
Feb 20 08:13:24 np0005625204.localdomain ansible-async_wrapper.py[76033]: Done in kid B.
Feb 20 08:13:24 np0005625204.localdomain sshd[76074]: Disconnecting invalid user centos 185.246.128.171 port 58153: Change of username or service not allowed: (centos,ssh-connection) -> (sysadmin,ssh-connection) [preauth]
Feb 20 08:13:26 np0005625204.localdomain sshd[76179]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:27 np0005625204.localdomain sshd[76179]: Invalid user sysadmin from 185.246.128.171 port 12487
Feb 20 08:13:28 np0005625204.localdomain sshd[76179]: Disconnecting invalid user sysadmin 185.246.128.171 port 12487: Change of username or service not allowed: (sysadmin,ssh-connection) -> (dmdba,ssh-connection) [preauth]
Feb 20 08:13:28 np0005625204.localdomain sshd[76181]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:30 np0005625204.localdomain sudo[76196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brkdwskgkcjasjtmeqfcttftujcsgizg ; /usr/bin/python3
Feb 20 08:13:30 np0005625204.localdomain sudo[76196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:30 np0005625204.localdomain python3[76198]: ansible-ansible.legacy.async_status Invoked with jid=129177955356.76030 mode=status _async_dir=/tmp/.ansible_async
Feb 20 08:13:30 np0005625204.localdomain sudo[76196]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:30 np0005625204.localdomain sshd[76181]: Invalid user dmdba from 185.246.128.171 port 50441
Feb 20 08:13:30 np0005625204.localdomain sshd[76181]: Disconnecting invalid user dmdba 185.246.128.171 port 50441: Change of username or service not allowed: (dmdba,ssh-connection) -> (cirros,ssh-connection) [preauth]
Feb 20 08:13:30 np0005625204.localdomain sudo[76212]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohyjschrezsonpwcgwsxwytnvonueeoz ; /usr/bin/python3
Feb 20 08:13:30 np0005625204.localdomain sudo[76212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:30 np0005625204.localdomain sshd[76215]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:30 np0005625204.localdomain python3[76214]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 08:13:30 np0005625204.localdomain sudo[76212]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:31 np0005625204.localdomain sudo[76230]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-reaarmvimdlhqryxvjrxydzlzzizfkuk ; /usr/bin/python3
Feb 20 08:13:31 np0005625204.localdomain sudo[76230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:31 np0005625204.localdomain python3[76232]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:13:31 np0005625204.localdomain sudo[76230]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:31 np0005625204.localdomain sudo[76280]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idrlisqdgbdrojbkgrliiufjhfzmtydd ; /usr/bin/python3
Feb 20 08:13:31 np0005625204.localdomain sudo[76280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:13:31 np0005625204.localdomain podman[76283]: 2026-02-20 08:13:31.686492894 +0000 UTC m=+0.091545062 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, container_name=metrics_qdr, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public)
Feb 20 08:13:31 np0005625204.localdomain python3[76282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:31 np0005625204.localdomain sudo[76280]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:31 np0005625204.localdomain sshd[76215]: Invalid user cirros from 185.246.128.171 port 12845
Feb 20 08:13:31 np0005625204.localdomain sudo[76326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hudulrluwlndkwehmkaavwaqqbdxgbbv ; /usr/bin/python3
Feb 20 08:13:31 np0005625204.localdomain sudo[76326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:31 np0005625204.localdomain podman[76283]: 2026-02-20 08:13:31.903584869 +0000 UTC m=+0.308637007 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team)
Feb 20 08:13:31 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:13:32 np0005625204.localdomain python3[76328]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpgolb4ma6 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 08:13:32 np0005625204.localdomain sudo[76326]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:32 np0005625204.localdomain sshd[76215]: Disconnecting invalid user cirros 185.246.128.171 port 12845: Change of username or service not allowed: (cirros,ssh-connection) -> (device,ssh-connection) [preauth]
Feb 20 08:13:32 np0005625204.localdomain sudo[76356]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htracdtnqorzjbowhyofzfnwccwkmzmy ; /usr/bin/python3
Feb 20 08:13:32 np0005625204.localdomain sudo[76356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:32 np0005625204.localdomain python3[76358]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:32 np0005625204.localdomain sudo[76356]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:32 np0005625204.localdomain sshd[76359]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:32 np0005625204.localdomain sudo[76373]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wezpqfffivrbkzkcuolsbdatilduywss ; /usr/bin/python3
Feb 20 08:13:32 np0005625204.localdomain sudo[76373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:34 np0005625204.localdomain sudo[76373]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:34 np0005625204.localdomain sudo[76463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwstowtxhlcdzujvqgenrxesnktzklsz ; /usr/bin/python3
Feb 20 08:13:34 np0005625204.localdomain sudo[76463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:34 np0005625204.localdomain python3[76465]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 20 08:13:34 np0005625204.localdomain sudo[76463]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:35 np0005625204.localdomain sudo[76482]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atzzdauyntmvjtzqdqracghjceezurjr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:35 np0005625204.localdomain sudo[76482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:35 np0005625204.localdomain sshd[76359]: Invalid user device from 185.246.128.171 port 45178
Feb 20 08:13:35 np0005625204.localdomain python3[76484]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:35 np0005625204.localdomain sudo[76482]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:35 np0005625204.localdomain sshd[76359]: Disconnecting invalid user device 185.246.128.171 port 45178: Change of username or service not allowed: (device,ssh-connection) -> (nc,ssh-connection) [preauth]
Feb 20 08:13:35 np0005625204.localdomain sudo[76498]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riccnxoqfibmsnwttpnexobvpuzbflxh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:35 np0005625204.localdomain sudo[76498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:35 np0005625204.localdomain sudo[76498]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:35 np0005625204.localdomain sudo[76514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyazpqwuqzzfjdnrexcuznnyjdyrlfdq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:35 np0005625204.localdomain sudo[76514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:36 np0005625204.localdomain sshd[76517]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:36 np0005625204.localdomain python3[76516]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:13:36 np0005625204.localdomain sudo[76514]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:36 np0005625204.localdomain sudo[76566]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtxcciujgydiukyfizwtubloeldtzkpq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:36 np0005625204.localdomain sudo[76566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:36 np0005625204.localdomain python3[76568]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:36 np0005625204.localdomain sudo[76566]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:36 np0005625204.localdomain sudo[76584]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fblxdargppaquvqxhlyovfpjnruluury ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:36 np0005625204.localdomain sudo[76584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:36 np0005625204.localdomain python3[76586]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:36 np0005625204.localdomain sudo[76584]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:37 np0005625204.localdomain sudo[76646]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeekcxsunahysmxpnvxzuddptmjpnlse ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:37 np0005625204.localdomain sudo[76646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:37 np0005625204.localdomain python3[76648]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:37 np0005625204.localdomain sudo[76646]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:37 np0005625204.localdomain sudo[76664]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ninipuyckdwbrurxhlwezxltcbenjkqb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:37 np0005625204.localdomain sudo[76664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:37 np0005625204.localdomain python3[76666]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:37 np0005625204.localdomain sudo[76664]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:38 np0005625204.localdomain sudo[76726]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qydegljdfsdbvzbsbjejfknbkvworbrj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:38 np0005625204.localdomain sudo[76726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:38 np0005625204.localdomain sshd[76517]: Invalid user nc from 185.246.128.171 port 41696
Feb 20 08:13:38 np0005625204.localdomain python3[76728]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:38 np0005625204.localdomain sudo[76726]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:38 np0005625204.localdomain sudo[76744]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwyyjrmfmroydnrsrffwdtmjgbljqyqo ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:38 np0005625204.localdomain sudo[76744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:38 np0005625204.localdomain sshd[76517]: Disconnecting invalid user nc 185.246.128.171 port 41696: Change of username or service not allowed: (nc,ssh-connection) -> (nodemanager,ssh-connection) [preauth]
Feb 20 08:13:38 np0005625204.localdomain python3[76746]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:38 np0005625204.localdomain sudo[76744]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:38 np0005625204.localdomain sshd[76807]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:38 np0005625204.localdomain sudo[76806]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysviyudkamcpyocqkubfiksdatdyxqwd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:38 np0005625204.localdomain sudo[76806]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:39 np0005625204.localdomain python3[76809]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:39 np0005625204.localdomain sudo[76806]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:39 np0005625204.localdomain sudo[76826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvhmgqhjqfcvdeehpryerqluhprxbqwz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:13:39 np0005625204.localdomain sudo[76826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:39 np0005625204.localdomain podman[76828]: 2026-02-20 08:13:39.317242167 +0000 UTC m=+0.072386305 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, vcs-type=git, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 20 08:13:39 np0005625204.localdomain podman[76828]: 2026-02-20 08:13:39.326568154 +0000 UTC m=+0.081712312 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13)
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:13:39 np0005625204.localdomain python3[76829]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: tmp-crun.yZ36nV.mount: Deactivated successfully.
Feb 20 08:13:39 np0005625204.localdomain podman[76856]: 2026-02-20 08:13:39.424296485 +0000 UTC m=+0.077645088 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510)
Feb 20 08:13:39 np0005625204.localdomain sudo[76826]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:39 np0005625204.localdomain podman[76849]: 2026-02-20 08:13:39.467845507 +0000 UTC m=+0.130147594 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step4, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, architecture=x86_64, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:13:39 np0005625204.localdomain podman[76856]: 2026-02-20 08:13:39.488967464 +0000 UTC m=+0.142316017 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=logrotate_crond, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:13:39 np0005625204.localdomain podman[76849]: 2026-02-20 08:13:39.522183381 +0000 UTC m=+0.184485498 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, vcs-type=git)
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:13:39 np0005625204.localdomain podman[76850]: 2026-02-20 08:13:39.546537366 +0000 UTC m=+0.199448966 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 20 08:13:39 np0005625204.localdomain podman[76850]: 2026-02-20 08:13:39.578078161 +0000 UTC m=+0.230989751 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:13:39 np0005625204.localdomain sudo[76949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpujakszxxocbleapxfkezmcdmvsjgxa ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:39 np0005625204.localdomain sudo[76949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:39 np0005625204.localdomain python3[76951]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:13:39 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:13:40 np0005625204.localdomain systemd-rc-local-generator[76972]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:13:40 np0005625204.localdomain systemd-sysv-generator[76977]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:13:40 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:13:40 np0005625204.localdomain systemd[1]: tmp-crun.acIsiJ.mount: Deactivated successfully.
Feb 20 08:13:40 np0005625204.localdomain sshd[76807]: Invalid user nodemanager from 185.246.128.171 port 38168
Feb 20 08:13:40 np0005625204.localdomain sudo[76949]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:40 np0005625204.localdomain sudo[77035]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-raizycyjlkcfkvlhdlzmuegbfsqbofya ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:40 np0005625204.localdomain sudo[77035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:40 np0005625204.localdomain sshd[76807]: Disconnecting invalid user nodemanager 185.246.128.171 port 38168: Change of username or service not allowed: (nodemanager,ssh-connection) -> (sepehr,ssh-connection) [preauth]
Feb 20 08:13:40 np0005625204.localdomain python3[77037]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:40 np0005625204.localdomain sudo[77035]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:40 np0005625204.localdomain sudo[77053]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmxtwaobfsrazrqznuoruxlfvqibdstq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:40 np0005625204.localdomain sudo[77053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:41 np0005625204.localdomain python3[77055]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:41 np0005625204.localdomain sudo[77053]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:41 np0005625204.localdomain sudo[77115]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sojxthzzigzicslvkckirodwxaahsqad ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:41 np0005625204.localdomain sudo[77115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:41 np0005625204.localdomain python3[77117]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 20 08:13:41 np0005625204.localdomain sudo[77115]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:41 np0005625204.localdomain sudo[77133]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icivwzadyehbbsieazsfqvorkkonckkf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:41 np0005625204.localdomain sudo[77133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:41 np0005625204.localdomain sshd[77136]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:41 np0005625204.localdomain python3[77135]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:13:41 np0005625204.localdomain sudo[77133]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:42 np0005625204.localdomain sudo[77164]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yauqiwzfvizbubqoxdtotgaweuyvrrev ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:42 np0005625204.localdomain sudo[77164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:42 np0005625204.localdomain python3[77166]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:13:42 np0005625204.localdomain systemd-rc-local-generator[77188]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:13:42 np0005625204.localdomain systemd-sysv-generator[77194]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 08:13:42 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 08:13:42 np0005625204.localdomain podman[77205]: 2026-02-20 08:13:42.801568862 +0000 UTC m=+0.077774821 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, tcib_managed=true, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 20 08:13:42 np0005625204.localdomain sudo[77164]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:43 np0005625204.localdomain sudo[77246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghqtzcbfeupoomrrtptehlouzyerengz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:43 np0005625204.localdomain sudo[77246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:43 np0005625204.localdomain podman[77205]: 2026-02-20 08:13:43.188446683 +0000 UTC m=+0.464652722 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:13:43 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:13:43 np0005625204.localdomain python3[77248]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 20 08:13:43 np0005625204.localdomain sudo[77246]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:43 np0005625204.localdomain sshd[77136]: Invalid user sepehr from 185.246.128.171 port 37669
Feb 20 08:13:43 np0005625204.localdomain sudo[77262]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unwfuwkniwnfdyxlxeeprttbrbqxeywi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:43 np0005625204.localdomain sudo[77262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:43 np0005625204.localdomain sshd[77136]: Disconnecting invalid user sepehr 185.246.128.171 port 37669: Change of username or service not allowed: (sepehr,ssh-connection) -> (jumpserver,ssh-connection) [preauth]
Feb 20 08:13:44 np0005625204.localdomain sudo[77262]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:13:45 np0005625204.localdomain podman[77291]: 2026-02-20 08:13:45.149848715 +0000 UTC m=+0.085906869 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1)
Feb 20 08:13:45 np0005625204.localdomain sudo[77345]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnfltqhvoyxgtzofhzwwiffzmdphlzbg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:13:45 np0005625204.localdomain sudo[77345]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:13:45 np0005625204.localdomain podman[77292]: 2026-02-20 08:13:45.209919314 +0000 UTC m=+0.143482192 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, 
tcib_managed=true, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:13:45 np0005625204.localdomain podman[77292]: 2026-02-20 08:13:45.218095725 +0000 UTC m=+0.151658603 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:13:45 np0005625204.localdomain podman[77291]: 2026-02-20 08:13:45.227059039 +0000 UTC m=+0.163117203 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller)
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:13:45 np0005625204.localdomain podman[77293]: 2026-02-20 08:13:45.308903524 +0000 UTC m=+0.239784470 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:13:45 np0005625204.localdomain podman[77293]: 2026-02-20 08:13:45.359077619 +0000 UTC m=+0.289958605 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:13:45 np0005625204.localdomain python3[77355]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 20 08:13:45 np0005625204.localdomain sshd[77398]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:45 np0005625204.localdomain podman[77410]: 2026-02-20 08:13:45.708846465 +0000 UTC m=+0.089003125 container create a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started libpod-conmon-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope.
Feb 20 08:13:45 np0005625204.localdomain podman[77410]: 2026-02-20 08:13:45.664789106 +0000 UTC m=+0.044945796 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:13:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:45 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:13:45 np0005625204.localdomain podman[77410]: 2026-02-20 08:13:45.820200373 +0000 UTC m=+0.200357063 container init a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', 
'/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:13:45 np0005625204.localdomain sudo[77431]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:13:45 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:13:45 np0005625204.localdomain podman[77410]: 2026-02-20 08:13:45.868554903 +0000 UTC m=+0.248711553 container start a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step5, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:13:45 np0005625204.localdomain python3[77355]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 20 08:13:45 np0005625204.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 20 08:13:45 np0005625204.localdomain systemd[77446]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:13:45 np0005625204.localdomain podman[77432]: 2026-02-20 08:13:45.96223974 +0000 UTC m=+0.087887651 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:13:46 np0005625204.localdomain podman[77432]: 2026-02-20 08:13:46.010452406 +0000 UTC m=+0.136100317 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:13:46 np0005625204.localdomain podman[77432]: unhealthy
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Queued start job for default target Main User Target.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Created slice User Application Slice.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Reached target Paths.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Reached target Timers.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Starting D-Bus User Message Bus Socket...
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Starting Create User's Volatile Files and Directories...
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Listening on D-Bus User Message Bus Socket.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Reached target Sockets.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Finished Create User's Volatile Files and Directories.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Reached target Basic System.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Reached target Main User Target.
Feb 20 08:13:46 np0005625204.localdomain systemd[77446]: Startup finished in 139ms.
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: Started User Manager for UID 0.
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: Started Session c10 of User root.
Feb 20 08:13:46 np0005625204.localdomain sudo[77431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 20 08:13:46 np0005625204.localdomain sudo[77431]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Feb 20 08:13:46 np0005625204.localdomain podman[77535]: 2026-02-20 08:13:46.37880894 +0000 UTC m=+0.085534498 container create 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: Started libpod-conmon-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2.scope.
Feb 20 08:13:46 np0005625204.localdomain podman[77535]: 2026-02-20 08:13:46.336084703 +0000 UTC m=+0.042810311 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:13:46 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:13:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 08:13:46 np0005625204.localdomain podman[77535]: 2026-02-20 08:13:46.48631994 +0000 UTC m=+0.193045478 container init 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_wait_for_compute_service, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, distribution-scope=public, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible)
Feb 20 08:13:46 np0005625204.localdomain podman[77535]: 2026-02-20 08:13:46.498442852 +0000 UTC m=+0.205168380 container start 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Feb 20 08:13:46 np0005625204.localdomain podman[77535]: 2026-02-20 08:13:46.498991469 +0000 UTC m=+0.205717047 container attach 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_wait_for_compute_service, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5)
Feb 20 08:13:46 np0005625204.localdomain sudo[77555]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 08:13:46 np0005625204.localdomain sudo[77555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 20 08:13:46 np0005625204.localdomain sudo[77555]: pam_unix(sudo:session): session closed for user root
Feb 20 08:13:47 np0005625204.localdomain sshd[77398]: Invalid user jumpserver from 185.246.128.171 port 31391
Feb 20 08:13:47 np0005625204.localdomain sshd[77398]: Disconnecting invalid user jumpserver 185.246.128.171 port 31391: Change of username or service not allowed: (jumpserver,ssh-connection) -> (vyos,ssh-connection) [preauth]
Feb 20 08:13:48 np0005625204.localdomain sshd[77559]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:50 np0005625204.localdomain sshd[77560]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:52 np0005625204.localdomain sshd[77560]: Invalid user vyos from 185.246.128.171 port 12639
Feb 20 08:13:53 np0005625204.localdomain sshd[77560]: Disconnecting invalid user vyos 185.246.128.171 port 12639: Change of username or service not allowed: (vyos,ssh-connection) -> (matrix,ssh-connection) [preauth]
Feb 20 08:13:54 np0005625204.localdomain sshd[77562]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:56 np0005625204.localdomain sshd[77562]: Invalid user matrix from 185.246.128.171 port 25127
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Activating special unit Exit the Session...
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped target Main User Target.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped target Basic System.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped target Paths.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped target Sockets.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped target Timers.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Closed D-Bus User Message Bus Socket.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Stopped Create User's Volatile Files and Directories.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Removed slice User Application Slice.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Reached target Shutdown.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Finished Exit the Session.
Feb 20 08:13:56 np0005625204.localdomain systemd[77446]: Reached target Exit the Session.
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 20 08:13:56 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 20 08:13:56 np0005625204.localdomain sshd[77562]: Disconnecting invalid user matrix 185.246.128.171 port 25127: Change of username or service not allowed: (matrix,ssh-connection) -> (openvswitch,ssh-connection) [preauth]
Feb 20 08:13:58 np0005625204.localdomain sshd[77565]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:59 np0005625204.localdomain sshd[77559]: error: kex_exchange_identification: read: Connection timed out
Feb 20 08:13:59 np0005625204.localdomain sshd[77559]: banner exchange: Connection from 115.190.172.63 port 36706: Connection timed out
Feb 20 08:13:59 np0005625204.localdomain sshd[77567]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:13:59 np0005625204.localdomain sshd[77567]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:14:01 np0005625204.localdomain sshd[77565]: Disconnecting authenticating user openvswitch 185.246.128.171 port 55004: Change of username or service not allowed: (openvswitch,ssh-connection) -> (loginuser,ssh-connection) [preauth]
Feb 20 08:14:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:14:02 np0005625204.localdomain podman[77569]: 2026-02-20 08:14:02.16243874 +0000 UTC m=+0.097870806 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64)
Feb 20 08:14:02 np0005625204.localdomain podman[77569]: 2026-02-20 08:14:02.367561299 +0000 UTC m=+0.302993325 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64)
Feb 20 08:14:02 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:14:02 np0005625204.localdomain sshd[77598]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:03 np0005625204.localdomain sshd[77598]: Invalid user loginuser from 185.246.128.171 port 8879
Feb 20 08:14:04 np0005625204.localdomain sshd[77598]: Disconnecting invalid user loginuser 185.246.128.171 port 8879: Change of username or service not allowed: (loginuser,ssh-connection) -> (zomboid,ssh-connection) [preauth]
Feb 20 08:14:06 np0005625204.localdomain sshd[77600]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:07 np0005625204.localdomain sshd[77600]: Invalid user zomboid from 185.246.128.171 port 27185
Feb 20 08:14:08 np0005625204.localdomain sshd[77600]: Disconnecting invalid user zomboid 185.246.128.171 port 27185: Change of username or service not allowed: (zomboid,ssh-connection) -> (marco,ssh-connection) [preauth]
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:14:10 np0005625204.localdomain podman[77602]: 2026-02-20 08:14:10.146969004 +0000 UTC m=+0.087020355 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:14:10 np0005625204.localdomain podman[77603]: 2026-02-20 08:14:10.200104179 +0000 UTC m=+0.137041224 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:14:10 np0005625204.localdomain podman[77603]: 2026-02-20 08:14:10.236147223 +0000 UTC m=+0.173084278 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true)
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:14:10 np0005625204.localdomain podman[77604]: 2026-02-20 08:14:10.256053622 +0000 UTC m=+0.190985666 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:14:10 np0005625204.localdomain podman[77604]: 2026-02-20 08:14:10.269962348 +0000 UTC m=+0.204894392 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, summary=Red 
Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:14:10 np0005625204.localdomain podman[77605]: 2026-02-20 08:14:10.312848951 +0000 UTC m=+0.242412091 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64)
Feb 20 08:14:10 np0005625204.localdomain podman[77602]: 2026-02-20 08:14:10.330118729 +0000 UTC m=+0.270170070 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git)
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:14:10 np0005625204.localdomain podman[77605]: 2026-02-20 08:14:10.372204658 +0000 UTC m=+0.301767788 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Feb 20 08:14:10 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:14:10 np0005625204.localdomain sshd[77691]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:11 np0005625204.localdomain systemd[1]: tmp-crun.egEuxt.mount: Deactivated successfully.
Feb 20 08:14:12 np0005625204.localdomain sudo[77693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:14:12 np0005625204.localdomain sudo[77693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:14:12 np0005625204.localdomain sudo[77693]: pam_unix(sudo:session): session closed for user root
Feb 20 08:14:12 np0005625204.localdomain sudo[77708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:14:12 np0005625204.localdomain sudo[77708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:14:13 np0005625204.localdomain sudo[77708]: pam_unix(sudo:session): session closed for user root
Feb 20 08:14:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:14:14 np0005625204.localdomain systemd[1]: tmp-crun.CUanUo.mount: Deactivated successfully.
Feb 20 08:14:14 np0005625204.localdomain podman[77754]: 2026-02-20 08:14:14.186628325 +0000 UTC m=+0.093408551 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:14:14 np0005625204.localdomain sudo[77770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:14:14 np0005625204.localdomain sudo[77770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:14:14 np0005625204.localdomain sudo[77770]: pam_unix(sudo:session): session closed for user root
Feb 20 08:14:14 np0005625204.localdomain sshd[77691]: Invalid user marco from 185.246.128.171 port 4180
Feb 20 08:14:14 np0005625204.localdomain podman[77754]: 2026-02-20 08:14:14.59425018 +0000 UTC m=+0.501030396 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc.)
Feb 20 08:14:14 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:14:15 np0005625204.localdomain sshd[77691]: Disconnecting invalid user marco 185.246.128.171 port 4180: Change of username or service not allowed: (marco,ssh-connection) -> (dspace,ssh-connection) [preauth]
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: tmp-crun.A4LGtX.mount: Deactivated successfully.
Feb 20 08:14:15 np0005625204.localdomain podman[77794]: 2026-02-20 08:14:15.481091294 +0000 UTC m=+0.090960476 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 20 08:14:15 np0005625204.localdomain podman[77792]: 2026-02-20 08:14:15.444806803 +0000 UTC m=+0.066963660 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, 
build-date=2026-01-12T22:36:40Z, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 08:14:15 np0005625204.localdomain podman[77793]: 2026-02-20 08:14:15.516436245 +0000 UTC m=+0.135975212 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, 
config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=)
Feb 20 08:14:15 np0005625204.localdomain podman[77793]: 2026-02-20 08:14:15.531999792 +0000 UTC m=+0.151538779 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:14:15 np0005625204.localdomain sshd[77856]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:15 np0005625204.localdomain podman[77792]: 2026-02-20 08:14:15.582407575 +0000 UTC m=+0.204564472 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team)
Feb 20 08:14:15 np0005625204.localdomain podman[77794]: 2026-02-20 08:14:15.619268083 +0000 UTC m=+0.229137275 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:14:15 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:14:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:14:16 np0005625204.localdomain podman[77857]: 2026-02-20 08:14:16.132187521 +0000 UTC m=+0.066953450 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.buildah.version=1.41.5, container_name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1)
Feb 20 08:14:16 np0005625204.localdomain podman[77857]: 2026-02-20 08:14:16.192170447 +0000 UTC m=+0.126936426 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:14:16 np0005625204.localdomain podman[77857]: unhealthy
Feb 20 08:14:16 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:14:16 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:14:17 np0005625204.localdomain sshd[77856]: Invalid user dspace from 185.246.128.171 port 6283
Feb 20 08:14:17 np0005625204.localdomain sshd[77856]: Disconnecting invalid user dspace 185.246.128.171 port 6283: Change of username or service not allowed: (dspace,ssh-connection) -> (orangepi,ssh-connection) [preauth]
Feb 20 08:14:18 np0005625204.localdomain sshd[77880]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:20 np0005625204.localdomain sshd[77880]: Invalid user orangepi from 185.246.128.171 port 58587
Feb 20 08:14:20 np0005625204.localdomain sshd[77880]: Disconnecting invalid user orangepi 185.246.128.171 port 58587: Change of username or service not allowed: (orangepi,ssh-connection) -> (es2,ssh-connection) [preauth]
Feb 20 08:14:22 np0005625204.localdomain sshd[77882]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:24 np0005625204.localdomain sshd[77882]: Invalid user es2 from 185.246.128.171 port 20671
Feb 20 08:14:25 np0005625204.localdomain sshd[77882]: Disconnecting invalid user es2 185.246.128.171 port 20671: Change of username or service not allowed: (es2,ssh-connection) -> (sshadmin,ssh-connection) [preauth]
Feb 20 08:14:27 np0005625204.localdomain sshd[77884]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:30 np0005625204.localdomain sshd[77884]: Invalid user sshadmin from 185.246.128.171 port 29200
Feb 20 08:14:30 np0005625204.localdomain sshd[77884]: Disconnecting invalid user sshadmin 185.246.128.171 port 29200: Change of username or service not allowed: (sshadmin,ssh-connection) -> (debian,ssh-connection) [preauth]
Feb 20 08:14:31 np0005625204.localdomain sshd[77886]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:14:33 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:14:33 np0005625204.localdomain recover_tripleo_nova_virtqemud[77895]: 63005
Feb 20 08:14:33 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:14:33 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:14:33 np0005625204.localdomain podman[77888]: 2026-02-20 08:14:33.163429035 +0000 UTC m=+0.099967811 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z)
Feb 20 08:14:33 np0005625204.localdomain podman[77888]: 2026-02-20 08:14:33.356936468 +0000 UTC m=+0.293475194 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, container_name=metrics_qdr, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step1)
Feb 20 08:14:33 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:14:34 np0005625204.localdomain sshd[77886]: Invalid user debian from 185.246.128.171 port 63136
Feb 20 08:14:38 np0005625204.localdomain sshd[77886]: error: maximum authentication attempts exceeded for invalid user debian from 185.246.128.171 port 63136 ssh2 [preauth]
Feb 20 08:14:38 np0005625204.localdomain sshd[77886]: Disconnecting invalid user debian 185.246.128.171 port 63136: Too many authentication failures [preauth]
Feb 20 08:14:39 np0005625204.localdomain sshd[77920]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:40 np0005625204.localdomain sshd[77920]: Invalid user debian from 185.246.128.171 port 61705
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: tmp-crun.Vl5qvP.mount: Deactivated successfully.
Feb 20 08:14:41 np0005625204.localdomain podman[77922]: 2026-02-20 08:14:41.109596254 +0000 UTC m=+0.093350658 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: tmp-crun.pOAWx1.mount: Deactivated successfully.
Feb 20 08:14:41 np0005625204.localdomain podman[77926]: 2026-02-20 08:14:41.167600139 +0000 UTC m=+0.142150322 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible)
Feb 20 08:14:41 np0005625204.localdomain podman[77926]: 2026-02-20 08:14:41.197923608 +0000 UTC m=+0.172473771 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:14:41 np0005625204.localdomain podman[77924]: 2026-02-20 08:14:41.211779262 +0000 UTC m=+0.191701219 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, container_name=collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container)
Feb 20 08:14:41 np0005625204.localdomain podman[77922]: 2026-02-20 08:14:41.220383075 +0000 UTC m=+0.204137439 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, 
url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:14:41 np0005625204.localdomain podman[77924]: 2026-02-20 08:14:41.275195112 +0000 UTC m=+0.255117079 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:14:41 np0005625204.localdomain podman[77923]: 2026-02-20 08:14:41.310902565 +0000 UTC m=+0.290892574 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git)
Feb 20 08:14:41 np0005625204.localdomain podman[77923]: 2026-02-20 08:14:41.325153291 +0000 UTC m=+0.305143300 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com)
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:14:41 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:14:41 np0005625204.localdomain sshd[77920]: Disconnecting invalid user debian 185.246.128.171 port 61705: Change of username or service not allowed: (debian,ssh-connection) -> (iman,ssh-connection) [preauth]
Feb 20 08:14:42 np0005625204.localdomain systemd[1]: tmp-crun.ar2SZw.mount: Deactivated successfully.
Feb 20 08:14:42 np0005625204.localdomain sshd[78013]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:43 np0005625204.localdomain sshd[78015]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:44 np0005625204.localdomain sshd[78013]: Invalid user iman from 185.246.128.171 port 36653
Feb 20 08:14:44 np0005625204.localdomain sshd[78013]: Disconnecting invalid user iman 185.246.128.171 port 36653: Change of username or service not allowed: (iman,ssh-connection) -> (telegram,ssh-connection) [preauth]
Feb 20 08:14:44 np0005625204.localdomain sshd[78015]: Invalid user justine from 101.36.109.176 port 41292
Feb 20 08:14:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:14:44 np0005625204.localdomain podman[78017]: 2026-02-20 08:14:44.894302152 +0000 UTC m=+0.079726400 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, 
build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, container_name=nova_migration_target)
Feb 20 08:14:45 np0005625204.localdomain sshd[78015]: Received disconnect from 101.36.109.176 port 41292:11: Bye Bye [preauth]
Feb 20 08:14:45 np0005625204.localdomain sshd[78015]: Disconnected from invalid user justine 101.36.109.176 port 41292 [preauth]
Feb 20 08:14:45 np0005625204.localdomain sshd[78041]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:45 np0005625204.localdomain podman[78017]: 2026-02-20 08:14:45.271298901 +0000 UTC m=+0.456723109 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:14:45 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:14:46 np0005625204.localdomain podman[78042]: 2026-02-20 08:14:46.142142636 +0000 UTC m=+0.080668651 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:14:46 np0005625204.localdomain podman[78042]: 2026-02-20 08:14:46.171113042 +0000 UTC m=+0.109639087 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13)
Feb 20 08:14:46 np0005625204.localdomain podman[78043]: 2026-02-20 08:14:46.196151188 +0000 UTC m=+0.131772144 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:14:46 np0005625204.localdomain podman[78043]: 2026-02-20 08:14:46.235945766 +0000 UTC m=+0.171566672 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, architecture=x86_64, vcs-type=git, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:14:46 np0005625204.localdomain podman[78044]: 2026-02-20 08:14:46.248916313 +0000 UTC m=+0.181463125 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:14:46 np0005625204.localdomain podman[78044]: 2026-02-20 08:14:46.29259616 +0000 UTC m=+0.225142952 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4)
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:14:46 np0005625204.localdomain podman[78104]: 2026-02-20 08:14:46.340226848 +0000 UTC m=+0.079764602 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 20 08:14:46 np0005625204.localdomain podman[78104]: 2026-02-20 08:14:46.417623557 +0000 UTC m=+0.157161261 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:14:46 np0005625204.localdomain podman[78104]: unhealthy
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:14:46 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:14:46 np0005625204.localdomain sshd[78133]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:46 np0005625204.localdomain sshd[78133]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:14:48 np0005625204.localdomain sshd[78041]: Invalid user telegram from 185.246.128.171 port 42607
Feb 20 08:14:48 np0005625204.localdomain sshd[78041]: Disconnecting invalid user telegram 185.246.128.171 port 42607: Change of username or service not allowed: (telegram,ssh-connection) -> (client,ssh-connection) [preauth]
Feb 20 08:14:49 np0005625204.localdomain sshd[78135]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:51 np0005625204.localdomain sshd[78135]: Invalid user client from 185.246.128.171 port 57242
Feb 20 08:14:52 np0005625204.localdomain sshd[78135]: Disconnecting invalid user client 185.246.128.171 port 57242: Change of username or service not allowed: (client,ssh-connection) -> (app,ssh-connection) [preauth]
Feb 20 08:14:52 np0005625204.localdomain sshd[78137]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:53 np0005625204.localdomain sshd[78137]: Invalid user app from 185.246.128.171 port 17627
Feb 20 08:14:54 np0005625204.localdomain sshd[78137]: Disconnecting invalid user app 185.246.128.171 port 17627: Change of username or service not allowed: (app,ssh-connection) -> (1,ssh-connection) [preauth]
Feb 20 08:14:55 np0005625204.localdomain sshd[36412]: Received disconnect from 192.168.122.100 port 60408:11: disconnected by user
Feb 20 08:14:55 np0005625204.localdomain sshd[36412]: Disconnected from user zuul 192.168.122.100 port 60408
Feb 20 08:14:55 np0005625204.localdomain sshd[36409]: pam_unix(sshd:session): session closed for user zuul
Feb 20 08:14:55 np0005625204.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Feb 20 08:14:55 np0005625204.localdomain systemd[1]: session-28.scope: Consumed 3.011s CPU time.
Feb 20 08:14:55 np0005625204.localdomain systemd-logind[759]: Session 28 logged out. Waiting for processes to exit.
Feb 20 08:14:55 np0005625204.localdomain systemd-logind[759]: Removed session 28.
Feb 20 08:14:56 np0005625204.localdomain sshd[78139]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:14:58 np0005625204.localdomain sshd[78139]: Invalid user 1 from 185.246.128.171 port 2821
Feb 20 08:14:59 np0005625204.localdomain sshd[78139]: Disconnecting invalid user 1 185.246.128.171 port 2821: Change of username or service not allowed: (1,ssh-connection) -> (abe,ssh-connection) [preauth]
Feb 20 08:15:00 np0005625204.localdomain sshd[78141]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:02 np0005625204.localdomain sshd[78141]: Invalid user abe from 185.246.128.171 port 25791
Feb 20 08:15:02 np0005625204.localdomain sshd[78141]: Disconnecting invalid user abe 185.246.128.171 port 25791: Change of username or service not allowed: (abe,ssh-connection) -> (,ssh-connection) [preauth]
Feb 20 08:15:03 np0005625204.localdomain sshd[78143]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:15:04 np0005625204.localdomain systemd[1]: tmp-crun.RdiBXl.mount: Deactivated successfully.
Feb 20 08:15:04 np0005625204.localdomain podman[78144]: 2026-02-20 08:15:04.152202547 +0000 UTC m=+0.087957232 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 20 08:15:04 np0005625204.localdomain podman[78144]: 2026-02-20 08:15:04.386312153 +0000 UTC m=+0.322066848 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1)
Feb 20 08:15:04 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:15:06 np0005625204.localdomain sshd[78143]: Invalid user  from 185.246.128.171 port 57567
Feb 20 08:15:08 np0005625204.localdomain sshd[78143]: Disconnecting invalid user  185.246.128.171 port 57567: Change of username or service not allowed: (,ssh-connection) -> (solr,ssh-connection) [preauth]
Feb 20 08:15:08 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:15:08 np0005625204.localdomain recover_tripleo_nova_virtqemud[78173]: 63005
Feb 20 08:15:08 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:15:08 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:15:10 np0005625204.localdomain sshd[78174]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:11 np0005625204.localdomain sshd[78174]: Invalid user solr from 185.246.128.171 port 43550
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:15:11 np0005625204.localdomain podman[78176]: 2026-02-20 08:15:11.401120615 +0000 UTC m=+0.085422554 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:15:11 np0005625204.localdomain sshd[78174]: Disconnecting invalid user solr 185.246.128.171 port 43550: Change of username or service not allowed: (solr,ssh-connection) -> (manager,ssh-connection) [preauth]
Feb 20 08:15:11 np0005625204.localdomain podman[78177]: 2026-02-20 08:15:11.455015515 +0000 UTC m=+0.134567580 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:15:11 np0005625204.localdomain podman[78176]: 2026-02-20 08:15:11.467510048 +0000 UTC m=+0.151811957 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true)
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:15:11 np0005625204.localdomain podman[78177]: 2026-02-20 08:15:11.484040864 +0000 UTC m=+0.163592989 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red 
Hat, Inc., build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:15:11 np0005625204.localdomain podman[78208]: 2026-02-20 08:15:11.543207364 +0000 UTC m=+0.122876792 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=logrotate_crond, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:15:11 np0005625204.localdomain podman[78208]: 2026-02-20 08:15:11.550763126 +0000 UTC m=+0.130432544 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:15:11 np0005625204.localdomain podman[78209]: 2026-02-20 08:15:11.610587777 +0000 UTC m=+0.185984244 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:15:11 np0005625204.localdomain podman[78209]: 2026-02-20 08:15:11.626040819 +0000 UTC m=+0.201437346 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5)
Feb 20 08:15:11 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:15:11 np0005625204.localdomain sshd[78269]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:13 np0005625204.localdomain sshd[78271]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:13 np0005625204.localdomain sshd[78273]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:14 np0005625204.localdomain sshd[78271]: Invalid user n8n from 83.235.16.111 port 45832
Feb 20 08:15:14 np0005625204.localdomain sshd[78271]: Received disconnect from 83.235.16.111 port 45832:11: Bye Bye [preauth]
Feb 20 08:15:14 np0005625204.localdomain sshd[78271]: Disconnected from invalid user n8n 83.235.16.111 port 45832 [preauth]
Feb 20 08:15:14 np0005625204.localdomain sudo[78275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:15:14 np0005625204.localdomain sudo[78275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:15:14 np0005625204.localdomain sudo[78275]: pam_unix(sudo:session): session closed for user root
Feb 20 08:15:14 np0005625204.localdomain sudo[78290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:15:14 np0005625204.localdomain sudo[78290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:15:14 np0005625204.localdomain sshd[78269]: Invalid user manager from 185.246.128.171 port 36391
Feb 20 08:15:15 np0005625204.localdomain sudo[78290]: pam_unix(sudo:session): session closed for user root
Feb 20 08:15:15 np0005625204.localdomain sudo[78337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:15:15 np0005625204.localdomain sudo[78337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:15:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:15:15 np0005625204.localdomain sudo[78337]: pam_unix(sudo:session): session closed for user root
Feb 20 08:15:15 np0005625204.localdomain podman[78352]: 2026-02-20 08:15:15.963034382 +0000 UTC m=+0.093032119 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_migration_target, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., 
io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, release=1766032510, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 20 08:15:16 np0005625204.localdomain podman[78352]: 2026-02-20 08:15:16.312140456 +0000 UTC m=+0.442138163 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13)
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: tmp-crun.z9niV7.mount: Deactivated successfully.
Feb 20 08:15:16 np0005625204.localdomain podman[78375]: 2026-02-20 08:15:16.408611779 +0000 UTC m=+0.062407361 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, io.buildah.version=1.41.5)
Feb 20 08:15:16 np0005625204.localdomain podman[78377]: 2026-02-20 08:15:16.426403124 +0000 UTC m=+0.072242183 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, release=1766032510, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 20 08:15:16 np0005625204.localdomain sshd[78269]: Disconnecting invalid user manager 185.246.128.171 port 36391: Change of username or service not allowed: (manager,ssh-connection) -> (ddd,ssh-connection) [preauth]
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:15:16 np0005625204.localdomain podman[78375]: 2026-02-20 08:15:16.461051074 +0000 UTC m=+0.114846656 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, vcs-type=git, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:15:16 np0005625204.localdomain podman[78376]: 2026-02-20 08:15:16.462071006 +0000 UTC m=+0.112046311 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, distribution-scope=public, container_name=iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510)
Feb 20 08:15:16 np0005625204.localdomain podman[78433]: 2026-02-20 08:15:16.526369114 +0000 UTC m=+0.063459884 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:15:16 np0005625204.localdomain podman[78376]: 2026-02-20 08:15:16.543306082 +0000 UTC m=+0.193281407 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, 
com.redhat.component=openstack-iscsid-container, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 20 08:15:16 np0005625204.localdomain podman[78377]: 2026-02-20 08:15:16.550704148 +0000 UTC m=+0.196543257 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:15:16 np0005625204.localdomain podman[78433]: 2026-02-20 08:15:16.608889029 +0000 UTC m=+0.145979809 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 20 08:15:16 np0005625204.localdomain podman[78433]: unhealthy
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:15:16 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:15:16 np0005625204.localdomain sshd[78273]: Received disconnect from 103.157.25.4 port 38996:11: Bye Bye [preauth]
Feb 20 08:15:16 np0005625204.localdomain sshd[78273]: Disconnected from authenticating user root 103.157.25.4 port 38996 [preauth]
Feb 20 08:15:18 np0005625204.localdomain sshd[78464]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:22 np0005625204.localdomain sshd[78464]: Invalid user ddd from 185.246.128.171 port 42168
Feb 20 08:15:22 np0005625204.localdomain sshd[78464]: Disconnecting invalid user ddd 185.246.128.171 port 42168: Change of username or service not allowed: (ddd,ssh-connection) -> (teamspeak,ssh-connection) [preauth]
Feb 20 08:15:23 np0005625204.localdomain sshd[78466]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:26 np0005625204.localdomain sshd[78466]: Invalid user teamspeak from 185.246.128.171 port 52436
Feb 20 08:15:26 np0005625204.localdomain sshd[78466]: Disconnecting invalid user teamspeak 185.246.128.171 port 52436: Change of username or service not allowed: (teamspeak,ssh-connection) -> (sync,ssh-connection) [preauth]
Feb 20 08:15:27 np0005625204.localdomain sshd[78468]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:30 np0005625204.localdomain sshd[78468]: Disconnecting authenticating user sync 185.246.128.171 port 26298: Change of username or service not allowed: (sync,ssh-connection) -> (binance,ssh-connection) [preauth]
Feb 20 08:15:30 np0005625204.localdomain sshd[78470]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:31 np0005625204.localdomain sshd[78472]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:32 np0005625204.localdomain sshd[78474]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:32 np0005625204.localdomain sshd[78470]: Invalid user binance from 185.246.128.171 port 44762
Feb 20 08:15:33 np0005625204.localdomain sshd[78476]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:33 np0005625204.localdomain sshd[78470]: Disconnecting invalid user binance 185.246.128.171 port 44762: Change of username or service not allowed: (binance,ssh-connection) -> (amit,ssh-connection) [preauth]
Feb 20 08:15:33 np0005625204.localdomain sshd[78476]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:15:33 np0005625204.localdomain sshd[78474]: Invalid user claude from 178.217.173.50 port 40818
Feb 20 08:15:33 np0005625204.localdomain sshd[78474]: Received disconnect from 178.217.173.50 port 40818:11: Bye Bye [preauth]
Feb 20 08:15:33 np0005625204.localdomain sshd[78474]: Disconnected from invalid user claude 178.217.173.50 port 40818 [preauth]
Feb 20 08:15:34 np0005625204.localdomain sshd[78478]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:15:35 np0005625204.localdomain podman[78479]: 2026-02-20 08:15:35.14229273 +0000 UTC m=+0.079463493 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:15:35 np0005625204.localdomain podman[78479]: 2026-02-20 08:15:35.362409687 +0000 UTC m=+0.299580400 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container)
Feb 20 08:15:35 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:15:37 np0005625204.localdomain sshd[78478]: Invalid user amit from 185.246.128.171 port 15524
Feb 20 08:15:37 np0005625204.localdomain sshd[78478]: Disconnecting invalid user amit 185.246.128.171 port 15524: Change of username or service not allowed: (amit,ssh-connection) -> (test2,ssh-connection) [preauth]
Feb 20 08:15:38 np0005625204.localdomain sshd[78509]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:15:42 np0005625204.localdomain podman[78513]: 2026-02-20 08:15:42.290942879 +0000 UTC m=+0.087756937 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, 
vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:15:42 np0005625204.localdomain podman[78513]: 2026-02-20 08:15:42.32856028 +0000 UTC m=+0.125374298 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1)
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:15:42 np0005625204.localdomain podman[78514]: 2026-02-20 08:15:42.335922256 +0000 UTC m=+0.129720582 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, release=1766032510, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:15:42 np0005625204.localdomain sshd[78509]: Invalid user test2 from 185.246.128.171 port 10518
Feb 20 08:15:42 np0005625204.localdomain podman[78511]: 2026-02-20 08:15:42.390850877 +0000 UTC m=+0.192090300 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:15:42 np0005625204.localdomain podman[78512]: 2026-02-20 08:15:42.450983128 +0000 UTC m=+0.244478054 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 20 08:15:42 np0005625204.localdomain podman[78512]: 2026-02-20 08:15:42.462809979 +0000 UTC m=+0.256304905 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:15:42 np0005625204.localdomain podman[78511]: 2026-02-20 08:15:42.473344522 +0000 UTC m=+0.274583905 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:30Z)
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:15:42 np0005625204.localdomain podman[78514]: 2026-02-20 08:15:42.524855868 +0000 UTC m=+0.318654124 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 20 08:15:42 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:15:45 np0005625204.localdomain sshd[78509]: Disconnecting invalid user test2 185.246.128.171 port 10518: Change of username or service not allowed: (test2,ssh-connection) -> (mc1,ssh-connection) [preauth]
Feb 20 08:15:46 np0005625204.localdomain sshd[78607]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:15:47 np0005625204.localdomain podman[78608]: 2026-02-20 08:15:47.139748106 +0000 UTC m=+0.082641221 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5)
Feb 20 08:15:47 np0005625204.localdomain podman[78608]: 2026-02-20 08:15:47.191942153 +0000 UTC m=+0.134835238 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4)
Feb 20 08:15:47 np0005625204.localdomain podman[78611]: 2026-02-20 08:15:47.191415147 +0000 UTC m=+0.129332429 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_id=tripleo_step5, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:15:47 np0005625204.localdomain podman[78612]: 2026-02-20 08:15:47.239620672 +0000 UTC m=+0.174168491 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 20 08:15:47 np0005625204.localdomain podman[78610]: 2026-02-20 08:15:47.286067174 +0000 UTC m=+0.224897985 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:15:47 np0005625204.localdomain podman[78610]: 2026-02-20 08:15:47.316058132 +0000 UTC m=+0.254888953 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:15:47 np0005625204.localdomain podman[78611]: 2026-02-20 08:15:47.32514536 +0000 UTC m=+0.263062582 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:15:47 np0005625204.localdomain podman[78611]: unhealthy
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:15:47 np0005625204.localdomain podman[78609]: 2026-02-20 08:15:47.402404205 +0000 UTC m=+0.341922197 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5)
Feb 20 08:15:47 np0005625204.localdomain podman[78609]: 2026-02-20 08:15:47.408426319 +0000 UTC m=+0.347944341 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:15:47 np0005625204.localdomain podman[78612]: 2026-02-20 08:15:47.635897662 +0000 UTC m=+0.570445531 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.buildah.version=1.41.5, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 20 08:15:47 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:15:48 np0005625204.localdomain sshd[78607]: Invalid user mc1 from 185.246.128.171 port 15576
Feb 20 08:15:48 np0005625204.localdomain sshd[78607]: Disconnecting invalid user mc1 185.246.128.171 port 15576: Change of username or service not allowed: (mc1,ssh-connection) -> (onlime_r,ssh-connection) [preauth]
Feb 20 08:15:49 np0005625204.localdomain sshd[78716]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:51 np0005625204.localdomain sshd[78716]: Invalid user onlime_r from 185.246.128.171 port 19076
Feb 20 08:15:52 np0005625204.localdomain sshd[78716]: Disconnecting invalid user onlime_r 185.246.128.171 port 19076: Change of username or service not allowed: (onlime_r,ssh-connection) -> (inspur,ssh-connection) [preauth]
Feb 20 08:15:54 np0005625204.localdomain sshd[78718]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:15:56 np0005625204.localdomain sshd[78718]: Invalid user inspur from 185.246.128.171 port 56525
Feb 20 08:15:57 np0005625204.localdomain sshd[78718]: Disconnecting invalid user inspur 185.246.128.171 port 56525: Change of username or service not allowed: (inspur,ssh-connection) -> (maroof,ssh-connection) [preauth]
Feb 20 08:15:59 np0005625204.localdomain sshd[78720]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:01 np0005625204.localdomain sshd[78720]: Invalid user maroof from 185.246.128.171 port 52758
Feb 20 08:16:01 np0005625204.localdomain sshd[78720]: Disconnecting invalid user maroof 185.246.128.171 port 52758: Change of username or service not allowed: (maroof,ssh-connection) -> (aovalle,ssh-connection) [preauth]
Feb 20 08:16:02 np0005625204.localdomain sshd[78722]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:04 np0005625204.localdomain sshd[78722]: Invalid user aovalle from 185.246.128.171 port 61811
Feb 20 08:16:04 np0005625204.localdomain sshd[78722]: Disconnecting invalid user aovalle 185.246.128.171 port 61811: Change of username or service not allowed: (aovalle,ssh-connection) -> (instrument,ssh-connection) [preauth]
Feb 20 08:16:04 np0005625204.localdomain sshd[78724]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:16:06 np0005625204.localdomain systemd[1]: tmp-crun.LCp2Iw.mount: Deactivated successfully.
Feb 20 08:16:06 np0005625204.localdomain podman[78726]: 2026-02-20 08:16:06.156075058 +0000 UTC m=+0.093552785 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64)
Feb 20 08:16:06 np0005625204.localdomain sshd[78724]: Invalid user instrument from 185.246.128.171 port 41539
Feb 20 08:16:06 np0005625204.localdomain podman[78726]: 2026-02-20 08:16:06.36427446 +0000 UTC m=+0.301752187 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:16:06 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:16:06 np0005625204.localdomain sshd[78724]: Disconnecting invalid user instrument 185.246.128.171 port 41539: Change of username or service not allowed: (instrument,ssh-connection) -> (cindy,ssh-connection) [preauth]
Feb 20 08:16:07 np0005625204.localdomain sshd[78755]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:08 np0005625204.localdomain sshd[78755]: Invalid user cindy from 185.246.128.171 port 47899
Feb 20 08:16:09 np0005625204.localdomain sshd[78755]: Disconnecting invalid user cindy 185.246.128.171 port 47899: Change of username or service not allowed: (cindy,ssh-connection) -> (elasticsearch,ssh-connection) [preauth]
Feb 20 08:16:10 np0005625204.localdomain sshd[78757]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:12 np0005625204.localdomain sshd[78757]: Invalid user elasticsearch from 185.246.128.171 port 50644
Feb 20 08:16:12 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:16:12 np0005625204.localdomain recover_tripleo_nova_virtqemud[78760]: 63005
Feb 20 08:16:12 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:16:12 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:16:13 np0005625204.localdomain podman[78762]: 2026-02-20 08:16:13.156116045 +0000 UTC m=+0.080651890 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com)
Feb 20 08:16:13 np0005625204.localdomain podman[78762]: 2026-02-20 08:16:13.17008699 +0000 UTC m=+0.094622875 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron)
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:16:13 np0005625204.localdomain podman[78761]: 2026-02-20 08:16:13.207281011 +0000 UTC m=+0.134303262 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 20 08:16:13 np0005625204.localdomain podman[78763]: 2026-02-20 08:16:13.265805759 +0000 UTC m=+0.185378794 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5)
Feb 20 08:16:13 np0005625204.localdomain podman[78761]: 2026-02-20 08:16:13.283273599 +0000 UTC m=+0.210295830 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:16:13 np0005625204.localdomain podman[78763]: 2026-02-20 08:16:13.307397391 +0000 UTC m=+0.226970396 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:16:13 np0005625204.localdomain podman[78764]: 2026-02-20 08:16:13.374477969 +0000 UTC m=+0.291635610 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:16:13 np0005625204.localdomain podman[78764]: 2026-02-20 08:16:13.40180843 +0000 UTC m=+0.318966101 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, 
architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git)
Feb 20 08:16:13 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:16:15 np0005625204.localdomain sshd[78757]: Disconnecting invalid user elasticsearch 185.246.128.171 port 50644: Change of username or service not allowed: (elasticsearch,ssh-connection) -> (config,ssh-connection) [preauth]
Feb 20 08:16:15 np0005625204.localdomain sudo[78853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:16:15 np0005625204.localdomain sudo[78853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:16:15 np0005625204.localdomain sudo[78853]: pam_unix(sudo:session): session closed for user root
Feb 20 08:16:16 np0005625204.localdomain sudo[78868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:16:16 np0005625204.localdomain sudo[78868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:16:16 np0005625204.localdomain sudo[78868]: pam_unix(sudo:session): session closed for user root
Feb 20 08:16:17 np0005625204.localdomain sshd[78914]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:17 np0005625204.localdomain sudo[78916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:16:17 np0005625204.localdomain sudo[78916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:16:17 np0005625204.localdomain sudo[78916]: pam_unix(sudo:session): session closed for user root
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:16:17 np0005625204.localdomain podman[78932]: 2026-02-20 08:16:17.51169016 +0000 UTC m=+0.075136844 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, url=https://www.redhat.com)
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: tmp-crun.jVbho7.mount: Deactivated successfully.
Feb 20 08:16:17 np0005625204.localdomain podman[78933]: 2026-02-20 08:16:17.571671333 +0000 UTC m=+0.129489156 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:16:17 np0005625204.localdomain podman[78937]: 2026-02-20 08:16:17.613327538 +0000 UTC m=+0.167627964 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public)
Feb 20 08:16:17 np0005625204.localdomain podman[78937]: 2026-02-20 08:16:17.62130677 +0000 UTC m=+0.175607226 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc.)
Feb 20 08:16:17 np0005625204.localdomain podman[78932]: 2026-02-20 08:16:17.623869488 +0000 UTC m=+0.187316202 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent)
Feb 20 08:16:17 np0005625204.localdomain podman[78933]: 2026-02-20 08:16:17.634019177 +0000 UTC m=+0.191837050 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:16:17 np0005625204.localdomain podman[78933]: unhealthy
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:16:17 np0005625204.localdomain podman[79007]: 2026-02-20 08:16:17.731143578 +0000 UTC m=+0.064517901 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vcs-type=git, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 20 08:16:17 np0005625204.localdomain podman[78931]: 2026-02-20 08:16:17.709466629 +0000 UTC m=+0.273271844 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 20 08:16:17 np0005625204.localdomain podman[78931]: 2026-02-20 08:16:17.789459978 +0000 UTC m=+0.353265163 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 20 08:16:17 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:16:18 np0005625204.localdomain podman[79007]: 2026-02-20 08:16:18.143246237 +0000 UTC m=+0.476620560 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5)
Feb 20 08:16:18 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:16:19 np0005625204.localdomain sshd[79043]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:19 np0005625204.localdomain sshd[79043]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:16:19 np0005625204.localdomain sshd[78914]: Invalid user config from 185.246.128.171 port 15080
Feb 20 08:16:20 np0005625204.localdomain sshd[78914]: Disconnecting invalid user config 185.246.128.171 port 15080: Change of username or service not allowed: (config,ssh-connection) -> (avax,ssh-connection) [preauth]
Feb 20 08:16:20 np0005625204.localdomain sshd[79045]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:21 np0005625204.localdomain sshd[79045]: Invalid user avax from 185.246.128.171 port 46429
Feb 20 08:16:22 np0005625204.localdomain sshd[79045]: Disconnecting invalid user avax 185.246.128.171 port 46429: Change of username or service not allowed: (avax,ssh-connection) -> (apache,ssh-connection) [preauth]
Feb 20 08:16:22 np0005625204.localdomain sshd[79047]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:24 np0005625204.localdomain sshd[79047]: Invalid user apache from 185.246.128.171 port 30572
Feb 20 08:16:24 np0005625204.localdomain sshd[79047]: Disconnecting invalid user apache 185.246.128.171 port 30572: Change of username or service not allowed: (apache,ssh-connection) -> (ftpusr,ssh-connection) [preauth]
Feb 20 08:16:26 np0005625204.localdomain sshd[79049]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:28 np0005625204.localdomain sshd[79049]: Invalid user ftpusr from 185.246.128.171 port 48418
Feb 20 08:16:29 np0005625204.localdomain sshd[79049]: Disconnecting invalid user ftpusr 185.246.128.171 port 48418: Change of username or service not allowed: (ftpusr,ssh-connection) -> (pritchard,ssh-connection) [preauth]
Feb 20 08:16:30 np0005625204.localdomain sshd[79051]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:31 np0005625204.localdomain sshd[79051]: Invalid user pasi from 152.32.189.21 port 56490
Feb 20 08:16:31 np0005625204.localdomain sshd[79051]: Received disconnect from 152.32.189.21 port 56490:11: Bye Bye [preauth]
Feb 20 08:16:31 np0005625204.localdomain sshd[79051]: Disconnected from invalid user pasi 152.32.189.21 port 56490 [preauth]
Feb 20 08:16:31 np0005625204.localdomain sshd[79053]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:33 np0005625204.localdomain sshd[79053]: Invalid user pritchard from 185.246.128.171 port 48444
Feb 20 08:16:33 np0005625204.localdomain sshd[79053]: Disconnecting invalid user pritchard 185.246.128.171 port 48444: Change of username or service not allowed: (pritchard,ssh-connection) -> (steven,ssh-connection) [preauth]
Feb 20 08:16:35 np0005625204.localdomain sshd[79055]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:16:37 np0005625204.localdomain podman[79057]: 2026-02-20 08:16:37.146493442 +0000 UTC m=+0.081304221 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13)
Feb 20 08:16:37 np0005625204.localdomain podman[79057]: 2026-02-20 08:16:37.336961938 +0000 UTC m=+0.271772697 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container)
Feb 20 08:16:37 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:16:37 np0005625204.localdomain sshd[79055]: Invalid user steven from 185.246.128.171 port 9466
Feb 20 08:16:37 np0005625204.localdomain sshd[79055]: Disconnecting invalid user steven 185.246.128.171 port 9466: Change of username or service not allowed: (steven,ssh-connection) -> (itadmin,ssh-connection) [preauth]
Feb 20 08:16:39 np0005625204.localdomain sshd[79088]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:42 np0005625204.localdomain sshd[79088]: Invalid user itadmin from 185.246.128.171 port 38459
Feb 20 08:16:43 np0005625204.localdomain sshd[79088]: Disconnecting invalid user itadmin 185.246.128.171 port 38459: Change of username or service not allowed: (itadmin,ssh-connection) -> (openmediavault,ssh-connectio [preauth]
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:16:43 np0005625204.localdomain podman[79091]: 2026-02-20 08:16:43.418041365 +0000 UTC m=+0.087794758 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, 
container_name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:16:43 np0005625204.localdomain podman[79090]: 2026-02-20 08:16:43.47646835 +0000 UTC m=+0.147456851 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:16:43 np0005625204.localdomain podman[79090]: 2026-02-20 08:16:43.483372729 +0000 UTC m=+0.154361230 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:16:43 np0005625204.localdomain podman[79091]: 2026-02-20 08:16:43.504880323 +0000 UTC m=+0.174633706 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:07:30Z, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:16:43 np0005625204.localdomain podman[79120]: 2026-02-20 08:16:43.537789473 +0000 UTC m=+0.096857074 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, container_name=ceilometer_agent_compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:16:43 np0005625204.localdomain podman[79119]: 2026-02-20 08:16:43.571006222 +0000 UTC m=+0.134753746 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, container_name=collectd, io.openshift.expose-services=, version=17.1.13, tcib_managed=true)
Feb 20 08:16:43 np0005625204.localdomain podman[79119]: 2026-02-20 08:16:43.582909624 +0000 UTC m=+0.146657128 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, 
architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510)
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:16:43 np0005625204.localdomain podman[79120]: 2026-02-20 08:16:43.62227932 +0000 UTC m=+0.181346961 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1)
Feb 20 08:16:43 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:16:43 np0005625204.localdomain sshd[79180]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:44 np0005625204.localdomain systemd[1]: tmp-crun.aryN3w.mount: Deactivated successfully.
Feb 20 08:16:45 np0005625204.localdomain sshd[79180]: Invalid user openmediavault from 185.246.128.171 port 16176
Feb 20 08:16:45 np0005625204.localdomain sshd[79180]: Disconnecting invalid user openmediavault 185.246.128.171 port 16176: Change of username or service not allowed: (openmediavault,ssh-connection) -> (fa,ssh-connection) [preauth]
Feb 20 08:16:46 np0005625204.localdomain sshd[79182]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: tmp-crun.Toh5w9.mount: Deactivated successfully.
Feb 20 08:16:48 np0005625204.localdomain sshd[79182]: Invalid user fa from 185.246.128.171 port 13021
Feb 20 08:16:48 np0005625204.localdomain podman[79189]: 2026-02-20 08:16:48.154566073 +0000 UTC m=+0.081473877 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: tmp-crun.m5FSiX.mount: Deactivated successfully.
Feb 20 08:16:48 np0005625204.localdomain podman[79187]: 2026-02-20 08:16:48.213306097 +0000 UTC m=+0.146341398 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:16:48 np0005625204.localdomain podman[79187]: 2026-02-20 08:16:48.220915318 +0000 UTC m=+0.153950609 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:16:48 np0005625204.localdomain podman[79189]: 2026-02-20 08:16:48.229140128 +0000 UTC m=+0.156047942 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20260112.1)
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:16:48 np0005625204.localdomain podman[79189]: unhealthy
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:16:48 np0005625204.localdomain podman[79242]: 2026-02-20 08:16:48.299697572 +0000 UTC m=+0.090271084 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, version=17.1.13, container_name=nova_migration_target, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 20 08:16:48 np0005625204.localdomain podman[79186]: 2026-02-20 08:16:48.314903693 +0000 UTC m=+0.247696945 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 20 08:16:48 np0005625204.localdomain podman[79186]: 2026-02-20 08:16:48.340040367 +0000 UTC m=+0.272833649 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:16:48 np0005625204.localdomain podman[79188]: 2026-02-20 08:16:48.412588681 +0000 UTC m=+0.341869437 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z)
Feb 20 08:16:48 np0005625204.localdomain podman[79188]: 2026-02-20 08:16:48.45697893 +0000 UTC m=+0.386259736 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, distribution-scope=public, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:16:48 np0005625204.localdomain podman[79242]: 2026-02-20 08:16:48.654120939 +0000 UTC m=+0.444694421 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=nova_migration_target, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64)
Feb 20 08:16:48 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:16:49 np0005625204.localdomain sshd[79182]: Disconnecting invalid user fa 185.246.128.171 port 13021: Change of username or service not allowed: (fa,ssh-connection) -> (alireza,ssh-connection) [preauth]
Feb 20 08:16:49 np0005625204.localdomain sshd[79336]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:50 np0005625204.localdomain sshd[79336]: Received disconnect from 77.232.138.190 port 54334:11: Bye Bye [preauth]
Feb 20 08:16:50 np0005625204.localdomain sshd[79336]: Disconnected from authenticating user root 77.232.138.190 port 54334 [preauth]
Feb 20 08:16:51 np0005625204.localdomain sshd[79387]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:54 np0005625204.localdomain sshd[79387]: Invalid user alireza from 185.246.128.171 port 63969
Feb 20 08:16:54 np0005625204.localdomain sshd[79387]: Disconnecting invalid user alireza 185.246.128.171 port 63969: Change of username or service not allowed: (alireza,ssh-connection) -> (btf,ssh-connection) [preauth]
Feb 20 08:16:55 np0005625204.localdomain sshd[79389]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:57 np0005625204.localdomain sshd[79389]: Invalid user btf from 185.246.128.171 port 30464
Feb 20 08:16:58 np0005625204.localdomain sshd[79389]: Disconnecting invalid user btf 185.246.128.171 port 30464: Change of username or service not allowed: (btf,ssh-connection) -> (useradmin,ssh-connection) [preauth]
Feb 20 08:16:58 np0005625204.localdomain systemd[1]: libpod-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2.scope: Deactivated successfully.
Feb 20 08:16:58 np0005625204.localdomain podman[77535]: 2026-02-20 08:16:58.603604776 +0000 UTC m=+192.310330374 container died 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, url=https://www.redhat.com, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.openshift.expose-services=, container_name=nova_wait_for_compute_service, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z)
Feb 20 08:16:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2-userdata-shm.mount: Deactivated successfully.
Feb 20 08:16:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c-merged.mount: Deactivated successfully.
Feb 20 08:16:58 np0005625204.localdomain podman[79392]: 2026-02-20 08:16:58.704850902 +0000 UTC m=+0.085362175 container cleanup 9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_wait_for_compute_service, 
managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git)
Feb 20 08:16:58 np0005625204.localdomain systemd[1]: libpod-conmon-9d096a68756bb73516628d7a97e85f9cebefab3d105574ed0061dd00a8ec71a2.scope: Deactivated successfully.
Feb 20 08:16:58 np0005625204.localdomain python3[77355]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=6f2a8ada21c5a8beb0844e05e372be87 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 20 08:16:58 np0005625204.localdomain sudo[77345]: pam_unix(sudo:session): session closed for user root
Feb 20 08:16:59 np0005625204.localdomain sudo[79439]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apvifayrxdnjfvjdhjvkegvmoowbhjet ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:16:59 np0005625204.localdomain sudo[79439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:16:59 np0005625204.localdomain python3[79441]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:16:59 np0005625204.localdomain sudo[79439]: pam_unix(sudo:session): session closed for user root
Feb 20 08:16:59 np0005625204.localdomain sudo[79455]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khetvlwxhddwhkmignkugszuxunoxipq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:16:59 np0005625204.localdomain sudo[79455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:16:59 np0005625204.localdomain sshd[79457]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:16:59 np0005625204.localdomain python3[79458]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 20 08:16:59 np0005625204.localdomain sudo[79455]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:00 np0005625204.localdomain sudo[79518]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koyyobghfordvezldthbqsawnhivagnv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:17:00 np0005625204.localdomain sudo[79518]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:00 np0005625204.localdomain python3[79520]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1771575419.6333683-118857-145106558304638/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:17:00 np0005625204.localdomain sudo[79518]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:00 np0005625204.localdomain sudo[79534]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cocnocdxhseljgdcugiwcgwshdjypzwb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:17:00 np0005625204.localdomain sudo[79534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:00 np0005625204.localdomain python3[79536]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 08:17:00 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:17:00 np0005625204.localdomain systemd-sysv-generator[79564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:17:00 np0005625204.localdomain systemd-rc-local-generator[79558]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:17:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:17:00 np0005625204.localdomain sudo[79534]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:01 np0005625204.localdomain sudo[79586]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afygquwozzlluuxcwdsrvmczqidhsbfu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 20 08:17:01 np0005625204.localdomain sudo[79586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:01 np0005625204.localdomain python3[79588]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:17:01 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:17:01 np0005625204.localdomain systemd-rc-local-generator[79613]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:17:01 np0005625204.localdomain systemd-sysv-generator[79617]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:17:01 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:17:01 np0005625204.localdomain systemd[1]: Starting nova_compute container...
Feb 20 08:17:02 np0005625204.localdomain tripleo-start-podman-container[79627]: Creating additional drop-in dependency for "nova_compute" (a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380)
Feb 20 08:17:02 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:17:02 np0005625204.localdomain systemd-sysv-generator[79682]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:17:02 np0005625204.localdomain systemd-rc-local-generator[79678]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:17:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:17:02 np0005625204.localdomain systemd[1]: Started nova_compute container.
Feb 20 08:17:02 np0005625204.localdomain sudo[79586]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:02 np0005625204.localdomain sshd[79457]: Invalid user useradmin from 185.246.128.171 port 50230
Feb 20 08:17:02 np0005625204.localdomain sudo[79721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srzlcewwnizvblarhwbcvqzjioyqixha ; /usr/bin/python3
Feb 20 08:17:02 np0005625204.localdomain sudo[79721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:02 np0005625204.localdomain python3[79723]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:17:02 np0005625204.localdomain sudo[79721]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:03 np0005625204.localdomain sshd[79724]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:03 np0005625204.localdomain sshd[79724]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:17:03 np0005625204.localdomain sudo[79771]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvzvpgnuotnnkeltbkbdrvyofmwxakfz ; /usr/bin/python3
Feb 20 08:17:03 np0005625204.localdomain sudo[79771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:03 np0005625204.localdomain sudo[79771]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:03 np0005625204.localdomain sudo[79814]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqvmugqivcqutjvjmxiymtnondkstfjb ; /usr/bin/python3
Feb 20 08:17:03 np0005625204.localdomain sudo[79814]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:03 np0005625204.localdomain sudo[79814]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:03 np0005625204.localdomain sshd[79457]: Disconnecting invalid user useradmin 185.246.128.171 port 50230: Change of username or service not allowed: (useradmin,ssh-connection) -> (nexus,ssh-connection) [preauth]
Feb 20 08:17:04 np0005625204.localdomain sudo[79844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfbtebyrgjncgytlpmeeuwgyyiwqhzrk ; /usr/bin/python3
Feb 20 08:17:04 np0005625204.localdomain sudo[79844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:04 np0005625204.localdomain python3[79846]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005625204 step=5 update_config_hash_only=False
Feb 20 08:17:04 np0005625204.localdomain sudo[79844]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:04 np0005625204.localdomain sudo[79860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfahefxfsusujssdwknyepupixxgnmkb ; /usr/bin/python3
Feb 20 08:17:04 np0005625204.localdomain sudo[79860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:04 np0005625204.localdomain python3[79862]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 08:17:04 np0005625204.localdomain sudo[79860]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:05 np0005625204.localdomain sudo[79876]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhyvueqxrzmixfnxggccfnqchatbzxvy ; /usr/bin/python3
Feb 20 08:17:05 np0005625204.localdomain sudo[79876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 20 08:17:05 np0005625204.localdomain sshd[79879]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:05 np0005625204.localdomain python3[79878]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 20 08:17:05 np0005625204.localdomain sudo[79876]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:07 np0005625204.localdomain sshd[79879]: Invalid user nexus from 185.246.128.171 port 45569
Feb 20 08:17:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:17:07 np0005625204.localdomain sshd[79893]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:08 np0005625204.localdomain podman[79881]: 2026-02-20 08:17:08.002314603 +0000 UTC m=+0.091513132 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13)
Feb 20 08:17:08 np0005625204.localdomain podman[79881]: 2026-02-20 08:17:08.157849368 +0000 UTC m=+0.247047887 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 20 08:17:08 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:17:09 np0005625204.localdomain sshd[79879]: Disconnecting invalid user nexus 185.246.128.171 port 45569: Change of username or service not allowed: (nexus,ssh-connection) -> (aa,ssh-connection) [preauth]
Feb 20 08:17:10 np0005625204.localdomain sshd[79912]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:12 np0005625204.localdomain sshd[79912]: Invalid user aa from 185.246.128.171 port 40112
Feb 20 08:17:13 np0005625204.localdomain sshd[79912]: Disconnecting invalid user aa 185.246.128.171 port 40112: Change of username or service not allowed: (aa,ssh-connection) -> (jrodrig,ssh-connection) [preauth]
Feb 20 08:17:13 np0005625204.localdomain sshd[79914]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:17:14 np0005625204.localdomain podman[79918]: 2026-02-20 08:17:14.156764557 +0000 UTC m=+0.086611792 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, url=https://www.redhat.com)
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: tmp-crun.beurMQ.mount: Deactivated successfully.
Feb 20 08:17:14 np0005625204.localdomain podman[79917]: 2026-02-20 08:17:14.209424166 +0000 UTC m=+0.142529080 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, tcib_managed=true, name=rhosp-rhel9/openstack-cron, distribution-scope=public)
Feb 20 08:17:14 np0005625204.localdomain podman[79917]: 2026-02-20 08:17:14.221229476 +0000 UTC m=+0.154334390 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:17:14 np0005625204.localdomain podman[79918]: 2026-02-20 08:17:14.221418731 +0000 UTC m=+0.151265976 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack 
TripleO Team, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-collectd-container, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com)
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:17:14 np0005625204.localdomain podman[79919]: 2026-02-20 08:17:14.265025786 +0000 UTC m=+0.188720304 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, url=https://www.redhat.com, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public)
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:17:14 np0005625204.localdomain podman[79919]: 2026-02-20 08:17:14.2961191 +0000 UTC m=+0.219813558 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:17:14 np0005625204.localdomain podman[79916]: 2026-02-20 08:17:14.354110482 +0000 UTC m=+0.287940089 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:30Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:17:14 np0005625204.localdomain podman[79916]: 2026-02-20 08:17:14.405157093 +0000 UTC m=+0.338986700 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:17:14 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:17:14 np0005625204.localdomain sshd[79914]: Invalid user jrodrig from 185.246.128.171 port 48130
Feb 20 08:17:15 np0005625204.localdomain sshd[79914]: Disconnecting invalid user jrodrig 185.246.128.171 port 48130: Change of username or service not allowed: (jrodrig,ssh-connection) -> (uucp,ssh-connection) [preauth]
Feb 20 08:17:16 np0005625204.localdomain sshd[80010]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:17 np0005625204.localdomain sudo[80012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:17:17 np0005625204.localdomain sudo[80012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:17:17 np0005625204.localdomain sudo[80012]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:17 np0005625204.localdomain sudo[80027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:17:17 np0005625204.localdomain sudo[80027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:17:18 np0005625204.localdomain sudo[80027]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:18 np0005625204.localdomain sshd[79893]: error: kex_exchange_identification: read: Connection timed out
Feb 20 08:17:18 np0005625204.localdomain sshd[79893]: banner exchange: Connection from 115.190.172.63 port 56112: Connection timed out
Feb 20 08:17:18 np0005625204.localdomain sudo[80074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:17:18 np0005625204.localdomain sudo[80074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:17:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:17:18 np0005625204.localdomain sudo[80074]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:17:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:17:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:17:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:17:19 np0005625204.localdomain sshd[80010]: Invalid user uucp from 185.246.128.171 port 45016
Feb 20 08:17:19 np0005625204.localdomain podman[80090]: 2026-02-20 08:17:19.04252943 +0000 UTC m=+0.083353344 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 
iscsid, architecture=x86_64, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:17:19 np0005625204.localdomain systemd[1]: tmp-crun.CeviBW.mount: Deactivated successfully.
Feb 20 08:17:19 np0005625204.localdomain podman[80090]: 2026-02-20 08:17:19.083997409 +0000 UTC m=+0.124821313 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
container_name=iscsid, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510)
Feb 20 08:17:19 np0005625204.localdomain podman[80092]: 2026-02-20 08:17:19.090356693 +0000 UTC m=+0.122221155 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:17:19 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:17:19 np0005625204.localdomain podman[80092]: 2026-02-20 08:17:19.113179856 +0000 UTC m=+0.145044338 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_compute)
Feb 20 08:17:19 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:17:19 np0005625204.localdomain podman[80103]: 2026-02-20 08:17:19.065841918 +0000 UTC m=+0.092627485 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 20 08:17:19 np0005625204.localdomain podman[80091]: 2026-02-20 08:17:19.250360864 +0000 UTC m=+0.285139194 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64)
Feb 20 08:17:19 np0005625204.localdomain podman[80089]: 2026-02-20 08:17:19.296423852 +0000 UTC m=+0.337526534 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, release=1766032510, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 20 08:17:19 np0005625204.localdomain podman[80091]: 2026-02-20 08:17:19.308558042 +0000 UTC m=+0.343336392 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, tcib_managed=true, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 20 08:17:19 np0005625204.localdomain podman[80089]: 2026-02-20 08:17:19.31837663 +0000 UTC m=+0.359479342 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64)
Feb 20 08:17:19 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:17:19 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:17:19 np0005625204.localdomain sshd[80010]: Disconnecting invalid user uucp 185.246.128.171 port 45016: Change of username or service not allowed: (uucp,ssh-connection) -> (stack,ssh-connection) [preauth]
Feb 20 08:17:19 np0005625204.localdomain podman[80103]: 2026-02-20 08:17:19.437002303 +0000 UTC m=+0.463787890 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:17:19 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:17:21 np0005625204.localdomain sshd[80198]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:24 np0005625204.localdomain sshd[80198]: Invalid user stack from 185.246.128.171 port 3866
Feb 20 08:17:24 np0005625204.localdomain sshd[80198]: Disconnecting invalid user stack 185.246.128.171 port 3866: Change of username or service not allowed: (stack,ssh-connection) -> (musicbot,ssh-connection) [preauth]
Feb 20 08:17:25 np0005625204.localdomain sshd[80200]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:26 np0005625204.localdomain sshd[80200]: Invalid user musicbot from 185.246.128.171 port 44432
Feb 20 08:17:27 np0005625204.localdomain sshd[80200]: Disconnecting invalid user musicbot 185.246.128.171 port 44432: Change of username or service not allowed: (musicbot,ssh-connection) -> (sftpuser,ssh-connection) [preauth]
Feb 20 08:17:29 np0005625204.localdomain sshd[80202]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:31 np0005625204.localdomain sshd[80202]: Invalid user sftpuser from 185.246.128.171 port 9717
Feb 20 08:17:31 np0005625204.localdomain sshd[78472]: fatal: Timeout before authentication for 115.190.172.63 port 53688
Feb 20 08:17:32 np0005625204.localdomain sshd[80202]: Disconnecting invalid user sftpuser 185.246.128.171 port 9717: Change of username or service not allowed: (sftpuser,ssh-connection) -> (station6,ssh-connection) [preauth]
Feb 20 08:17:33 np0005625204.localdomain sshd[80204]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:34 np0005625204.localdomain sshd[80206]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:34 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:17:34 np0005625204.localdomain sshd[80206]: Accepted publickey for zuul from 192.168.122.100 port 47080 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 08:17:34 np0005625204.localdomain recover_tripleo_nova_virtqemud[80209]: 63005
Feb 20 08:17:34 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:17:34 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:17:34 np0005625204.localdomain systemd-logind[759]: New session 34 of user zuul.
Feb 20 08:17:34 np0005625204.localdomain systemd[1]: Started Session 34 of User zuul.
Feb 20 08:17:34 np0005625204.localdomain sshd[80206]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 08:17:34 np0005625204.localdomain sshd[80204]: Invalid user station6 from 185.246.128.171 port 36276
Feb 20 08:17:34 np0005625204.localdomain sudo[80315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrajugiyezlyfbbkbffjafblxckaydui ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771575454.2928488-42473-278104069017077/AnsiballZ_setup.py
Feb 20 08:17:34 np0005625204.localdomain sudo[80315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:17:34 np0005625204.localdomain sshd[80204]: Disconnecting invalid user station6 185.246.128.171 port 36276: Change of username or service not allowed: (station6,ssh-connection) -> (data,ssh-connection) [preauth]
Feb 20 08:17:35 np0005625204.localdomain python3[80317]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 08:17:36 np0005625204.localdomain sshd[80395]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:37 np0005625204.localdomain sudo[80315]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:37 np0005625204.localdomain sshd[80395]: Invalid user data from 185.246.128.171 port 40923
Feb 20 08:17:38 np0005625204.localdomain sshd[80395]: Disconnecting invalid user data 185.246.128.171 port 40923: Change of username or service not allowed: (data,ssh-connection) -> (nemo,ssh-connection) [preauth]
Feb 20 08:17:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:17:39 np0005625204.localdomain systemd[1]: tmp-crun.qx1phW.mount: Deactivated successfully.
Feb 20 08:17:39 np0005625204.localdomain podman[80506]: 2026-02-20 08:17:39.032688907 +0000 UTC m=+0.094035557 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:17:39 np0005625204.localdomain podman[80506]: 2026-02-20 08:17:39.227363172 +0000 UTC m=+0.288709822 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, vendor=Red Hat, Inc.)
Feb 20 08:17:39 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:17:40 np0005625204.localdomain sshd[80533]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:41 np0005625204.localdomain sudo[80609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnmurylubflwvevnxuispkehyolgkkib ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771575461.6206772-42535-219896615299877/AnsiballZ_dnf.py
Feb 20 08:17:41 np0005625204.localdomain sudo[80609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:17:42 np0005625204.localdomain sshd[80533]: Invalid user nemo from 185.246.128.171 port 16525
Feb 20 08:17:42 np0005625204.localdomain python3[80611]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Feb 20 08:17:42 np0005625204.localdomain sshd[80533]: Disconnecting invalid user nemo 185.246.128.171 port 16525: Change of username or service not allowed: (nemo,ssh-connection) -> (Administrator,ssh-connection) [preauth]
Feb 20 08:17:43 np0005625204.localdomain sshd[80614]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:44 np0005625204.localdomain sshd[80614]: Invalid user Administrator from 185.246.128.171 port 8413
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:17:44 np0005625204.localdomain podman[80616]: 2026-02-20 08:17:44.381965311 +0000 UTC m=+0.061551271 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, container_name=logrotate_crond, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:17:44 np0005625204.localdomain podman[80617]: 2026-02-20 08:17:44.419917593 +0000 UTC m=+0.093613464 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, container_name=collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:17:44 np0005625204.localdomain podman[80617]: 2026-02-20 08:17:44.435191228 +0000 UTC m=+0.108887099 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Feb 20 08:17:44 np0005625204.localdomain podman[80616]: 2026-02-20 08:17:44.44745526 +0000 UTC m=+0.127041280 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: tmp-crun.BMeDHx.mount: Deactivated successfully.
Feb 20 08:17:44 np0005625204.localdomain podman[80664]: 2026-02-20 08:17:44.539387873 +0000 UTC m=+0.080176977 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, 
version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:17:44 np0005625204.localdomain podman[80647]: 2026-02-20 08:17:44.589587078 +0000 UTC m=+0.183598778 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 20 08:17:44 np0005625204.localdomain podman[80664]: 2026-02-20 08:17:44.596767917 +0000 UTC m=+0.137557021 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, distribution-scope=public, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.13, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4)
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:17:44 np0005625204.localdomain podman[80647]: 2026-02-20 08:17:44.642239458 +0000 UTC m=+0.236251178 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:17:44 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:17:45 np0005625204.localdomain sudo[80609]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:46 np0005625204.localdomain sudo[80796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxavvxttsqiqqxugumwnakzesegydtzi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771575466.6666322-42592-100355591936215/AnsiballZ_iptables.py
Feb 20 08:17:46 np0005625204.localdomain sudo[80796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:17:47 np0005625204.localdomain python3[80798]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Feb 20 08:17:47 np0005625204.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 20 08:17:47 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Feb 20 08:17:47 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 08:17:47 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 08:17:47 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 08:17:47 np0005625204.localdomain sudo[80796]: pam_unix(sudo:session): session closed for user root
Feb 20 08:17:47 np0005625204.localdomain sshd[80821]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:47 np0005625204.localdomain sshd[80614]: Disconnecting invalid user Administrator 185.246.128.171 port 8413: Change of username or service not allowed: (Administrator,ssh-connection) -> (firefly,ssh-connection) [preauth]
Feb 20 08:17:47 np0005625204.localdomain sshd[80821]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:17:50 np0005625204.localdomain sshd[80872]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: tmp-crun.e8kkiz.mount: Deactivated successfully.
Feb 20 08:17:50 np0005625204.localdomain podman[80869]: 2026-02-20 08:17:50.15669532 +0000 UTC m=+0.088640073 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=iscsid, release=1766032510, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vcs-type=git)
Feb 20 08:17:50 np0005625204.localdomain podman[80869]: 2026-02-20 08:17:50.167028564 +0000 UTC m=+0.098973327 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z)
Feb 20 08:17:50 np0005625204.localdomain podman[80868]: 2026-02-20 08:17:50.125857143 +0000 UTC m=+0.061588202 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1)
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:17:50 np0005625204.localdomain podman[80868]: 2026-02-20 08:17:50.208995549 +0000 UTC m=+0.144726598 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public)
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:17:50 np0005625204.localdomain podman[80877]: 2026-02-20 08:17:50.257488812 +0000 UTC m=+0.178950568 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:17:50 np0005625204.localdomain podman[80870]: 2026-02-20 08:17:50.310285336 +0000 UTC m=+0.238243719 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13)
Feb 20 08:17:50 np0005625204.localdomain podman[80871]: 2026-02-20 08:17:50.3656968 +0000 UTC m=+0.292337272 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:17:50 np0005625204.localdomain podman[80870]: 2026-02-20 08:17:50.381980925 +0000 UTC m=+0.309939308 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:17:50 np0005625204.localdomain podman[80871]: 2026-02-20 08:17:50.403064845 +0000 UTC m=+0.329705317 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:17:50 np0005625204.localdomain podman[80877]: 2026-02-20 08:17:50.649342947 +0000 UTC m=+0.570804743 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, 
vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team)
Feb 20 08:17:50 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:17:52 np0005625204.localdomain sshd[80872]: Invalid user firefly from 185.246.128.171 port 31200
Feb 20 08:17:53 np0005625204.localdomain sshd[80872]: Disconnecting invalid user firefly 185.246.128.171 port 31200: Change of username or service not allowed: (firefly,ssh-connection) -> (cat,ssh-connection) [preauth]
Feb 20 08:17:55 np0005625204.localdomain sshd[80983]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:57 np0005625204.localdomain sshd[80983]: Invalid user cat from 185.246.128.171 port 30180
Feb 20 08:17:58 np0005625204.localdomain sshd[80983]: Disconnecting invalid user cat 185.246.128.171 port 30180: Change of username or service not allowed: (cat,ssh-connection) -> (username,ssh-connection) [preauth]
Feb 20 08:17:58 np0005625204.localdomain sshd[80985]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:58 np0005625204.localdomain sshd[80986]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:17:59 np0005625204.localdomain sshd[80986]: Invalid user debian from 101.36.109.176 port 34126
Feb 20 08:17:59 np0005625204.localdomain sshd[80986]: Received disconnect from 101.36.109.176 port 34126:11: Bye Bye [preauth]
Feb 20 08:17:59 np0005625204.localdomain sshd[80986]: Disconnected from invalid user debian 101.36.109.176 port 34126 [preauth]
Feb 20 08:18:00 np0005625204.localdomain sshd[80985]: Invalid user username from 185.246.128.171 port 34834
Feb 20 08:18:01 np0005625204.localdomain sshd[80985]: Disconnecting invalid user username 185.246.128.171 port 34834: Change of username or service not allowed: (username,ssh-connection) -> (paul,ssh-connection) [preauth]
Feb 20 08:18:03 np0005625204.localdomain sshd[80989]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:05 np0005625204.localdomain sshd[80989]: Invalid user paul from 185.246.128.171 port 25945
Feb 20 08:18:06 np0005625204.localdomain sshd[80989]: Disconnecting invalid user paul 185.246.128.171 port 25945: Change of username or service not allowed: (paul,ssh-connection) -> (username1,ssh-connection) [preauth]
Feb 20 08:18:07 np0005625204.localdomain sshd[80991]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:18:10 np0005625204.localdomain podman[80993]: 2026-02-20 08:18:10.148203881 +0000 UTC m=+0.086170959 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1)
Feb 20 08:18:10 np0005625204.localdomain podman[80993]: 2026-02-20 08:18:10.372044741 +0000 UTC m=+0.310011759 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 20 08:18:10 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:18:10 np0005625204.localdomain sshd[80991]: Invalid user username1 from 185.246.128.171 port 59418
Feb 20 08:18:11 np0005625204.localdomain sshd[80991]: Disconnecting invalid user username1 185.246.128.171 port 59418: Change of username or service not allowed: (username1,ssh-connection) -> (qaz,ssh-connection) [preauth]
Feb 20 08:18:13 np0005625204.localdomain sshd[81023]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:14 np0005625204.localdomain sshd[81023]: Invalid user qaz from 185.246.128.171 port 58345
Feb 20 08:18:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:18:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:18:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:18:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:18:14 np0005625204.localdomain podman[81028]: 2026-02-20 08:18:14.915006349 +0000 UTC m=+0.078100024 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:18:14 np0005625204.localdomain podman[81025]: 2026-02-20 08:18:14.951380244 +0000 UTC m=+0.122888435 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:18:14 np0005625204.localdomain sshd[81023]: Disconnecting invalid user qaz 185.246.128.171 port 58345: Change of username or service not allowed: (qaz,ssh-connection) -> (josie,ssh-connection) [preauth]
Feb 20 08:18:14 np0005625204.localdomain podman[81028]: 2026-02-20 08:18:14.95847222 +0000 UTC m=+0.121565895 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com)
Feb 20 08:18:14 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:18:15 np0005625204.localdomain podman[81025]: 2026-02-20 08:18:15.012117719 +0000 UTC m=+0.183625890 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5)
Feb 20 08:18:15 np0005625204.localdomain systemd[1]: tmp-crun.Ax18bw.mount: Deactivated successfully.
Feb 20 08:18:15 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:18:15 np0005625204.localdomain podman[81026]: 2026-02-20 08:18:15.055768725 +0000 UTC m=+0.227340148 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Feb 20 08:18:15 np0005625204.localdomain podman[81027]: 2026-02-20 08:18:15.015217543 +0000 UTC m=+0.181482914 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, 
build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:18:15 np0005625204.localdomain podman[81027]: 2026-02-20 08:18:15.098044479 +0000 UTC m=+0.264309880 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.buildah.version=1.41.5, 
vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd)
Feb 20 08:18:15 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:18:15 np0005625204.localdomain podman[81026]: 2026-02-20 08:18:15.118992406 +0000 UTC m=+0.290563829 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:18:15 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:18:15 np0005625204.localdomain sshd[81118]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:17 np0005625204.localdomain sshd[81118]: Invalid user josie from 185.246.128.171 port 59457
Feb 20 08:18:18 np0005625204.localdomain sshd[81118]: Disconnecting invalid user josie 185.246.128.171 port 59457: Change of username or service not allowed: (josie,ssh-connection) -> (lixiang,ssh-connection) [preauth]
Feb 20 08:18:18 np0005625204.localdomain sshd[81120]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:19 np0005625204.localdomain sudo[81122]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:18:19 np0005625204.localdomain sudo[81122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:18:19 np0005625204.localdomain sudo[81122]: pam_unix(sudo:session): session closed for user root
Feb 20 08:18:19 np0005625204.localdomain sudo[81137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:18:19 np0005625204.localdomain sudo[81137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:18:19 np0005625204.localdomain sudo[81137]: pam_unix(sudo:session): session closed for user root
Feb 20 08:18:20 np0005625204.localdomain sshd[81120]: Invalid user lixiang from 185.246.128.171 port 56027
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: tmp-crun.7ToJat.mount: Deactivated successfully.
Feb 20 08:18:20 np0005625204.localdomain podman[81184]: 2026-02-20 08:18:20.321362636 +0000 UTC m=+0.088943153 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Feb 20 08:18:20 np0005625204.localdomain sshd[81120]: Disconnecting invalid user lixiang 185.246.128.171 port 56027: Change of username or service not allowed: (lixiang,ssh-connection) -> (support,ssh-connection) [preauth]
Feb 20 08:18:20 np0005625204.localdomain podman[81184]: 2026-02-20 08:18:20.350976406 +0000 UTC m=+0.118556883 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, 
batch=17.1_20260112.1, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:18:20 np0005625204.localdomain podman[81183]: 2026-02-20 08:18:20.362996411 +0000 UTC m=+0.129778704 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 20 08:18:20 np0005625204.localdomain podman[81183]: 2026-02-20 08:18:20.377483021 +0000 UTC m=+0.144265294 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:18:20 np0005625204.localdomain sudo[81224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:18:20 np0005625204.localdomain sudo[81224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:18:20 np0005625204.localdomain sudo[81224]: pam_unix(sudo:session): session closed for user root
Feb 20 08:18:20 np0005625204.localdomain podman[81240]: 2026-02-20 08:18:20.604132977 +0000 UTC m=+0.077868047 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:18:20 np0005625204.localdomain podman[81238]: 2026-02-20 08:18:20.657271521 +0000 UTC m=+0.134171077 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:18:20 np0005625204.localdomain podman[81240]: 2026-02-20 08:18:20.682008903 +0000 UTC m=+0.155743993 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:18:20 np0005625204.localdomain podman[81238]: 2026-02-20 08:18:20.708102305 +0000 UTC m=+0.185001791 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vcs-type=git, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4)
Feb 20 08:18:20 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:18:20 np0005625204.localdomain podman[81288]: 2026-02-20 08:18:20.798870202 +0000 UTC m=+0.080258078 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:18:21 np0005625204.localdomain podman[81288]: 2026-02-20 08:18:21.174226296 +0000 UTC m=+0.455614212 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:18:21 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:18:21 np0005625204.localdomain sshd[81311]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:24 np0005625204.localdomain sshd[81311]: Invalid user support from 185.246.128.171 port 9424
Feb 20 08:18:26 np0005625204.localdomain sshd[81311]: error: maximum authentication attempts exceeded for invalid user support from 185.246.128.171 port 9424 ssh2 [preauth]
Feb 20 08:18:26 np0005625204.localdomain sshd[81311]: Disconnecting invalid user support 185.246.128.171 port 9424: Too many authentication failures [preauth]
Feb 20 08:18:27 np0005625204.localdomain sshd[81313]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:28 np0005625204.localdomain sshd[81313]: Invalid user support from 185.246.128.171 port 16876
Feb 20 08:18:29 np0005625204.localdomain sshd[81313]: Disconnecting invalid user support 185.246.128.171 port 16876: Change of username or service not allowed: (support,ssh-connection) -> (aman,ssh-connection) [preauth]
Feb 20 08:18:29 np0005625204.localdomain sshd[81315]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:30 np0005625204.localdomain sshd[81315]: Invalid user aman from 185.246.128.171 port 11077
Feb 20 08:18:31 np0005625204.localdomain sshd[81315]: Disconnecting invalid user aman 185.246.128.171 port 11077: Change of username or service not allowed: (aman,ssh-connection) -> (mj,ssh-connection) [preauth]
Feb 20 08:18:32 np0005625204.localdomain sshd[81317]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:33 np0005625204.localdomain sshd[81318]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:33 np0005625204.localdomain sshd[81318]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:18:34 np0005625204.localdomain sshd[81317]: Invalid user mj from 185.246.128.171 port 23752
Feb 20 08:18:34 np0005625204.localdomain sshd[81317]: Disconnecting invalid user mj 185.246.128.171 port 23752: Change of username or service not allowed: (mj,ssh-connection) -> (draytek,ssh-connection) [preauth]
Feb 20 08:18:36 np0005625204.localdomain sshd[81321]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:37 np0005625204.localdomain sshd[81321]: Invalid user draytek from 185.246.128.171 port 39831
Feb 20 08:18:38 np0005625204.localdomain sshd[81321]: Disconnecting invalid user draytek 185.246.128.171 port 39831: Change of username or service not allowed: (draytek,ssh-connection) -> (upload,ssh-connection) [preauth]
Feb 20 08:18:39 np0005625204.localdomain sshd[81323]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:18:41 np0005625204.localdomain sshd[81323]: Invalid user upload from 185.246.128.171 port 13904
Feb 20 08:18:41 np0005625204.localdomain podman[81325]: 2026-02-20 08:18:41.149790392 +0000 UTC m=+0.087569511 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:18:41 np0005625204.localdomain sshd[81323]: Disconnecting invalid user upload 185.246.128.171 port 13904: Change of username or service not allowed: (upload,ssh-connection) -> (ruoyi,ssh-connection) [preauth]
Feb 20 08:18:41 np0005625204.localdomain podman[81325]: 2026-02-20 08:18:41.333317288 +0000 UTC m=+0.271096397 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, maintainer=OpenStack TripleO Team)
Feb 20 08:18:41 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:18:42 np0005625204.localdomain sshd[81354]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:44 np0005625204.localdomain sshd[81354]: Invalid user ruoyi from 185.246.128.171 port 23672
Feb 20 08:18:44 np0005625204.localdomain sshd[81356]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:45 np0005625204.localdomain sshd[81354]: Disconnecting invalid user ruoyi 185.246.128.171 port 23672: Change of username or service not allowed: (ruoyi,ssh-connection) -> (~,ssh-connection) [preauth]
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:18:45 np0005625204.localdomain podman[81358]: 2026-02-20 08:18:45.139799299 +0000 UTC m=+0.067935314 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:18:45 np0005625204.localdomain podman[81358]: 2026-02-20 08:18:45.19080283 +0000 UTC m=+0.118938785 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:18:45 np0005625204.localdomain podman[81390]: 2026-02-20 08:18:45.247556254 +0000 UTC m=+0.078881968 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, container_name=logrotate_crond, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z)
Feb 20 08:18:45 np0005625204.localdomain podman[81357]: 2026-02-20 08:18:45.198226875 +0000 UTC m=+0.130812885 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:18:45 np0005625204.localdomain podman[81394]: 2026-02-20 08:18:45.303174623 +0000 UTC m=+0.131268210 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:18:45 np0005625204.localdomain podman[81357]: 2026-02-20 08:18:45.331984598 +0000 UTC m=+0.264570598 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:18:45 np0005625204.localdomain podman[81394]: 2026-02-20 08:18:45.340469796 +0000 UTC m=+0.168563352 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container)
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:18:45 np0005625204.localdomain podman[81390]: 2026-02-20 08:18:45.3850424 +0000 UTC m=+0.216368094 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:18:45 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:18:45 np0005625204.localdomain sshd[81449]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:46 np0005625204.localdomain sshd[80206]: pam_unix(sshd:session): session closed for user zuul
Feb 20 08:18:46 np0005625204.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Feb 20 08:18:46 np0005625204.localdomain systemd[1]: session-34.scope: Consumed 6.047s CPU time.
Feb 20 08:18:46 np0005625204.localdomain systemd-logind[759]: Session 34 logged out. Waiting for processes to exit.
Feb 20 08:18:46 np0005625204.localdomain systemd-logind[759]: Removed session 34.
Feb 20 08:18:48 np0005625204.localdomain sshd[81449]: Invalid user ~ from 185.246.128.171 port 6171
Feb 20 08:18:48 np0005625204.localdomain sshd[81449]: Disconnecting invalid user ~ 185.246.128.171 port 6171: Change of username or service not allowed: (~,ssh-connection) -> (tech,ssh-connection) [preauth]
Feb 20 08:18:49 np0005625204.localdomain sshd[81451]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:50 np0005625204.localdomain sshd[81451]: Invalid user tech from 185.246.128.171 port 37825
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: tmp-crun.ENK6FA.mount: Deactivated successfully.
Feb 20 08:18:50 np0005625204.localdomain podman[81473]: 2026-02-20 08:18:50.621240908 +0000 UTC m=+0.086511090 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13)
Feb 20 08:18:50 np0005625204.localdomain sshd[81451]: Disconnecting invalid user tech 185.246.128.171 port 37825: Change of username or service not allowed: (tech,ssh-connection) -> (student,ssh-connection) [preauth]
Feb 20 08:18:50 np0005625204.localdomain podman[81474]: 2026-02-20 08:18:50.690229424 +0000 UTC m=+0.152874496 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:18:50 np0005625204.localdomain podman[81473]: 2026-02-20 08:18:50.698205976 +0000 UTC m=+0.163476148 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-type=git)
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:18:50 np0005625204.localdomain podman[81474]: 2026-02-20 08:18:50.750090962 +0000 UTC m=+0.212736024 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-type=git, container_name=iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:18:50 np0005625204.localdomain podman[81516]: 2026-02-20 08:18:50.820437179 +0000 UTC m=+0.093599694 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_compute, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, release=1766032510)
Feb 20 08:18:50 np0005625204.localdomain podman[81529]: 2026-02-20 08:18:50.859212517 +0000 UTC m=+0.074039070 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:18:50 np0005625204.localdomain podman[81516]: 2026-02-20 08:18:50.891486048 +0000 UTC m=+0.164648633 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com)
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:18:50 np0005625204.localdomain podman[81529]: 2026-02-20 08:18:50.934110863 +0000 UTC m=+0.148937386 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 20 08:18:50 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:18:51 np0005625204.localdomain sshd[81588]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:18:51 np0005625204.localdomain systemd[1]: tmp-crun.I3HnZt.mount: Deactivated successfully.
Feb 20 08:18:51 np0005625204.localdomain systemd[1]: tmp-crun.eVPD42.mount: Deactivated successfully.
Feb 20 08:18:51 np0005625204.localdomain podman[81589]: 2026-02-20 08:18:51.649532147 +0000 UTC m=+0.093337386 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4)
Feb 20 08:18:52 np0005625204.localdomain podman[81589]: 2026-02-20 08:18:52.027104018 +0000 UTC m=+0.470909257 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
build-date=2026-01-12T23:32:04Z, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Feb 20 08:18:52 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:18:52 np0005625204.localdomain sshd[81588]: Invalid user student from 185.246.128.171 port 35103
Feb 20 08:18:53 np0005625204.localdomain sshd[81588]: Disconnecting invalid user student 185.246.128.171 port 35103: Change of username or service not allowed: (student,ssh-connection) -> (ftpuser,ssh-connection) [preauth]
Feb 20 08:18:53 np0005625204.localdomain sshd[81615]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:53 np0005625204.localdomain sshd[81617]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:18:54 np0005625204.localdomain sshd[81615]: Invalid user ubuntu from 178.217.173.50 port 47348
Feb 20 08:18:54 np0005625204.localdomain sshd[81615]: Received disconnect from 178.217.173.50 port 47348:11: Bye Bye [preauth]
Feb 20 08:18:54 np0005625204.localdomain sshd[81615]: Disconnected from invalid user ubuntu 178.217.173.50 port 47348 [preauth]
Feb 20 08:18:55 np0005625204.localdomain sshd[81356]: error: kex_exchange_identification: read: Connection timed out
Feb 20 08:18:55 np0005625204.localdomain sshd[81356]: banner exchange: Connection from 115.190.172.63 port 48562: Connection timed out
Feb 20 08:18:55 np0005625204.localdomain sshd[81617]: Invalid user ftpuser from 185.246.128.171 port 41235
Feb 20 08:18:57 np0005625204.localdomain sshd[81617]: error: maximum authentication attempts exceeded for invalid user ftpuser from 185.246.128.171 port 41235 ssh2 [preauth]
Feb 20 08:18:57 np0005625204.localdomain sshd[81617]: Disconnecting invalid user ftpuser 185.246.128.171 port 41235: Too many authentication failures [preauth]
Feb 20 08:18:59 np0005625204.localdomain sshd[81619]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:01 np0005625204.localdomain sshd[81621]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:01 np0005625204.localdomain sshd[81621]: Accepted publickey for zuul from 38.102.83.114 port 37642 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 08:19:01 np0005625204.localdomain systemd-logind[759]: New session 35 of user zuul.
Feb 20 08:19:01 np0005625204.localdomain systemd[1]: Started Session 35 of User zuul.
Feb 20 08:19:01 np0005625204.localdomain sshd[81621]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 08:19:01 np0005625204.localdomain sudo[81638]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eymymskbpiyrmazmrsyrycphfjdsrggd ; /usr/bin/python3
Feb 20 08:19:01 np0005625204.localdomain sudo[81638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 08:19:01 np0005625204.localdomain sshd[81619]: Invalid user ftpuser from 185.246.128.171 port 35472
Feb 20 08:19:01 np0005625204.localdomain python3[81640]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 08:19:04 np0005625204.localdomain sudo[81638]: pam_unix(sudo:session): session closed for user root
Feb 20 08:19:05 np0005625204.localdomain sshd[81642]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:06 np0005625204.localdomain sshd[81642]: Received disconnect from 83.235.16.111 port 58756:11: Bye Bye [preauth]
Feb 20 08:19:06 np0005625204.localdomain sshd[81642]: Disconnected from authenticating user root 83.235.16.111 port 58756 [preauth]
Feb 20 08:19:07 np0005625204.localdomain sshd[81619]: error: maximum authentication attempts exceeded for invalid user ftpuser from 185.246.128.171 port 35472 ssh2 [preauth]
Feb 20 08:19:07 np0005625204.localdomain sshd[81619]: Disconnecting invalid user ftpuser 185.246.128.171 port 35472: Too many authentication failures [preauth]
Feb 20 08:19:07 np0005625204.localdomain sshd[81644]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:11 np0005625204.localdomain sshd[81644]: Invalid user ftpuser from 185.246.128.171 port 15758
Feb 20 08:19:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:19:12 np0005625204.localdomain systemd[1]: tmp-crun.gsEFlc.mount: Deactivated successfully.
Feb 20 08:19:12 np0005625204.localdomain podman[81646]: 2026-02-20 08:19:12.15687201 +0000 UTC m=+0.092026647 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true)
Feb 20 08:19:12 np0005625204.localdomain sshd[81644]: Disconnecting invalid user ftpuser 185.246.128.171 port 15758: Change of username or service not allowed: (ftpuser,ssh-connection) -> (momoru,ssh-connection) [preauth]
Feb 20 08:19:12 np0005625204.localdomain podman[81646]: 2026-02-20 08:19:12.398055457 +0000 UTC m=+0.333210104 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container)
Feb 20 08:19:12 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:19:14 np0005625204.localdomain sshd[81675]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: tmp-crun.ZG0z0x.mount: Deactivated successfully.
Feb 20 08:19:16 np0005625204.localdomain podman[81678]: 2026-02-20 08:19:16.142333279 +0000 UTC m=+0.081713863 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:19:16 np0005625204.localdomain podman[81678]: 2026-02-20 08:19:16.152112237 +0000 UTC m=+0.091492821 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z)
Feb 20 08:19:16 np0005625204.localdomain podman[81680]: 2026-02-20 08:19:16.158448269 +0000 UTC m=+0.094029238 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vcs-type=git, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:19:16 np0005625204.localdomain sshd[81675]: Invalid user momoru from 185.246.128.171 port 24096
Feb 20 08:19:16 np0005625204.localdomain podman[81680]: 2026-02-20 08:19:16.185069377 +0000 UTC m=+0.120650426 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:19:16 np0005625204.localdomain podman[81679]: 2026-02-20 08:19:16.190235034 +0000 UTC m=+0.128330880 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:19:16 np0005625204.localdomain podman[81677]: 2026-02-20 08:19:16.248964208 +0000 UTC m=+0.189033933 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com)
Feb 20 08:19:16 np0005625204.localdomain podman[81677]: 2026-02-20 08:19:16.272026669 +0000 UTC m=+0.212096414 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team)
Feb 20 08:19:16 np0005625204.localdomain sshd[81675]: Disconnecting invalid user momoru 185.246.128.171 port 24096: Change of username or service not allowed: (momoru,ssh-connection) -> (ganesh,ssh-connection) [preauth]
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:19:16 np0005625204.localdomain podman[81679]: 2026-02-20 08:19:16.32240867 +0000 UTC m=+0.260504576 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, container_name=collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true)
Feb 20 08:19:16 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:19:16 np0005625204.localdomain sshd[81768]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:17 np0005625204.localdomain systemd[1]: tmp-crun.TlVR7l.mount: Deactivated successfully.
Feb 20 08:19:19 np0005625204.localdomain sshd[81770]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:19 np0005625204.localdomain sshd[81770]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:19:20 np0005625204.localdomain sshd[81768]: Invalid user ganesh from 185.246.128.171 port 38977
Feb 20 08:19:20 np0005625204.localdomain sudo[81772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:19:20 np0005625204.localdomain sudo[81772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:19:20 np0005625204.localdomain sudo[81772]: pam_unix(sudo:session): session closed for user root
Feb 20 08:19:20 np0005625204.localdomain sudo[81787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:19:20 np0005625204.localdomain sudo[81787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:19:21 np0005625204.localdomain podman[81819]: 2026-02-20 08:19:21.13441151 +0000 UTC m=+0.064096749 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step5, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:19:21 np0005625204.localdomain podman[81816]: 2026-02-20 08:19:21.172380863 +0000 UTC m=+0.105526147 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:19:21 np0005625204.localdomain sshd[81768]: Disconnecting invalid user ganesh 185.246.128.171 port 38977: Change of username or service not allowed: (ganesh,ssh-connection) -> (liuj,ssh-connection) [preauth]
Feb 20 08:19:21 np0005625204.localdomain podman[81817]: 2026-02-20 08:19:21.247270108 +0000 UTC m=+0.180381981 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, container_name=iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, batch=17.1_20260112.1)
Feb 20 08:19:21 np0005625204.localdomain sudo[81787]: pam_unix(sudo:session): session closed for user root
Feb 20 08:19:21 np0005625204.localdomain podman[81817]: 2026-02-20 08:19:21.258932843 +0000 UTC m=+0.192044736 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack 
Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team)
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:19:21 np0005625204.localdomain podman[81816]: 2026-02-20 08:19:21.268856594 +0000 UTC m=+0.202001858 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, release=1766032510, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:19:21 np0005625204.localdomain podman[81819]: 2026-02-20 08:19:21.309575261 +0000 UTC m=+0.239260430 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:19:21 np0005625204.localdomain podman[81818]: 2026-02-20 08:19:21.398217534 +0000 UTC m=+0.328325126 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public)
Feb 20 08:19:21 np0005625204.localdomain podman[81818]: 2026-02-20 08:19:21.444184241 +0000 UTC m=+0.374291823 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, container_name=ovn_metadata_agent, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:19:21 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:19:22 np0005625204.localdomain systemd[1]: tmp-crun.DQK7OI.mount: Deactivated successfully.
Feb 20 08:19:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:19:22 np0005625204.localdomain podman[81922]: 2026-02-20 08:19:22.245402012 +0000 UTC m=+0.087392256 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:19:22 np0005625204.localdomain podman[81922]: 2026-02-20 08:19:22.645222018 +0000 UTC m=+0.487212242 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, release=1766032510)
Feb 20 08:19:22 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:19:22 np0005625204.localdomain sshd[81945]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:23 np0005625204.localdomain sshd[81945]: Invalid user liuj from 185.246.128.171 port 6367
Feb 20 08:19:25 np0005625204.localdomain sshd[81945]: Disconnecting invalid user liuj 185.246.128.171 port 6367: Change of username or service not allowed: (liuj,ssh-connection) -> (theta,ssh-connection) [preauth]
Feb 20 08:19:25 np0005625204.localdomain sudo[81948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:19:25 np0005625204.localdomain sudo[81948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:19:25 np0005625204.localdomain sudo[81948]: pam_unix(sudo:session): session closed for user root
Feb 20 08:19:26 np0005625204.localdomain sshd[81963]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:26 np0005625204.localdomain sshd[81963]: Invalid user theta from 185.246.128.171 port 54946
Feb 20 08:19:26 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:19:26 np0005625204.localdomain recover_tripleo_nova_virtqemud[81966]: 63005
Feb 20 08:19:26 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:19:26 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:19:27 np0005625204.localdomain sshd[81963]: Disconnecting invalid user theta 185.246.128.171 port 54946: Change of username or service not allowed: (theta,ssh-connection) -> (squid,ssh-connection) [preauth]
Feb 20 08:19:27 np0005625204.localdomain sshd[81967]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:30 np0005625204.localdomain sshd[81967]: Invalid user squid from 185.246.128.171 port 21157
Feb 20 08:19:30 np0005625204.localdomain sshd[81967]: Disconnecting invalid user squid 185.246.128.171 port 21157: Change of username or service not allowed: (squid,ssh-connection) -> (abigail,ssh-connection) [preauth]
Feb 20 08:19:30 np0005625204.localdomain sudo[81982]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxdofqabsuwauqvqchlvbgavxhragbqo ; /usr/bin/python3
Feb 20 08:19:30 np0005625204.localdomain sudo[81982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 08:19:30 np0005625204.localdomain python3[81984]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 20 08:19:31 np0005625204.localdomain sshd[81986]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 08:19:34 np0005625204.localdomain sshd[81986]: Invalid user abigail from 185.246.128.171 port 30767
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: run-r20ebb92b4d124708b26a802eb60ecf24.service: Deactivated successfully.
Feb 20 08:19:34 np0005625204.localdomain systemd[1]: run-r82e63e13d0294acca6ca1ab412f7b379.service: Deactivated successfully.
Feb 20 08:19:34 np0005625204.localdomain sshd[81986]: Disconnecting invalid user abigail 185.246.128.171 port 30767: Change of username or service not allowed: (abigail,ssh-connection) -> (jane,ssh-connection) [preauth]
Feb 20 08:19:35 np0005625204.localdomain sudo[81982]: pam_unix(sudo:session): session closed for user root
Feb 20 08:19:36 np0005625204.localdomain sshd[82139]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:37 np0005625204.localdomain sshd[82139]: Invalid user jane from 185.246.128.171 port 38233
Feb 20 08:19:38 np0005625204.localdomain sshd[82139]: Disconnecting invalid user jane 185.246.128.171 port 38233: Change of username or service not allowed: (jane,ssh-connection) -> (pi,ssh-connection) [preauth]
Feb 20 08:19:38 np0005625204.localdomain sshd[82141]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:39 np0005625204.localdomain sshd[82141]: Invalid user pi from 185.246.128.171 port 28542
Feb 20 08:19:42 np0005625204.localdomain sshd[82141]: error: maximum authentication attempts exceeded for invalid user pi from 185.246.128.171 port 28542 ssh2 [preauth]
Feb 20 08:19:42 np0005625204.localdomain sshd[82141]: Disconnecting invalid user pi 185.246.128.171 port 28542: Too many authentication failures [preauth]
Feb 20 08:19:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:19:42 np0005625204.localdomain podman[82143]: 2026-02-20 08:19:42.704533519 +0000 UTC m=+0.220701207 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, version=17.1.13, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:19:42 np0005625204.localdomain podman[82143]: 2026-02-20 08:19:42.903120552 +0000 UTC m=+0.419288260 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:19:42 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:19:43 np0005625204.localdomain sshd[82173]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:45 np0005625204.localdomain sshd[82173]: Invalid user pi from 185.246.128.171 port 20533
Feb 20 08:19:46 np0005625204.localdomain sshd[82173]: Disconnecting invalid user pi 185.246.128.171 port 20533: Change of username or service not allowed: (pi,ssh-connection) -> (wiki,ssh-connection) [preauth]
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:19:46 np0005625204.localdomain podman[82175]: 2026-02-20 08:19:46.414503178 +0000 UTC m=+0.096390509 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:19:46 np0005625204.localdomain podman[82175]: 2026-02-20 08:19:46.449132221 +0000 UTC m=+0.131019552 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, architecture=x86_64)
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: tmp-crun.9glfMt.mount: Deactivated successfully.
Feb 20 08:19:46 np0005625204.localdomain podman[82210]: 2026-02-20 08:19:46.551681946 +0000 UTC m=+0.136557469 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 20 08:19:46 np0005625204.localdomain podman[82210]: 2026-02-20 08:19:46.562106653 +0000 UTC m=+0.146982186 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z)
Feb 20 08:19:46 np0005625204.localdomain podman[82176]: 2026-02-20 08:19:46.521531119 +0000 UTC m=+0.199357037 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true)
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:19:46 np0005625204.localdomain podman[82177]: 2026-02-20 08:19:46.603198361 +0000 UTC m=+0.278092370 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 20 08:19:46 np0005625204.localdomain podman[82176]: 2026-02-20 08:19:46.65156036 +0000 UTC m=+0.329386348 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:47Z, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, vcs-type=git, distribution-scope=public)
Feb 20 08:19:46 np0005625204.localdomain podman[82177]: 2026-02-20 08:19:46.660224514 +0000 UTC m=+0.335118523 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:19:46 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:19:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:19:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4463 writes, 20K keys, 4463 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4463 writes, 468 syncs, 9.54 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:19:48 np0005625204.localdomain sshd[82267]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:48 np0005625204.localdomain sshd[82269]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:49 np0005625204.localdomain sshd[82267]: Invalid user claude from 152.32.189.21 port 45494
Feb 20 08:19:49 np0005625204.localdomain sshd[82267]: Received disconnect from 152.32.189.21 port 45494:11: Bye Bye [preauth]
Feb 20 08:19:49 np0005625204.localdomain sshd[82267]: Disconnected from invalid user claude 152.32.189.21 port 45494 [preauth]
Feb 20 08:19:51 np0005625204.localdomain sshd[82269]: Invalid user wiki from 185.246.128.171 port 57130
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:19:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:19:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 5194 writes, 22K keys, 5194 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5194 writes, 621 syncs, 8.36 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:19:51 np0005625204.localdomain podman[82314]: 2026-02-20 08:19:51.448720229 +0000 UTC m=+0.082818056 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, tcib_managed=true)
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: tmp-crun.SLE0XG.mount: Deactivated successfully.
Feb 20 08:19:51 np0005625204.localdomain podman[82314]: 2026-02-20 08:19:51.511112945 +0000 UTC m=+0.145210762 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:36:40Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:19:51 np0005625204.localdomain podman[82315]: 2026-02-20 08:19:51.566849968 +0000 UTC m=+0.196726808 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3)
Feb 20 08:19:51 np0005625204.localdomain podman[82315]: 2026-02-20 08:19:51.578958506 +0000 UTC m=+0.208835346 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, distribution-scope=public, url=https://www.redhat.com)
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:19:51 np0005625204.localdomain podman[82359]: 2026-02-20 08:19:51.58897538 +0000 UTC m=+0.096440110 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, 
build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510)
Feb 20 08:19:51 np0005625204.localdomain podman[82316]: 2026-02-20 08:19:51.516796448 +0000 UTC m=+0.143289745 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5)
Feb 20 08:19:51 np0005625204.localdomain podman[82316]: 2026-02-20 08:19:51.648003073 +0000 UTC m=+0.274496330 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:19:51 np0005625204.localdomain podman[82359]: 2026-02-20 08:19:51.670232979 +0000 UTC m=+0.177697709 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z)
Feb 20 08:19:51 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:19:52 np0005625204.localdomain sshd[82269]: Disconnecting invalid user wiki 185.246.128.171 port 57130: Change of username or service not allowed: (wiki,ssh-connection) -> (vodafone,ssh-connection) [preauth]
Feb 20 08:19:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:19:53 np0005625204.localdomain podman[82406]: 2026-02-20 08:19:53.139785235 +0000 UTC m=+0.077258259 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 20 08:19:53 np0005625204.localdomain podman[82406]: 2026-02-20 08:19:53.512995873 +0000 UTC m=+0.450468907 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:19:53 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:19:54 np0005625204.localdomain sshd[82429]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:19:56 np0005625204.localdomain sshd[82429]: Invalid user vodafone from 185.246.128.171 port 43496
Feb 20 08:19:57 np0005625204.localdomain sshd[82429]: Disconnecting invalid user vodafone 185.246.128.171 port 43496: Change of username or service not allowed: (vodafone,ssh-connection) -> (www-data,ssh-connection) [preauth]
Feb 20 08:19:58 np0005625204.localdomain sshd[82431]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:00 np0005625204.localdomain sshd[82433]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:01 np0005625204.localdomain sshd[82431]: Invalid user www-data from 185.246.128.171 port 8944
Feb 20 08:20:01 np0005625204.localdomain sshd[82433]: Invalid user shreyas from 77.232.138.190 port 44122
Feb 20 08:20:01 np0005625204.localdomain sshd[82433]: Received disconnect from 77.232.138.190 port 44122:11: Bye Bye [preauth]
Feb 20 08:20:01 np0005625204.localdomain sshd[82433]: Disconnected from invalid user shreyas 77.232.138.190 port 44122 [preauth]
Feb 20 08:20:05 np0005625204.localdomain sshd[82431]: Disconnecting invalid user www-data 185.246.128.171 port 8944: Change of username or service not allowed: (www-data,ssh-connection) -> (router,ssh-connection) [preauth]
Feb 20 08:20:06 np0005625204.localdomain sshd[82435]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:07 np0005625204.localdomain sshd[82436]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:07 np0005625204.localdomain sshd[82436]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:20:08 np0005625204.localdomain sshd[82435]: Invalid user router from 185.246.128.171 port 21957
Feb 20 08:20:09 np0005625204.localdomain sshd[82435]: Disconnecting invalid user router 185.246.128.171 port 21957: Change of username or service not allowed: (router,ssh-connection) -> (es1,ssh-connection) [preauth]
Feb 20 08:20:09 np0005625204.localdomain sshd[82439]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:10 np0005625204.localdomain sshd[82439]: Invalid user es1 from 185.246.128.171 port 8279
Feb 20 08:20:11 np0005625204.localdomain sshd[82439]: Disconnecting invalid user es1 185.246.128.171 port 8279: Change of username or service not allowed: (es1,ssh-connection) -> (telnet,ssh-connection) [preauth]
Feb 20 08:20:11 np0005625204.localdomain sshd[82441]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:12 np0005625204.localdomain sudo[82456]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtubeupszeetjzdepnwmwzqwxoqjvxii ; /usr/bin/python3
Feb 20 08:20:12 np0005625204.localdomain sudo[82456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 08:20:12 np0005625204.localdomain python3[82458]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 08:20:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:20:13 np0005625204.localdomain sshd[82441]: Invalid user telnet from 185.246.128.171 port 20394
Feb 20 08:20:13 np0005625204.localdomain systemd[1]: tmp-crun.GIIyPv.mount: Deactivated successfully.
Feb 20 08:20:13 np0005625204.localdomain podman[82461]: 2026-02-20 08:20:13.139961079 +0000 UTC m=+0.080197086 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:20:13 np0005625204.localdomain podman[82461]: 2026-02-20 08:20:13.333691765 +0000 UTC m=+0.273927802 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=)
Feb 20 08:20:13 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:20:13 np0005625204.localdomain sshd[82441]: Disconnecting invalid user telnet 185.246.128.171 port 20394: Change of username or service not allowed: (telnet,ssh-connection) -> (gns3,ssh-connection) [preauth]
Feb 20 08:20:14 np0005625204.localdomain sshd[82490]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:15 np0005625204.localdomain sshd[82490]: Invalid user gns3 from 185.246.128.171 port 6243
Feb 20 08:20:15 np0005625204.localdomain rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 08:20:15 np0005625204.localdomain sshd[82490]: Disconnecting invalid user gns3 185.246.128.171 port 6243: Change of username or service not allowed: (gns3,ssh-connection) -> (hasan,ssh-connection) [preauth]
Feb 20 08:20:16 np0005625204.localdomain sshd[82616]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: tmp-crun.dlhQ31.mount: Deactivated successfully.
Feb 20 08:20:17 np0005625204.localdomain podman[82620]: 2026-02-20 08:20:17.156765052 +0000 UTC m=+0.092518932 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 20 08:20:17 np0005625204.localdomain podman[82621]: 2026-02-20 08:20:17.203255245 +0000 UTC m=+0.136099846 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vendor=Red Hat, Inc., container_name=logrotate_crond, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z)
Feb 20 08:20:17 np0005625204.localdomain podman[82620]: 2026-02-20 08:20:17.214793045 +0000 UTC m=+0.150546895 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:20:17 np0005625204.localdomain podman[82623]: 2026-02-20 08:20:17.264381362 +0000 UTC m=+0.190880281 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 20 08:20:17 np0005625204.localdomain podman[82622]: 2026-02-20 08:20:17.315166505 +0000 UTC m=+0.245081648 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:20:17 np0005625204.localdomain podman[82623]: 2026-02-20 08:20:17.324062385 +0000 UTC m=+0.250561244 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, version=17.1.13)
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:20:17 np0005625204.localdomain podman[82621]: 2026-02-20 08:20:17.34100826 +0000 UTC m=+0.273852911 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:20:17 np0005625204.localdomain podman[82622]: 2026-02-20 08:20:17.375485867 +0000 UTC m=+0.305401010 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:20:17 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:20:17 np0005625204.localdomain sshd[82616]: Invalid user hasan from 185.246.128.171 port 26760
Feb 20 08:20:17 np0005625204.localdomain sshd[82616]: Disconnecting invalid user hasan 185.246.128.171 port 26760: Change of username or service not allowed: (hasan,ssh-connection) -> (log,ssh-connection) [preauth]
Feb 20 08:20:19 np0005625204.localdomain sudo[82456]: pam_unix(sudo:session): session closed for user root
Feb 20 08:20:20 np0005625204.localdomain sshd[82765]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: tmp-crun.Bbl1Zk.mount: Deactivated successfully.
Feb 20 08:20:22 np0005625204.localdomain podman[82769]: 2026-02-20 08:20:22.158970812 +0000 UTC m=+0.088951154 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: tmp-crun.tEMcsT.mount: Deactivated successfully.
Feb 20 08:20:22 np0005625204.localdomain podman[82768]: 2026-02-20 08:20:22.206385612 +0000 UTC m=+0.141057907 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, distribution-scope=public, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:20:22 np0005625204.localdomain podman[82768]: 2026-02-20 08:20:22.249990977 +0000 UTC m=+0.184663292 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, 
vcs-type=git, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:20:22 np0005625204.localdomain podman[82770]: 2026-02-20 08:20:22.260830126 +0000 UTC m=+0.185687942 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=)
Feb 20 08:20:22 np0005625204.localdomain podman[82767]: 2026-02-20 08:20:22.310619579 +0000 UTC m=+0.244248701 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com)
Feb 20 08:20:22 np0005625204.localdomain podman[82770]: 2026-02-20 08:20:22.31624428 +0000 UTC m=+0.241102046 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, version=17.1.13, container_name=nova_compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:20:22 np0005625204.localdomain podman[82767]: 2026-02-20 08:20:22.334625449 +0000 UTC m=+0.268254541 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:20:22 np0005625204.localdomain podman[82769]: 2026-02-20 08:20:22.391039942 +0000 UTC m=+0.321020274 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 20 08:20:22 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:20:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:20:24 np0005625204.localdomain podman[82859]: 2026-02-20 08:20:24.142350817 +0000 UTC m=+0.081187087 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
container_name=nova_migration_target, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:20:24 np0005625204.localdomain sshd[82765]: Invalid user log from 185.246.128.171 port 50636
Feb 20 08:20:24 np0005625204.localdomain podman[82859]: 2026-02-20 08:20:24.484267865 +0000 UTC m=+0.423104095 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, container_name=nova_migration_target, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 20 08:20:24 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:20:24 np0005625204.localdomain sshd[82765]: Disconnecting invalid user log 185.246.128.171 port 50636: Change of username or service not allowed: (log,ssh-connection) -> (Matthew,ssh-connection) [preauth]
Feb 20 08:20:25 np0005625204.localdomain sudo[82882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:20:25 np0005625204.localdomain sudo[82882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:20:25 np0005625204.localdomain sudo[82882]: pam_unix(sudo:session): session closed for user root
Feb 20 08:20:25 np0005625204.localdomain sudo[82897]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 08:20:25 np0005625204.localdomain sudo[82897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:20:25 np0005625204.localdomain sudo[82897]: pam_unix(sudo:session): session closed for user root
Feb 20 08:20:25 np0005625204.localdomain sudo[82932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:20:25 np0005625204.localdomain sudo[82932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:20:25 np0005625204.localdomain sudo[82932]: pam_unix(sudo:session): session closed for user root
Feb 20 08:20:25 np0005625204.localdomain sshd[82946]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:26 np0005625204.localdomain sudo[82948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:20:26 np0005625204.localdomain sudo[82948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:20:26 np0005625204.localdomain sshd[82978]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:26 np0005625204.localdomain sudo[82948]: pam_unix(sudo:session): session closed for user root
Feb 20 08:20:27 np0005625204.localdomain sudo[82997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:20:27 np0005625204.localdomain sudo[82997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:20:27 np0005625204.localdomain sudo[82997]: pam_unix(sudo:session): session closed for user root
Feb 20 08:20:27 np0005625204.localdomain sshd[82978]: Invalid user Matthew from 185.246.128.171 port 14123
Feb 20 08:20:28 np0005625204.localdomain sshd[82978]: Disconnecting invalid user Matthew 185.246.128.171 port 14123: Change of username or service not allowed: (Matthew,ssh-connection) -> (mohammad,ssh-connection) [preauth]
Feb 20 08:20:30 np0005625204.localdomain sshd[83012]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:32 np0005625204.localdomain sshd[83012]: Invalid user mohammad from 185.246.128.171 port 6736
Feb 20 08:20:32 np0005625204.localdomain sshd[83012]: Disconnecting invalid user mohammad 185.246.128.171 port 6736: Change of username or service not allowed: (mohammad,ssh-connection) -> (ds,ssh-connection) [preauth]
Feb 20 08:20:33 np0005625204.localdomain sshd[83014]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:34 np0005625204.localdomain sshd[83016]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:36 np0005625204.localdomain sshd[83014]: Received disconnect from 103.157.25.4 port 45736:11: Bye Bye [preauth]
Feb 20 08:20:36 np0005625204.localdomain sshd[83014]: Disconnected from authenticating user root 103.157.25.4 port 45736 [preauth]
Feb 20 08:20:36 np0005625204.localdomain sshd[83016]: Invalid user ds from 185.246.128.171 port 32274
Feb 20 08:20:36 np0005625204.localdomain sshd[82946]: error: kex_exchange_identification: read: Connection timed out
Feb 20 08:20:36 np0005625204.localdomain sshd[82946]: banner exchange: Connection from 115.190.172.63 port 55236: Connection timed out
Feb 20 08:20:36 np0005625204.localdomain sshd[83016]: Disconnecting invalid user ds 185.246.128.171 port 32274: Change of username or service not allowed: (ds,ssh-connection) -> (keycloak,ssh-connection) [preauth]
Feb 20 08:20:37 np0005625204.localdomain sshd[83018]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:39 np0005625204.localdomain sshd[83018]: Invalid user keycloak from 185.246.128.171 port 9963
Feb 20 08:20:39 np0005625204.localdomain sshd[83018]: Disconnecting invalid user keycloak 185.246.128.171 port 9963: Change of username or service not allowed: (keycloak,ssh-connection) -> (minecraft,ssh-connection) [preauth]
Feb 20 08:20:40 np0005625204.localdomain sshd[83020]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:42 np0005625204.localdomain sshd[83020]: Invalid user minecraft from 185.246.128.171 port 45390
Feb 20 08:20:44 np0005625204.localdomain sshd[83020]: Disconnecting invalid user minecraft 185.246.128.171 port 45390: Change of username or service not allowed: (minecraft,ssh-connection) -> (frontend,ssh-connection) [preauth]
Feb 20 08:20:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:20:44 np0005625204.localdomain podman[83022]: 2026-02-20 08:20:44.14751627 +0000 UTC m=+0.083773797 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., tcib_managed=true, container_name=metrics_qdr, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1)
Feb 20 08:20:44 np0005625204.localdomain podman[83022]: 2026-02-20 08:20:44.317729186 +0000 UTC m=+0.253986783 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, container_name=metrics_qdr, release=1766032510, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 20 08:20:44 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:20:45 np0005625204.localdomain sshd[83051]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:48 np0005625204.localdomain sshd[83051]: Invalid user frontend from 185.246.128.171 port 20429
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:20:48 np0005625204.localdomain podman[83053]: 2026-02-20 08:20:48.117691805 +0000 UTC m=+0.078617859 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:20:48 np0005625204.localdomain podman[83053]: 2026-02-20 08:20:48.170361815 +0000 UTC m=+0.131287849 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public)
Feb 20 08:20:48 np0005625204.localdomain podman[83055]: 2026-02-20 08:20:48.180022981 +0000 UTC m=+0.134851968 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public)
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:20:48 np0005625204.localdomain podman[83055]: 2026-02-20 08:20:48.219755734 +0000 UTC m=+0.174584751 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:20:48 np0005625204.localdomain podman[83054]: 2026-02-20 08:20:48.230595457 +0000 UTC m=+0.186440915 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, com.redhat.component=openstack-cron-container, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:20:48 np0005625204.localdomain podman[83056]: 2026-02-20 08:20:48.299027432 +0000 UTC m=+0.251002701 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:20:48 np0005625204.localdomain podman[83054]: 2026-02-20 08:20:48.315291473 +0000 UTC m=+0.271136991 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:20:48 np0005625204.localdomain podman[83056]: 2026-02-20 08:20:48.335173614 +0000 UTC m=+0.287148853 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:20:48 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:20:49 np0005625204.localdomain sshd[83051]: Disconnecting invalid user frontend 185.246.128.171 port 20429: Change of username or service not allowed: (frontend,ssh-connection) -> (admin,ssh-connection) [preauth]
Feb 20 08:20:51 np0005625204.localdomain sshd[83146]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:52 np0005625204.localdomain sshd[83146]: Invalid user admin from 185.246.128.171 port 4979
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:20:53 np0005625204.localdomain podman[83174]: 2026-02-20 08:20:53.166742413 +0000 UTC m=+0.092052772 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:20:53 np0005625204.localdomain podman[83172]: 2026-02-20 08:20:53.211931314 +0000 UTC m=+0.145758615 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Feb 20 08:20:53 np0005625204.localdomain podman[83172]: 2026-02-20 08:20:53.221127026 +0000 UTC m=+0.154954327 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid)
Feb 20 08:20:53 np0005625204.localdomain podman[83173]: 2026-02-20 08:20:53.257576817 +0000 UTC m=+0.187756156 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 20 08:20:53 np0005625204.localdomain podman[83174]: 2026-02-20 08:20:53.271938629 +0000 UTC m=+0.197248928 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:20:53 np0005625204.localdomain podman[83173]: 2026-02-20 08:20:53.308092611 +0000 UTC m=+0.238271900 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:20:53 np0005625204.localdomain podman[83171]: 2026-02-20 08:20:53.318486361 +0000 UTC m=+0.253040055 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, container_name=ovn_controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:20:53 np0005625204.localdomain podman[83171]: 2026-02-20 08:20:53.348450332 +0000 UTC m=+0.283004046 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public)
Feb 20 08:20:53 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:20:53 np0005625204.localdomain sshd[83146]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 4979 ssh2 [preauth]
Feb 20 08:20:53 np0005625204.localdomain sshd[83146]: Disconnecting invalid user admin 185.246.128.171 port 4979: Too many authentication failures [preauth]
Feb 20 08:20:54 np0005625204.localdomain sshd[83282]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:54 np0005625204.localdomain sshd[83282]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:20:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:20:54 np0005625204.localdomain systemd[1]: tmp-crun.tqPNHc.mount: Deactivated successfully.
Feb 20 08:20:54 np0005625204.localdomain podman[83284]: 2026-02-20 08:20:54.851603126 +0000 UTC m=+0.085596484 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 20 08:20:55 np0005625204.localdomain sshd[83308]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:20:55 np0005625204.localdomain podman[83284]: 2026-02-20 08:20:55.220001907 +0000 UTC m=+0.453995275 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:32:04Z, version=17.1.13, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 20 08:20:55 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:20:56 np0005625204.localdomain sshd[83308]: Invalid user admin from 185.246.128.171 port 11009
Feb 20 08:20:57 np0005625204.localdomain sshd[83308]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 11009 ssh2 [preauth]
Feb 20 08:20:57 np0005625204.localdomain sshd[83308]: Disconnecting invalid user admin 185.246.128.171 port 11009: Too many authentication failures [preauth]
Feb 20 08:20:58 np0005625204.localdomain sshd[83311]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:00 np0005625204.localdomain sshd[83311]: Invalid user admin from 185.246.128.171 port 46609
Feb 20 08:21:02 np0005625204.localdomain sshd[83311]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 46609 ssh2 [preauth]
Feb 20 08:21:02 np0005625204.localdomain sshd[83311]: Disconnecting invalid user admin 185.246.128.171 port 46609: Too many authentication failures [preauth]
Feb 20 08:21:02 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:21:02 np0005625204.localdomain recover_tripleo_nova_virtqemud[83314]: 63005
Feb 20 08:21:02 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:21:02 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:21:02 np0005625204.localdomain sudo[83328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnbgpcmzvdogztnhwdyjwhbfzsgsfxju ; /usr/bin/python3
Feb 20 08:21:02 np0005625204.localdomain sudo[83328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 08:21:02 np0005625204.localdomain python3[83330]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 08:21:03 np0005625204.localdomain sshd[83334]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:05 np0005625204.localdomain rhsm-service[6644]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 20 08:21:07 np0005625204.localdomain sshd[83334]: Invalid user admin from 185.246.128.171 port 9412
Feb 20 08:21:09 np0005625204.localdomain sshd[83334]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 9412 ssh2 [preauth]
Feb 20 08:21:09 np0005625204.localdomain sshd[83334]: Disconnecting invalid user admin 185.246.128.171 port 9412: Too many authentication failures [preauth]
Feb 20 08:21:09 np0005625204.localdomain sudo[83328]: pam_unix(sudo:session): session closed for user root
Feb 20 08:21:09 np0005625204.localdomain sshd[83519]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:12 np0005625204.localdomain sshd[83519]: Invalid user admin from 185.246.128.171 port 20638
Feb 20 08:21:14 np0005625204.localdomain sshd[83519]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 20638 ssh2 [preauth]
Feb 20 08:21:14 np0005625204.localdomain sshd[83519]: Disconnecting invalid user admin 185.246.128.171 port 20638: Too many authentication failures [preauth]
Feb 20 08:21:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:21:14 np0005625204.localdomain podman[83521]: 2026-02-20 08:21:14.679318237 +0000 UTC m=+0.074223404 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:21:14 np0005625204.localdomain podman[83521]: 2026-02-20 08:21:14.878946717 +0000 UTC m=+0.273851884 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5)
Feb 20 08:21:14 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:21:15 np0005625204.localdomain sshd[83549]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:18 np0005625204.localdomain sshd[83549]: Invalid user admin from 185.246.128.171 port 45536
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: tmp-crun.Sw9Hmr.mount: Deactivated successfully.
Feb 20 08:21:18 np0005625204.localdomain podman[83552]: 2026-02-20 08:21:18.357238163 +0000 UTC m=+0.088386879 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, container_name=collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:21:18 np0005625204.localdomain podman[83551]: 2026-02-20 08:21:18.403386252 +0000 UTC m=+0.137831530 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, vcs-type=git)
Feb 20 08:21:18 np0005625204.localdomain podman[83552]: 2026-02-20 08:21:18.425873354 +0000 UTC m=+0.157022100 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, vcs-type=git)
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:21:18 np0005625204.localdomain podman[83551]: 2026-02-20 08:21:18.457762775 +0000 UTC m=+0.192208153 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, release=1766032510, container_name=ceilometer_agent_ipmi, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:21:18 np0005625204.localdomain podman[83581]: 2026-02-20 08:21:18.520698591 +0000 UTC m=+0.134362574 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, version=17.1.13, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z)
Feb 20 08:21:18 np0005625204.localdomain podman[83581]: 2026-02-20 08:21:18.531989268 +0000 UTC m=+0.145653291 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:21:18 np0005625204.localdomain podman[83582]: 2026-02-20 08:21:18.622707048 +0000 UTC m=+0.232392799 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5)
Feb 20 08:21:18 np0005625204.localdomain podman[83582]: 2026-02-20 08:21:18.682047094 +0000 UTC m=+0.291732875 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, tcib_managed=true, vcs-type=git)
Feb 20 08:21:18 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:21:22 np0005625204.localdomain python3[83657]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 20 08:21:23 np0005625204.localdomain sshd[83549]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 45536 ssh2 [preauth]
Feb 20 08:21:23 np0005625204.localdomain sshd[83549]: Disconnecting invalid user admin 185.246.128.171 port 45536: Too many authentication failures [preauth]
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:21:23 np0005625204.localdomain podman[83659]: 2026-02-20 08:21:23.428599159 +0000 UTC m=+0.084594893 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:21:23 np0005625204.localdomain podman[83658]: 2026-02-20 08:21:23.484685834 +0000 UTC m=+0.141101441 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public)
Feb 20 08:21:23 np0005625204.localdomain podman[83659]: 2026-02-20 08:21:23.484598121 +0000 UTC m=+0.140593885 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, container_name=nova_compute, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13)
Feb 20 08:21:23 np0005625204.localdomain podman[83691]: 2026-02-20 08:21:23.543939506 +0000 UTC m=+0.087350448 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., 
vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:21:23 np0005625204.localdomain podman[83658]: 2026-02-20 08:21:23.556984568 +0000 UTC m=+0.213400185 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:21:23 np0005625204.localdomain podman[83691]: 2026-02-20 08:21:23.570287426 +0000 UTC m=+0.113698428 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:21:23 np0005625204.localdomain podman[83695]: 2026-02-20 08:21:23.558729131 +0000 UTC m=+0.099122090 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510)
Feb 20 08:21:23 np0005625204.localdomain podman[83695]: 2026-02-20 08:21:23.638915717 +0000 UTC m=+0.179308636 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, 
com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, architecture=x86_64, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:21:23 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:21:23 np0005625204.localdomain sshd[83747]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:24 np0005625204.localdomain systemd[1]: tmp-crun.JPhOOx.mount: Deactivated successfully.
Feb 20 08:21:24 np0005625204.localdomain sshd[83747]: Invalid user admin from 185.246.128.171 port 34440
Feb 20 08:21:25 np0005625204.localdomain sshd[83749]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:21:26 np0005625204.localdomain sshd[83747]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 34440 ssh2 [preauth]
Feb 20 08:21:26 np0005625204.localdomain sshd[83747]: Disconnecting invalid user admin 185.246.128.171 port 34440: Too many authentication failures [preauth]
Feb 20 08:21:26 np0005625204.localdomain podman[83751]: 2026-02-20 08:21:26.143873745 +0000 UTC m=+0.083070826 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public)
Feb 20 08:21:26 np0005625204.localdomain podman[83751]: 2026-02-20 08:21:26.508705886 +0000 UTC m=+0.447902927 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, 
build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:21:26 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:21:26 np0005625204.localdomain sshd[83749]: Invalid user robin from 101.36.109.176 port 38580
Feb 20 08:21:26 np0005625204.localdomain sshd[83749]: Received disconnect from 101.36.109.176 port 38580:11: Bye Bye [preauth]
Feb 20 08:21:26 np0005625204.localdomain sshd[83749]: Disconnected from invalid user robin 101.36.109.176 port 38580 [preauth]
Feb 20 08:21:27 np0005625204.localdomain sudo[83774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:21:27 np0005625204.localdomain sudo[83774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:21:27 np0005625204.localdomain sudo[83774]: pam_unix(sudo:session): session closed for user root
Feb 20 08:21:27 np0005625204.localdomain sudo[83789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:21:27 np0005625204.localdomain sudo[83789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:21:27 np0005625204.localdomain sshd[83804]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:28 np0005625204.localdomain sudo[83789]: pam_unix(sudo:session): session closed for user root
Feb 20 08:21:28 np0005625204.localdomain sshd[83804]: Invalid user admin from 185.246.128.171 port 31410
Feb 20 08:21:28 np0005625204.localdomain sudo[83839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:21:28 np0005625204.localdomain sudo[83839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:21:28 np0005625204.localdomain sudo[83839]: pam_unix(sudo:session): session closed for user root
Feb 20 08:21:30 np0005625204.localdomain sshd[83804]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 31410 ssh2 [preauth]
Feb 20 08:21:30 np0005625204.localdomain sshd[83804]: Disconnecting invalid user admin 185.246.128.171 port 31410: Too many authentication failures [preauth]
Feb 20 08:21:32 np0005625204.localdomain sshd[83854]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:33 np0005625204.localdomain sshd[83854]: Invalid user admin from 185.246.128.171 port 24638
Feb 20 08:21:35 np0005625204.localdomain sshd[83854]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 24638 ssh2 [preauth]
Feb 20 08:21:35 np0005625204.localdomain sshd[83854]: Disconnecting invalid user admin 185.246.128.171 port 24638: Too many authentication failures [preauth]
Feb 20 08:21:36 np0005625204.localdomain sshd[83856]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:37 np0005625204.localdomain sshd[83856]: Invalid user admin from 185.246.128.171 port 35615
Feb 20 08:21:39 np0005625204.localdomain sshd[83856]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 35615 ssh2 [preauth]
Feb 20 08:21:39 np0005625204.localdomain sshd[83856]: Disconnecting invalid user admin 185.246.128.171 port 35615: Too many authentication failures [preauth]
Feb 20 08:21:39 np0005625204.localdomain sshd[83858]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:40 np0005625204.localdomain sshd[83860]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:40 np0005625204.localdomain sshd[83860]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:21:41 np0005625204.localdomain sshd[83858]: Invalid user admin from 185.246.128.171 port 20927
Feb 20 08:21:44 np0005625204.localdomain sshd[83858]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 20927 ssh2 [preauth]
Feb 20 08:21:44 np0005625204.localdomain sshd[83858]: Disconnecting invalid user admin 185.246.128.171 port 20927: Too many authentication failures [preauth]
Feb 20 08:21:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:21:45 np0005625204.localdomain podman[83862]: 2026-02-20 08:21:45.14758491 +0000 UTC m=+0.082133326 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:21:45 np0005625204.localdomain podman[83862]: 2026-02-20 08:21:45.350959935 +0000 UTC m=+0.285508431 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, 
cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, vcs-type=git, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=)
Feb 20 08:21:45 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:21:46 np0005625204.localdomain sshd[83892]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:47 np0005625204.localdomain sshd[83892]: Invalid user admin from 185.246.128.171 port 37045
Feb 20 08:21:48 np0005625204.localdomain sshd[83892]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 37045 ssh2 [preauth]
Feb 20 08:21:48 np0005625204.localdomain sshd[83892]: Disconnecting invalid user admin 185.246.128.171 port 37045: Too many authentication failures [preauth]
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: tmp-crun.CNlYYr.mount: Deactivated successfully.
Feb 20 08:21:48 np0005625204.localdomain podman[83894]: 2026-02-20 08:21:48.814779086 +0000 UTC m=+0.090401351 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510)
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: tmp-crun.tlyn5n.mount: Deactivated successfully.
Feb 20 08:21:48 np0005625204.localdomain podman[83895]: 2026-02-20 08:21:48.831475849 +0000 UTC m=+0.099203552 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.component=openstack-cron-container, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=logrotate_crond, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:21:48 np0005625204.localdomain podman[83895]: 2026-02-20 08:21:48.867066674 +0000 UTC m=+0.134794327 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13)
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:21:48 np0005625204.localdomain podman[83894]: 2026-02-20 08:21:48.895821808 +0000 UTC m=+0.171444093 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 20 08:21:48 np0005625204.localdomain podman[83899]: 2026-02-20 08:21:48.869817959 +0000 UTC m=+0.134404045 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vcs-type=git)
Feb 20 08:21:48 np0005625204.localdomain podman[83902]: 2026-02-20 08:21:48.928585156 +0000 UTC m=+0.187155717 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:21:48 np0005625204.localdomain podman[83899]: 2026-02-20 08:21:48.9550497 +0000 UTC m=+0.219635776 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, 
description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true)
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:21:48 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:21:48 np0005625204.localdomain podman[83902]: 2026-02-20 08:21:48.989157519 +0000 UTC m=+0.247728100 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 20 08:21:49 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:21:49 np0005625204.localdomain sshd[83988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:50 np0005625204.localdomain sshd[83988]: Invalid user admin from 185.246.128.171 port 37174
Feb 20 08:21:51 np0005625204.localdomain sshd[83988]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 37174 ssh2 [preauth]
Feb 20 08:21:51 np0005625204.localdomain sshd[83988]: Disconnecting invalid user admin 185.246.128.171 port 37174: Too many authentication failures [preauth]
Feb 20 08:21:52 np0005625204.localdomain sshd[83991]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:53 np0005625204.localdomain sshd[83991]: Invalid user admin from 185.246.128.171 port 10465
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:21:54 np0005625204.localdomain podman[84037]: 2026-02-20 08:21:54.159310933 +0000 UTC m=+0.094840138 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=)
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: tmp-crun.oYl5d4.mount: Deactivated successfully.
Feb 20 08:21:54 np0005625204.localdomain podman[84038]: 2026-02-20 08:21:54.205832163 +0000 UTC m=+0.140807081 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:21:54 np0005625204.localdomain podman[84038]: 2026-02-20 08:21:54.239303532 +0000 UTC m=+0.174278420 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, 
url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z)
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:21:54 np0005625204.localdomain podman[84037]: 2026-02-20 08:21:54.259555045 +0000 UTC m=+0.195084190 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:21:54 np0005625204.localdomain podman[84039]: 2026-02-20 08:21:54.259162353 +0000 UTC m=+0.190252142 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:21:54 np0005625204.localdomain podman[84040]: 2026-02-20 08:21:54.315028662 +0000 UTC m=+0.242195730 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Feb 20 08:21:54 np0005625204.localdomain podman[84039]: 2026-02-20 08:21:54.337857314 +0000 UTC m=+0.268947043 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 20 08:21:54 np0005625204.localdomain podman[84040]: 2026-02-20 08:21:54.349813202 +0000 UTC m=+0.276980300 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=)
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:21:54 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:21:55 np0005625204.localdomain sshd[83991]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 10465 ssh2 [preauth]
Feb 20 08:21:55 np0005625204.localdomain sshd[83991]: Disconnecting invalid user admin 185.246.128.171 port 10465: Too many authentication failures [preauth]
Feb 20 08:21:55 np0005625204.localdomain sshd[84131]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:21:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:21:57 np0005625204.localdomain sshd[84131]: Invalid user admin from 185.246.128.171 port 20955
Feb 20 08:21:57 np0005625204.localdomain systemd[1]: tmp-crun.UQfk1g.mount: Deactivated successfully.
Feb 20 08:21:57 np0005625204.localdomain podman[84133]: 2026-02-20 08:21:57.1477556 +0000 UTC m=+0.086526441 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, 
container_name=nova_migration_target, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:21:57 np0005625204.localdomain podman[84133]: 2026-02-20 08:21:57.498060636 +0000 UTC m=+0.436831537 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z)
Feb 20 08:21:57 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:21:59 np0005625204.localdomain sshd[84131]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 20955 ssh2 [preauth]
Feb 20 08:21:59 np0005625204.localdomain sshd[84131]: Disconnecting invalid user admin 185.246.128.171 port 20955: Too many authentication failures [preauth]
Feb 20 08:22:00 np0005625204.localdomain sshd[84155]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:02 np0005625204.localdomain sshd[84155]: Invalid user admin from 185.246.128.171 port 62920
Feb 20 08:22:08 np0005625204.localdomain sshd[84155]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 62920 ssh2 [preauth]
Feb 20 08:22:08 np0005625204.localdomain sshd[84155]: Disconnecting invalid user admin 185.246.128.171 port 62920: Too many authentication failures [preauth]
Feb 20 08:22:09 np0005625204.localdomain sshd[84157]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:11 np0005625204.localdomain sshd[84157]: Invalid user admin from 185.246.128.171 port 19206
Feb 20 08:22:14 np0005625204.localdomain sshd[84157]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 19206 ssh2 [preauth]
Feb 20 08:22:14 np0005625204.localdomain sshd[84157]: Disconnecting invalid user admin 185.246.128.171 port 19206: Too many authentication failures [preauth]
Feb 20 08:22:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:22:16 np0005625204.localdomain podman[84159]: 2026-02-20 08:22:16.13433978 +0000 UTC m=+0.074807342 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, 
distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:22:16 np0005625204.localdomain sshd[84177]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:16 np0005625204.localdomain podman[84159]: 2026-02-20 08:22:16.337014543 +0000 UTC m=+0.277482095 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
managed_by=tripleo_ansible, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1)
Feb 20 08:22:16 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:22:18 np0005625204.localdomain sshd[84177]: Invalid user admin from 185.246.128.171 port 63878
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:22:19 np0005625204.localdomain sshd[84177]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 63878 ssh2 [preauth]
Feb 20 08:22:19 np0005625204.localdomain sshd[84177]: Disconnecting invalid user admin 185.246.128.171 port 63878: Too many authentication failures [preauth]
Feb 20 08:22:19 np0005625204.localdomain podman[84192]: 2026-02-20 08:22:19.160956932 +0000 UTC m=+0.091664649 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:22:19 np0005625204.localdomain podman[84192]: 2026-02-20 08:22:19.20216717 +0000 UTC m=+0.132874917 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:22:19 np0005625204.localdomain podman[84191]: 2026-02-20 08:22:19.212093116 +0000 UTC m=+0.145257039 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, release=1766032510, tcib_managed=true, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1)
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:22:19 np0005625204.localdomain podman[84191]: 2026-02-20 08:22:19.270163161 +0000 UTC m=+0.203327084 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:22:19 np0005625204.localdomain podman[84193]: 2026-02-20 08:22:19.275551867 +0000 UTC m=+0.201219050 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com)
Feb 20 08:22:19 np0005625204.localdomain podman[84193]: 2026-02-20 08:22:19.360988955 +0000 UTC m=+0.286656078 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_id=tripleo_step3, tcib_managed=true)
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:22:19 np0005625204.localdomain podman[84197]: 2026-02-20 08:22:19.327052101 +0000 UTC m=+0.249678590 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 08:22:19 np0005625204.localdomain podman[84197]: 2026-02-20 08:22:19.411142798 +0000 UTC m=+0.333769216 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z)
Feb 20 08:22:19 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:22:19 np0005625204.localdomain sshd[84284]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:20 np0005625204.localdomain sshd[84286]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:21 np0005625204.localdomain sshd[84286]: Invalid user pasi from 83.235.16.111 port 36494
Feb 20 08:22:21 np0005625204.localdomain sshd[84286]: Received disconnect from 83.235.16.111 port 36494:11: Bye Bye [preauth]
Feb 20 08:22:21 np0005625204.localdomain sshd[84286]: Disconnected from invalid user pasi 83.235.16.111 port 36494 [preauth]
Feb 20 08:22:22 np0005625204.localdomain sshd[81624]: Received disconnect from 38.102.83.114 port 37642:11: disconnected by user
Feb 20 08:22:22 np0005625204.localdomain sshd[81624]: Disconnected from user zuul 38.102.83.114 port 37642
Feb 20 08:22:22 np0005625204.localdomain sshd[81621]: pam_unix(sshd:session): session closed for user zuul
Feb 20 08:22:22 np0005625204.localdomain systemd[1]: session-35.scope: Deactivated successfully.
Feb 20 08:22:22 np0005625204.localdomain systemd[1]: session-35.scope: Consumed 19.197s CPU time.
Feb 20 08:22:22 np0005625204.localdomain systemd-logind[759]: Session 35 logged out. Waiting for processes to exit.
Feb 20 08:22:22 np0005625204.localdomain systemd-logind[759]: Removed session 35.
Feb 20 08:22:22 np0005625204.localdomain sshd[84284]: Invalid user admin from 185.246.128.171 port 57362
Feb 20 08:22:24 np0005625204.localdomain sshd[84288]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:24 np0005625204.localdomain sshd[84288]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 08:22:24 np0005625204.localdomain sshd[84288]: Connection closed by 37.113.10.98 port 36016
Feb 20 08:22:24 np0005625204.localdomain sshd[84289]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:24 np0005625204.localdomain sshd[84289]: Invalid user a from 37.113.10.98 port 36022
Feb 20 08:22:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:22:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:22:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:22:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:22:25 np0005625204.localdomain sshd[84293]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:25 np0005625204.localdomain podman[84292]: 2026-02-20 08:22:25.07573754 +0000 UTC m=+0.075802342 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:34:43Z, container_name=iscsid, architecture=x86_64, release=1766032510, batch=17.1_20260112.1)
Feb 20 08:22:25 np0005625204.localdomain sshd[84289]: Connection closed by invalid user a 37.113.10.98 port 36022 [preauth]
Feb 20 08:22:25 np0005625204.localdomain podman[84294]: 2026-02-20 08:22:25.142510454 +0000 UTC m=+0.137449688 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510)
Feb 20 08:22:25 np0005625204.localdomain podman[84297]: 2026-02-20 08:22:25.121781116 +0000 UTC m=+0.118857346 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute)
Feb 20 08:22:25 np0005625204.localdomain podman[84291]: 2026-02-20 08:22:25.182683079 +0000 UTC m=+0.186412165 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 20 08:22:25 np0005625204.localdomain sshd[84293]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:22:25 np0005625204.localdomain podman[84297]: 2026-02-20 08:22:25.202020424 +0000 UTC m=+0.199096654 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:22:25 np0005625204.localdomain podman[84294]: 2026-02-20 08:22:25.210965839 +0000 UTC m=+0.205905003 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, vcs-type=git, container_name=ovn_metadata_agent)
Feb 20 08:22:25 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:22:25 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:22:25 np0005625204.localdomain podman[84291]: 2026-02-20 08:22:25.241119817 +0000 UTC m=+0.244848943 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, 
distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510)
Feb 20 08:22:25 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:22:25 np0005625204.localdomain podman[84292]: 2026-02-20 08:22:25.262327289 +0000 UTC m=+0.262392151 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 20 08:22:25 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:22:25 np0005625204.localdomain sshd[84284]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 57362 ssh2 [preauth]
Feb 20 08:22:25 np0005625204.localdomain sshd[84284]: Disconnecting invalid user admin 185.246.128.171 port 57362: Too many authentication failures [preauth]
Feb 20 08:22:26 np0005625204.localdomain sshd[84381]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:22:28 np0005625204.localdomain podman[84383]: 2026-02-20 08:22:28.140884488 +0000 UTC m=+0.080466226 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, tcib_managed=true)
Feb 20 08:22:28 np0005625204.localdomain podman[84383]: 2026-02-20 08:22:28.497180897 +0000 UTC m=+0.436762625 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:22:28 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:22:29 np0005625204.localdomain sudo[84406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:22:29 np0005625204.localdomain sudo[84406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:22:29 np0005625204.localdomain sudo[84406]: pam_unix(sudo:session): session closed for user root
Feb 20 08:22:29 np0005625204.localdomain sudo[84421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 08:22:29 np0005625204.localdomain sudo[84421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:22:29 np0005625204.localdomain systemd[1]: tmp-crun.td5LNP.mount: Deactivated successfully.
Feb 20 08:22:29 np0005625204.localdomain podman[84508]: 2026-02-20 08:22:29.962030893 +0000 UTC m=+0.093282290 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 08:22:30 np0005625204.localdomain podman[84508]: 2026-02-20 08:22:30.069015313 +0000 UTC m=+0.200266670 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.buildah.version=1.42.2, release=1770267347, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 08:22:30 np0005625204.localdomain sshd[84381]: Invalid user admin from 185.246.128.171 port 26862
Feb 20 08:22:30 np0005625204.localdomain sudo[84421]: pam_unix(sudo:session): session closed for user root
Feb 20 08:22:30 np0005625204.localdomain sudo[84572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:22:30 np0005625204.localdomain sudo[84572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:22:30 np0005625204.localdomain sudo[84572]: pam_unix(sudo:session): session closed for user root
Feb 20 08:22:30 np0005625204.localdomain sudo[84587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:22:30 np0005625204.localdomain sudo[84587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:22:31 np0005625204.localdomain sshd[84632]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:31 np0005625204.localdomain sudo[84587]: pam_unix(sudo:session): session closed for user root
Feb 20 08:22:31 np0005625204.localdomain sudo[84636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:22:31 np0005625204.localdomain sudo[84636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:22:31 np0005625204.localdomain sudo[84636]: pam_unix(sudo:session): session closed for user root
Feb 20 08:22:32 np0005625204.localdomain sshd[84632]: Received disconnect from 178.217.173.50 port 53896:11: Bye Bye [preauth]
Feb 20 08:22:32 np0005625204.localdomain sshd[84632]: Disconnected from authenticating user root 178.217.173.50 port 53896 [preauth]
Feb 20 08:22:34 np0005625204.localdomain sshd[84381]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 26862 ssh2 [preauth]
Feb 20 08:22:34 np0005625204.localdomain sshd[84381]: Disconnecting invalid user admin 185.246.128.171 port 26862: Too many authentication failures [preauth]
Feb 20 08:22:35 np0005625204.localdomain sshd[84651]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:38 np0005625204.localdomain sshd[84651]: Invalid user admin from 185.246.128.171 port 59681
Feb 20 08:22:42 np0005625204.localdomain sshd[84651]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 59681 ssh2 [preauth]
Feb 20 08:22:42 np0005625204.localdomain sshd[84651]: Disconnecting invalid user admin 185.246.128.171 port 59681: Too many authentication failures [preauth]
Feb 20 08:22:43 np0005625204.localdomain sshd[84653]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:44 np0005625204.localdomain sshd[84653]: Invalid user admin from 185.246.128.171 port 38105
Feb 20 08:22:46 np0005625204.localdomain sshd[84653]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 38105 ssh2 [preauth]
Feb 20 08:22:46 np0005625204.localdomain sshd[84653]: Disconnecting invalid user admin 185.246.128.171 port 38105: Too many authentication failures [preauth]
Feb 20 08:22:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:22:47 np0005625204.localdomain systemd[1]: tmp-crun.urLTK2.mount: Deactivated successfully.
Feb 20 08:22:47 np0005625204.localdomain podman[84655]: 2026-02-20 08:22:47.046905891 +0000 UTC m=+0.078796815 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, version=17.1.13, vendor=Red Hat, Inc., container_name=metrics_qdr, release=1766032510, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:22:47 np0005625204.localdomain podman[84655]: 2026-02-20 08:22:47.264950727 +0000 UTC m=+0.296841631 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, container_name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc.)
Feb 20 08:22:47 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:22:48 np0005625204.localdomain sshd[84685]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:49 np0005625204.localdomain sshd[84685]: Invalid user admin from 185.246.128.171 port 9171
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: tmp-crun.YfkTbb.mount: Deactivated successfully.
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: tmp-crun.oYCDCA.mount: Deactivated successfully.
Feb 20 08:22:49 np0005625204.localdomain podman[84687]: 2026-02-20 08:22:49.578325562 +0000 UTC m=+0.088720609 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 20 08:22:49 np0005625204.localdomain podman[84688]: 2026-02-20 08:22:49.627759963 +0000 UTC m=+0.132952839 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:22:49 np0005625204.localdomain podman[84695]: 2026-02-20 08:22:49.644622742 +0000 UTC m=+0.140035538 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, release=1766032510, distribution-scope=public, container_name=ceilometer_agent_compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:22:49 np0005625204.localdomain podman[84689]: 2026-02-20 08:22:49.60554637 +0000 UTC m=+0.105122704 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:22:49 np0005625204.localdomain podman[84688]: 2026-02-20 08:22:49.660798289 +0000 UTC m=+0.165991185 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=logrotate_crond, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:22:49 np0005625204.localdomain podman[84695]: 2026-02-20 08:22:49.673099737 +0000 UTC m=+0.168512523 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, 
config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:22:49 np0005625204.localdomain podman[84689]: 2026-02-20 08:22:49.69107198 +0000 UTC m=+0.190648314 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:22:49 np0005625204.localdomain podman[84687]: 2026-02-20 08:22:49.711107127 +0000 UTC m=+0.221502194 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5)
Feb 20 08:22:49 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:22:52 np0005625204.localdomain sshd[84685]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 9171 ssh2 [preauth]
Feb 20 08:22:52 np0005625204.localdomain sshd[84685]: Disconnecting invalid user admin 185.246.128.171 port 9171: Too many authentication failures [preauth]
Feb 20 08:22:52 np0005625204.localdomain sshd[84783]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:22:54 np0005625204.localdomain sshd[84783]: Invalid user admin from 185.246.128.171 port 16807
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:22:56 np0005625204.localdomain podman[84830]: 2026-02-20 08:22:56.144925786 +0000 UTC m=+0.081124226 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 20 08:22:56 np0005625204.localdomain podman[84830]: 2026-02-20 08:22:56.174085933 +0000 UTC m=+0.110284363 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container)
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: tmp-crun.6qNJm7.mount: Deactivated successfully.
Feb 20 08:22:56 np0005625204.localdomain podman[84831]: 2026-02-20 08:22:56.256866969 +0000 UTC m=+0.190061426 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, release=1766032510, tcib_managed=true, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:22:56 np0005625204.localdomain podman[84831]: 2026-02-20 08:22:56.267052972 +0000 UTC m=+0.200247429 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64)
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:22:56 np0005625204.localdomain podman[84836]: 2026-02-20 08:22:56.302901705 +0000 UTC m=+0.229556521 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1)
Feb 20 08:22:56 np0005625204.localdomain podman[84836]: 2026-02-20 08:22:56.349747106 +0000 UTC m=+0.276401942 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:22:56 np0005625204.localdomain podman[84832]: 2026-02-20 08:22:56.352647895 +0000 UTC m=+0.286845983 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, release=1766032510)
Feb 20 08:22:56 np0005625204.localdomain podman[84832]: 2026-02-20 08:22:56.436202364 +0000 UTC m=+0.370400462 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:22:56 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:22:57 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:22:57 np0005625204.localdomain recover_tripleo_nova_virtqemud[84919]: 63005
Feb 20 08:22:57 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:22:57 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:22:57 np0005625204.localdomain systemd[1]: tmp-crun.6CMRZN.mount: Deactivated successfully.
Feb 20 08:22:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:22:59 np0005625204.localdomain systemd[1]: tmp-crun.S904Ct.mount: Deactivated successfully.
Feb 20 08:22:59 np0005625204.localdomain podman[84920]: 2026-02-20 08:22:59.141873116 +0000 UTC m=+0.081366723 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 20 08:22:59 np0005625204.localdomain sshd[84783]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 16807 ssh2 [preauth]
Feb 20 08:22:59 np0005625204.localdomain sshd[84783]: Disconnecting invalid user admin 185.246.128.171 port 16807: Too many authentication failures [preauth]
Feb 20 08:22:59 np0005625204.localdomain podman[84920]: 2026-02-20 08:22:59.513063653 +0000 UTC m=+0.452557280 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., 
vcs-type=git, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:32:04Z, version=17.1.13)
Feb 20 08:22:59 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:23:01 np0005625204.localdomain sshd[84943]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:03 np0005625204.localdomain sshd[84943]: Invalid user admin from 185.246.128.171 port 17999
Feb 20 08:23:05 np0005625204.localdomain sshd[84943]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 17999 ssh2 [preauth]
Feb 20 08:23:05 np0005625204.localdomain sshd[84943]: Disconnecting invalid user admin 185.246.128.171 port 17999: Too many authentication failures [preauth]
Feb 20 08:23:06 np0005625204.localdomain sshd[84945]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:07 np0005625204.localdomain sshd[84945]: Invalid user admin from 185.246.128.171 port 37939
Feb 20 08:23:09 np0005625204.localdomain sshd[84947]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:09 np0005625204.localdomain sshd[84947]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:23:09 np0005625204.localdomain sshd[84945]: error: maximum authentication attempts exceeded for invalid user admin from 185.246.128.171 port 37939 ssh2 [preauth]
Feb 20 08:23:09 np0005625204.localdomain sshd[84945]: Disconnecting invalid user admin 185.246.128.171 port 37939: Too many authentication failures [preauth]
Feb 20 08:23:09 np0005625204.localdomain sshd[84949]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:10 np0005625204.localdomain sshd[84951]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:10 np0005625204.localdomain sshd[84953]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:11 np0005625204.localdomain sshd[84951]: Invalid user oracle from 77.232.138.190 port 36332
Feb 20 08:23:11 np0005625204.localdomain sshd[84951]: Received disconnect from 77.232.138.190 port 36332:11: Bye Bye [preauth]
Feb 20 08:23:11 np0005625204.localdomain sshd[84951]: Disconnected from invalid user oracle 77.232.138.190 port 36332 [preauth]
Feb 20 08:23:11 np0005625204.localdomain sshd[84949]: Received disconnect from 152.32.189.21 port 49840:11: Bye Bye [preauth]
Feb 20 08:23:11 np0005625204.localdomain sshd[84949]: Disconnected from authenticating user root 152.32.189.21 port 49840 [preauth]
Feb 20 08:23:11 np0005625204.localdomain sshd[84953]: Invalid user admin from 185.246.128.171 port 31939
Feb 20 08:23:12 np0005625204.localdomain sshd[84953]: Disconnecting invalid user admin 185.246.128.171 port 31939: Change of username or service not allowed: (admin,ssh-connection) -> (www,ssh-connection) [preauth]
Feb 20 08:23:14 np0005625204.localdomain sshd[84955]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:16 np0005625204.localdomain sshd[84955]: Invalid user www from 185.246.128.171 port 18338
Feb 20 08:23:16 np0005625204.localdomain sshd[84955]: Disconnecting invalid user www 185.246.128.171 port 18338: Change of username or service not allowed: (www,ssh-connection) -> (sales,ssh-connection) [preauth]
Feb 20 08:23:17 np0005625204.localdomain sshd[84957]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:23:18 np0005625204.localdomain podman[84959]: 2026-02-20 08:23:18.19687902 +0000 UTC m=+0.138761119 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, distribution-scope=public, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:23:18 np0005625204.localdomain podman[84959]: 2026-02-20 08:23:18.419994563 +0000 UTC m=+0.361876602 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:23:18 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:23:18 np0005625204.localdomain sshd[84957]: Invalid user sales from 185.246.128.171 port 8484
Feb 20 08:23:18 np0005625204.localdomain sshd[84957]: Disconnecting invalid user sales 185.246.128.171 port 8484: Change of username or service not allowed: (sales,ssh-connection) -> (volumio,ssh-connection) [preauth]
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:23:20 np0005625204.localdomain sshd[84999]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:20 np0005625204.localdomain podman[84988]: 2026-02-20 08:23:20.142157663 +0000 UTC m=+0.081311681 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true)
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: tmp-crun.xewIWF.mount: Deactivated successfully.
Feb 20 08:23:20 np0005625204.localdomain podman[84989]: 2026-02-20 08:23:20.165572973 +0000 UTC m=+0.095036814 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond)
Feb 20 08:23:20 np0005625204.localdomain podman[84989]: 2026-02-20 08:23:20.173570799 +0000 UTC m=+0.103034570 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4)
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:23:20 np0005625204.localdomain podman[84996]: 2026-02-20 08:23:20.222747341 +0000 UTC m=+0.141454471 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z)
Feb 20 08:23:20 np0005625204.localdomain podman[84996]: 2026-02-20 08:23:20.252075644 +0000 UTC m=+0.170782724 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, version=17.1.13, container_name=ceilometer_agent_compute, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:23:20 np0005625204.localdomain podman[84995]: 2026-02-20 08:23:20.275108532 +0000 UTC m=+0.201582031 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, tcib_managed=true, config_id=tripleo_step3)
Feb 20 08:23:20 np0005625204.localdomain podman[84995]: 2026-02-20 08:23:20.2893521 +0000 UTC m=+0.215825529 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z)
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:23:20 np0005625204.localdomain podman[84988]: 2026-02-20 08:23:20.326685638 +0000 UTC m=+0.265839636 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:23:20 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:23:21 np0005625204.localdomain sshd[84999]: Invalid user volumio from 185.246.128.171 port 63339
Feb 20 08:23:22 np0005625204.localdomain sshd[84999]: Disconnecting invalid user volumio 185.246.128.171 port 63339: Change of username or service not allowed: (volumio,ssh-connection) -> (newuser,ssh-connection) [preauth]
Feb 20 08:23:22 np0005625204.localdomain sshd[85080]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:24 np0005625204.localdomain sshd[85080]: Invalid user newuser from 185.246.128.171 port 17270
Feb 20 08:23:25 np0005625204.localdomain sshd[85080]: Disconnecting invalid user newuser 185.246.128.171 port 17270: Change of username or service not allowed: (newuser,ssh-connection) -> (demo,ssh-connection) [preauth]
Feb 20 08:23:25 np0005625204.localdomain sshd[85082]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:26 np0005625204.localdomain sshd[85082]: Invalid user demo from 185.246.128.171 port 62074
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:23:26 np0005625204.localdomain podman[85085]: 2026-02-20 08:23:26.452333361 +0000 UTC m=+0.068544779 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 
iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:23:26 np0005625204.localdomain podman[85084]: 2026-02-20 08:23:26.501268827 +0000 UTC m=+0.120240930 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red 
Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510)
Feb 20 08:23:26 np0005625204.localdomain podman[85084]: 2026-02-20 08:23:26.549144049 +0000 UTC m=+0.168116152 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ovn_controller, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:23:26 np0005625204.localdomain podman[85091]: 2026-02-20 08:23:26.481287041 +0000 UTC m=+0.089178623 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:23:26 np0005625204.localdomain podman[85128]: 2026-02-20 08:23:26.550357806 +0000 UTC m=+0.064004010 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Feb 20 08:23:26 np0005625204.localdomain podman[85091]: 2026-02-20 08:23:26.615051766 +0000 UTC m=+0.222943378 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:23:26 np0005625204.localdomain podman[85128]: 2026-02-20 08:23:26.631254134 +0000 UTC m=+0.144900308 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, release=1766032510, container_name=ovn_metadata_agent, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public)
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:23:26 np0005625204.localdomain podman[85085]: 2026-02-20 08:23:26.684494832 +0000 UTC m=+0.300706210 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5)
Feb 20 08:23:26 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:23:26 np0005625204.localdomain sshd[85082]: Disconnecting invalid user demo 185.246.128.171 port 62074: Change of username or service not allowed: (demo,ssh-connection) -> (zabbix,ssh-connection) [preauth]
Feb 20 08:23:27 np0005625204.localdomain sshd[85195]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:29 np0005625204.localdomain sshd[85195]: Invalid user zabbix from 185.246.128.171 port 18272
Feb 20 08:23:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:23:29 np0005625204.localdomain podman[85300]: 2026-02-20 08:23:29.783150371 +0000 UTC m=+0.090131213 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_migration_target, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Feb 20 08:23:30 np0005625204.localdomain podman[85300]: 2026-02-20 08:23:30.18627346 +0000 UTC m=+0.493254292 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13)
Feb 20 08:23:30 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:23:31 np0005625204.localdomain sudo[85378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:23:32 np0005625204.localdomain sudo[85378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:23:32 np0005625204.localdomain sudo[85378]: pam_unix(sudo:session): session closed for user root
Feb 20 08:23:32 np0005625204.localdomain sudo[85409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:23:32 np0005625204.localdomain sudo[85409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:23:32 np0005625204.localdomain sshd[85195]: Disconnecting invalid user zabbix 185.246.128.171 port 18272: Change of username or service not allowed: (zabbix,ssh-connection) -> (vr,ssh-connection) [preauth]
Feb 20 08:23:32 np0005625204.localdomain sudo[85409]: pam_unix(sudo:session): session closed for user root
Feb 20 08:23:33 np0005625204.localdomain sudo[85612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:23:33 np0005625204.localdomain sudo[85612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:23:33 np0005625204.localdomain sudo[85612]: pam_unix(sudo:session): session closed for user root
Feb 20 08:23:34 np0005625204.localdomain sudo[85651]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf privsep-helper --config-file /etc/nova/nova.conf --privsep_context vif_plug_ovs.privsep.vif_plug --privsep_sock_path /tmp/tmpsj6rp49m/privsep.sock
Feb 20 08:23:34 np0005625204.localdomain systemd-logind[759]: Existing logind session ID 29 used by new audit session, ignoring.
Feb 20 08:23:34 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 20 08:23:34 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 20 08:23:34 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 20 08:23:34 np0005625204.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 20 08:23:34 np0005625204.localdomain sshd[85666]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Queued start job for default target Main User Target.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Created slice User Application Slice.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Reached target Paths.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Reached target Timers.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Starting D-Bus User Message Bus Socket...
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Starting Create User's Volatile Files and Directories...
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Finished Create User's Volatile Files and Directories.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Listening on D-Bus User Message Bus Socket.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Reached target Sockets.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Reached target Basic System.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Reached target Main User Target.
Feb 20 08:23:34 np0005625204.localdomain systemd[85653]: Startup finished in 150ms.
Feb 20 08:23:34 np0005625204.localdomain systemd[1]: Started User Manager for UID 0.
Feb 20 08:23:34 np0005625204.localdomain systemd[1]: Started Session c11 of User root.
Feb 20 08:23:34 np0005625204.localdomain sudo[85651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 20 08:23:35 np0005625204.localdomain sudo[85651]: pam_unix(sudo:session): session closed for user root
Feb 20 08:23:35 np0005625204.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 20 08:23:35 np0005625204.localdomain kernel: device tape7aa8e2a-27 entered promiscuous mode
Feb 20 08:23:35 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575815.5975] manager: (tape7aa8e2a-27): new Tun device (/org/freedesktop/NetworkManager/Devices/13)
Feb 20 08:23:35 np0005625204.localdomain systemd-udevd[85690]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 08:23:35 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575815.6116] device (tape7aa8e2a-27): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 08:23:35 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575815.6129] device (tape7aa8e2a-27): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 20 08:23:35 np0005625204.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 20 08:23:35 np0005625204.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 20 08:23:35 np0005625204.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 20 08:23:35 np0005625204.localdomain systemd-machined[85698]: New machine qemu-1-instance-00000002.
Feb 20 08:23:35 np0005625204.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Feb 20 08:23:35 np0005625204.localdomain systemd-udevd[85689]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 08:23:35 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575815.8353] manager: (tapde929a91-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/14)
Feb 20 08:23:35 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c1: link becomes ready
Feb 20 08:23:35 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c0: link becomes ready
Feb 20 08:23:35 np0005625204.localdomain NetworkManager[5988]: <info>  [1771575815.8799] device (tapde929a91-c0): carrier: link connected
Feb 20 08:23:36 np0005625204.localdomain kernel: device tapde929a91-c0 entered promiscuous mode
Feb 20 08:23:36 np0005625204.localdomain sshd[85666]: Invalid user vr from 185.246.128.171 port 62529
Feb 20 08:23:36 np0005625204.localdomain sshd[85666]: Disconnecting invalid user vr 185.246.128.171 port 62529: Change of username or service not allowed: (vr,ssh-connection) -> (kali,ssh-connection) [preauth]
Feb 20 08:23:37 np0005625204.localdomain sshd[85790]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:37 np0005625204.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 20 08:23:37 np0005625204.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 20 08:23:37 np0005625204.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 20 08:23:37 np0005625204.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 20 08:23:37 np0005625204.localdomain sudo[85809]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf ip netns exec ovnmeta-de929a91-c460-4398-96e0-15a80685a485 haproxy -f /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 08:23:37 np0005625204.localdomain sudo[85809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 20 08:23:38 np0005625204.localdomain podman[85835]: 2026-02-20 08:23:38.231703121 +0000 UTC m=+0.094232579 container create 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 20 08:23:38 np0005625204.localdomain podman[85835]: 2026-02-20 08:23:38.185350595 +0000 UTC m=+0.047880073 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 20 08:23:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba.scope.
Feb 20 08:23:38 np0005625204.localdomain systemd[1]: tmp-crun.XD2Jlr.mount: Deactivated successfully.
Feb 20 08:23:38 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:23:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ea5b54d1da71d972d7e8dd243987640d185da35de896817d599cfae85808380/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 08:23:38 np0005625204.localdomain podman[85835]: 2026-02-20 08:23:38.372472681 +0000 UTC m=+0.235002109 container init 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:23:38 np0005625204.localdomain podman[85835]: 2026-02-20 08:23:38.378238998 +0000 UTC m=+0.240768426 container start 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:23:38 np0005625204.localdomain sudo[85809]: pam_unix(sudo:session): session closed for user root
Feb 20 08:23:38 np0005625204.localdomain sshd[85790]: Invalid user kali from 185.246.128.171 port 41389
Feb 20 08:23:38 np0005625204.localdomain setroubleshoot[85792]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count. For complete SELinux messages run: sealert -l 4c153363-0b75-4da9-9673-ecc521f0261c
Feb 20 08:23:38 np0005625204.localdomain setroubleshoot[85792]: SELinux is preventing /usr/libexec/qemu-kvm from read access on the file max_map_count.
                                                                
                                                                *****  Plugin qemu_file_image (98.8 confidence) suggests   *******************
                                                                
                                                                If max_map_count is a virtualization target
                                                                Then you need to change the label on max_map_count'
                                                                Do
                                                                # semanage fcontext -a -t virt_image_t 'max_map_count'
                                                                # restorecon -v 'max_map_count'
                                                                
                                                                *****  Plugin catchall (2.13 confidence) suggests   **************************
                                                                
                                                                If you believe that qemu-kvm should be allowed read access on the max_map_count file by default.
                                                                Then you should report this as a bug.
                                                                You can generate a local policy module to allow this access.
                                                                Do
                                                                allow this access for now by executing:
                                                                # ausearch -c 'qemu-kvm' --raw | audit2allow -M my-qemukvm
                                                                # semodule -X 300 -i my-qemukvm.pp
                                                                
Feb 20 08:23:38 np0005625204.localdomain sshd[85790]: Disconnecting invalid user kali 185.246.128.171 port 41389: Change of username or service not allowed: (kali,ssh-connection) -> (saeed,ssh-connection) [preauth]
Feb 20 08:23:41 np0005625204.localdomain sshd[85859]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:43 np0005625204.localdomain sshd[85859]: Invalid user saeed from 185.246.128.171 port 50441
Feb 20 08:23:43 np0005625204.localdomain sshd[85859]: Disconnecting invalid user saeed 185.246.128.171 port 50441: Change of username or service not allowed: (saeed,ssh-connection) -> (localadmin,ssh-connection) [preauth]
Feb 20 08:23:45 np0005625204.localdomain sshd[85862]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:47 np0005625204.localdomain sshd[85862]: Invalid user localadmin from 185.246.128.171 port 13992
Feb 20 08:23:47 np0005625204.localdomain sshd[85862]: Disconnecting invalid user localadmin 185.246.128.171 port 13992: Change of username or service not allowed: (localadmin,ssh-connection) -> (master,ssh-connection) [preauth]
Feb 20 08:23:47 np0005625204.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 20 08:23:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 08:23:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 08:23:48 np0005625204.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 20 08:23:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:23:48 np0005625204.localdomain podman[85864]: 2026-02-20 08:23:48.736396575 +0000 UTC m=+0.070241872 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 20 08:23:48 np0005625204.localdomain sshd[85877]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:48 np0005625204.localdomain podman[85864]: 2026-02-20 08:23:48.922507909 +0000 UTC m=+0.256353206 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:23:48 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:23:50 np0005625204.localdomain sshd[85877]: Invalid user master from 185.246.128.171 port 63377
Feb 20 08:23:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:23:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:23:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:23:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:23:50 np0005625204.localdomain podman[85895]: 2026-02-20 08:23:50.933535124 +0000 UTC m=+0.090393581 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:23:50 np0005625204.localdomain podman[85895]: 2026-02-20 08:23:50.969991146 +0000 UTC m=+0.126849573 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:23:50 np0005625204.localdomain systemd[1]: tmp-crun.t1t0GO.mount: Deactivated successfully.
Feb 20 08:23:50 np0005625204.localdomain podman[85897]: 2026-02-20 08:23:50.983798 +0000 UTC m=+0.134807797 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git)
Feb 20 08:23:50 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:23:51 np0005625204.localdomain sshd[85877]: Disconnecting invalid user master 185.246.128.171 port 63377: Change of username or service not allowed: (master,ssh-connection) -> (wikijs,ssh-connection) [preauth]
Feb 20 08:23:51 np0005625204.localdomain podman[85897]: 2026-02-20 08:23:51.023026917 +0000 UTC m=+0.174036694 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, 
managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:23:51 np0005625204.localdomain podman[85896]: 2026-02-20 08:23:51.034714276 +0000 UTC m=+0.186441405 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=logrotate_crond, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, url=https://www.redhat.com)
Feb 20 08:23:51 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:23:51 np0005625204.localdomain podman[85896]: 2026-02-20 08:23:51.04590734 +0000 UTC m=+0.197634489 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:23:51 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:23:51 np0005625204.localdomain podman[85898]: 2026-02-20 08:23:51.136037423 +0000 UTC m=+0.283714498 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510)
Feb 20 08:23:51 np0005625204.localdomain podman[85898]: 2026-02-20 08:23:51.190134007 +0000 UTC m=+0.337811082 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 20 08:23:51 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:23:51 np0005625204.localdomain sshd[85988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:52 np0005625204.localdomain sshd[85988]: Invalid user wikijs from 185.246.128.171 port 22529
Feb 20 08:23:52 np0005625204.localdomain sshd[85988]: Disconnecting invalid user wikijs 185.246.128.171 port 22529: Change of username or service not allowed: (wikijs,ssh-connection) -> (ADMIN,ssh-connection) [preauth]
Feb 20 08:23:53 np0005625204.localdomain sshd[85990]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:53 np0005625204.localdomain sshd[85991]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:53 np0005625204.localdomain sshd[85991]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:23:55 np0005625204.localdomain sshd[85990]: Invalid user ADMIN from 185.246.128.171 port 57760
Feb 20 08:23:56 np0005625204.localdomain sshd[85990]: Disconnecting invalid user ADMIN 185.246.128.171 port 57760: Change of username or service not allowed: (ADMIN,ssh-connection) -> (suraj,ssh-connection) [preauth]
Feb 20 08:23:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:23:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:23:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:23:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:23:57 np0005625204.localdomain podman[86039]: 2026-02-20 08:23:57.07096871 +0000 UTC m=+0.083422267 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, container_name=ovn_controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:23:57 np0005625204.localdomain systemd[1]: tmp-crun.wamHle.mount: Deactivated successfully.
Feb 20 08:23:57 np0005625204.localdomain podman[86040]: 2026-02-20 08:23:57.140031975 +0000 UTC m=+0.149646484 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, release=1766032510, config_id=tripleo_step3)
Feb 20 08:23:57 np0005625204.localdomain podman[86041]: 2026-02-20 08:23:57.175304139 +0000 UTC m=+0.181965158 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:23:57 np0005625204.localdomain podman[86039]: 2026-02-20 08:23:57.194478189 +0000 UTC m=+0.206931716 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:23:57 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:23:57 np0005625204.localdomain podman[86040]: 2026-02-20 08:23:57.203093564 +0000 UTC m=+0.212708063 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5)
Feb 20 08:23:57 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:23:57 np0005625204.localdomain podman[86041]: 2026-02-20 08:23:57.221005465 +0000 UTC m=+0.227666534 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, tcib_managed=true)
Feb 20 08:23:57 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:23:57 np0005625204.localdomain podman[86042]: 2026-02-20 08:23:57.281230437 +0000 UTC m=+0.285509762 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:23:57 np0005625204.localdomain podman[86042]: 2026-02-20 08:23:57.310013903 +0000 UTC m=+0.314293288 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:23:57 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:23:57 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35900 [20/Feb/2026:08:23:56.500] listener listener/metadata 0/0/0/1164/1164 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 20 08:23:57 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35912 [20/Feb/2026:08:23:57.763] listener listener/metadata 0/0/0/15/15 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Feb 20 08:23:57 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35924 [20/Feb/2026:08:23:57.817] listener listener/metadata 0/0/0/12/12 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 20 08:23:57 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35938 [20/Feb/2026:08:23:57.867] listener listener/metadata 0/0/0/13/13 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Feb 20 08:23:57 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35950 [20/Feb/2026:08:23:57.917] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Feb 20 08:23:57 np0005625204.localdomain sshd[86129]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:23:57 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35954 [20/Feb/2026:08:23:57.967] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35958 [20/Feb/2026:08:23:58.018] listener listener/metadata 0/0/0/14/14 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35960 [20/Feb/2026:08:23:58.081] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35966 [20/Feb/2026:08:23:58.168] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35968 [20/Feb/2026:08:23:58.234] listener listener/metadata 0/0/0/12/12 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35970 [20/Feb/2026:08:23:58.292] listener listener/metadata 0/0/0/11/11 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35972 [20/Feb/2026:08:23:58.331] listener listener/metadata 0/0/0/13/13 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35978 [20/Feb/2026:08:23:58.378] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35982 [20/Feb/2026:08:23:58.419] listener listener/metadata 0/0/0/11/11 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:35998 [20/Feb/2026:08:23:58.470] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Feb 20 08:23:58 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[85856]: 192.168.0.140:36006 [20/Feb/2026:08:23:58.521] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Feb 20 08:24:00 np0005625204.localdomain sshd[86129]: Invalid user suraj from 185.246.128.171 port 32664
Feb 20 08:24:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:24:00 np0005625204.localdomain systemd[1]: tmp-crun.2p1RDW.mount: Deactivated successfully.
Feb 20 08:24:00 np0005625204.localdomain podman[86131]: 2026-02-20 08:24:00.619902209 +0000 UTC m=+0.092499106 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:24:00 np0005625204.localdomain sshd[86129]: Disconnecting invalid user suraj 185.246.128.171 port 32664: Change of username or service not allowed: (suraj,ssh-connection) -> (avalanche,ssh-connection) [preauth]
Feb 20 08:24:01 np0005625204.localdomain podman[86131]: 2026-02-20 08:24:01.03578668 +0000 UTC m=+0.508383607 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:24:01 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:24:03 np0005625204.localdomain sshd[86154]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:24:09 np0005625204.localdomain sshd[86154]: Invalid user avalanche from 185.246.128.171 port 52859
Feb 20 08:24:10 np0005625204.localdomain sshd[86154]: Disconnecting invalid user avalanche 185.246.128.171 port 52859: Change of username or service not allowed: (avalanche,ssh-connection) -> (monitoring,ssh-connection) [preauth]
Feb 20 08:24:10 np0005625204.localdomain sshd[86156]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:24:12 np0005625204.localdomain sshd[86156]: Invalid user monitoring from 185.246.128.171 port 29481
Feb 20 08:24:14 np0005625204.localdomain sshd[86156]: Connection closed by invalid user monitoring 185.246.128.171 port 29481 [preauth]
Feb 20 08:24:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:24:19 np0005625204.localdomain podman[86158]: 2026-02-20 08:24:19.115779004 +0000 UTC m=+0.059825260 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, architecture=x86_64, release=1766032510, tcib_managed=true, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 20 08:24:19 np0005625204.localdomain podman[86158]: 2026-02-20 08:24:19.305846261 +0000 UTC m=+0.249892447 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:24:19 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:24:21 np0005625204.localdomain podman[86189]: 2026-02-20 08:24:21.145088801 +0000 UTC m=+0.081220309 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:24:21 np0005625204.localdomain podman[86189]: 2026-02-20 08:24:21.156662358 +0000 UTC m=+0.092793826 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, container_name=collectd, batch=17.1_20260112.1, config_id=tripleo_step3, 
description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5)
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: tmp-crun.zkodIA.mount: Deactivated successfully.
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:24:21 np0005625204.localdomain podman[86188]: 2026-02-20 08:24:21.236902626 +0000 UTC m=+0.174502508 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, version=17.1.13, vcs-type=git, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:24:21 np0005625204.localdomain podman[86188]: 2026-02-20 08:24:21.263244736 +0000 UTC m=+0.200844618 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 20 08:24:21 np0005625204.localdomain podman[86217]: 2026-02-20 08:24:21.274681898 +0000 UTC m=+0.107353624 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public)
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:24:21 np0005625204.localdomain podman[86217]: 2026-02-20 08:24:21.31213005 +0000 UTC m=+0.144801796 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:24:21 np0005625204.localdomain podman[86232]: 2026-02-20 08:24:21.323733296 +0000 UTC m=+0.088742320 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': 
False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true)
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:24:21 np0005625204.localdomain podman[86232]: 2026-02-20 08:24:21.378359357 +0000 UTC m=+0.143368401 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1)
Feb 20 08:24:21 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: tmp-crun.CkalX4.mount: Deactivated successfully.
Feb 20 08:24:28 np0005625204.localdomain podman[86284]: 2026-02-20 08:24:28.16499866 +0000 UTC m=+0.098633364 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:24:28 np0005625204.localdomain podman[86283]: 2026-02-20 08:24:28.135044289 +0000 UTC m=+0.076203995 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:24:28 np0005625204.localdomain podman[86282]: 2026-02-20 08:24:28.223979184 +0000 UTC m=+0.162504059 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, release=1766032510, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:24:28 np0005625204.localdomain podman[86284]: 2026-02-20 08:24:28.250026486 +0000 UTC m=+0.183661210 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 20 08:24:28 np0005625204.localdomain podman[86281]: 2026-02-20 08:24:28.19910819 +0000 UTC m=+0.141346619 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1)
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:24:28 np0005625204.localdomain podman[86283]: 2026-02-20 08:24:28.272198418 +0000 UTC m=+0.213358124 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z)
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:24:28 np0005625204.localdomain podman[86282]: 2026-02-20 08:24:28.305556084 +0000 UTC m=+0.244080939 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:24:28 np0005625204.localdomain podman[86281]: 2026-02-20 08:24:28.330205652 +0000 UTC m=+0.272444091 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:24:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:24:29 np0005625204.localdomain systemd[1]: tmp-crun.XASm87.mount: Deactivated successfully.
Feb 20 08:24:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:24:32 np0005625204.localdomain systemd[1]: tmp-crun.zuzjNJ.mount: Deactivated successfully.
Feb 20 08:24:32 np0005625204.localdomain podman[86372]: 2026-02-20 08:24:32.154672645 +0000 UTC m=+0.092154215 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, 
name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4)
Feb 20 08:24:32 np0005625204.localdomain podman[86372]: 2026-02-20 08:24:32.563238941 +0000 UTC m=+0.500720511 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:24:32 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:24:33 np0005625204.localdomain sudo[86397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:24:33 np0005625204.localdomain sudo[86397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:24:33 np0005625204.localdomain sudo[86397]: pam_unix(sudo:session): session closed for user root
Feb 20 08:24:33 np0005625204.localdomain sudo[86412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:24:33 np0005625204.localdomain sudo[86412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:24:34 np0005625204.localdomain sudo[86412]: pam_unix(sudo:session): session closed for user root
Feb 20 08:24:35 np0005625204.localdomain sudo[86459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:24:35 np0005625204.localdomain sudo[86459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:24:35 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:24:35 np0005625204.localdomain sudo[86459]: pam_unix(sudo:session): session closed for user root
Feb 20 08:24:35 np0005625204.localdomain recover_tripleo_nova_virtqemud[86475]: 63005
Feb 20 08:24:35 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:24:35 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:24:39 np0005625204.localdomain sshd[86476]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:24:39 np0005625204.localdomain sshd[86476]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:24:48 np0005625204.localdomain sshd[86479]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:24:49 np0005625204.localdomain sshd[86479]: Received disconnect from 101.36.109.176 port 36622:11: Bye Bye [preauth]
Feb 20 08:24:49 np0005625204.localdomain sshd[86479]: Disconnected from authenticating user root 101.36.109.176 port 36622 [preauth]
Feb 20 08:24:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:24:49 np0005625204.localdomain podman[86481]: 2026-02-20 08:24:49.909779617 +0000 UTC m=+0.067990652 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:24:50 np0005625204.localdomain podman[86481]: 2026-02-20 08:24:50.085732809 +0000 UTC m=+0.243943834 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:24:50 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: tmp-crun.VNwrCD.mount: Deactivated successfully.
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: tmp-crun.WLQZrS.mount: Deactivated successfully.
Feb 20 08:24:52 np0005625204.localdomain podman[86511]: 2026-02-20 08:24:52.139582351 +0000 UTC m=+0.082424576 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 20 08:24:52 np0005625204.localdomain podman[86520]: 2026-02-20 08:24:52.199704391 +0000 UTC m=+0.129338189 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 20 08:24:52 np0005625204.localdomain podman[86511]: 2026-02-20 08:24:52.226089462 +0000 UTC m=+0.168931707 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:24:52 np0005625204.localdomain podman[86513]: 2026-02-20 08:24:52.175132435 +0000 UTC m=+0.106672512 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:24:52 np0005625204.localdomain podman[86520]: 2026-02-20 08:24:52.245878191 +0000 UTC m=+0.175512009 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible)
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:24:52 np0005625204.localdomain podman[86512]: 2026-02-20 08:24:52.302278966 +0000 UTC m=+0.238191508 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510)
Feb 20 08:24:52 np0005625204.localdomain podman[86512]: 2026-02-20 08:24:52.313997937 +0000 UTC m=+0.249910499 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, 
version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-cron-container)
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:24:52 np0005625204.localdomain podman[86513]: 2026-02-20 08:24:52.362128897 +0000 UTC m=+0.293668964 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat 
OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, container_name=collectd, vcs-type=git)
Feb 20 08:24:52 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: tmp-crun.iZyW3n.mount: Deactivated successfully.
Feb 20 08:24:59 np0005625204.localdomain podman[86648]: 2026-02-20 08:24:59.165145404 +0000 UTC m=+0.098662836 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:24:59 np0005625204.localdomain podman[86649]: 2026-02-20 08:24:59.231390972 +0000 UTC m=+0.162357795 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:24:59 np0005625204.localdomain podman[86647]: 2026-02-20 08:24:59.140780835 +0000 UTC m=+0.080042273 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:24:59 np0005625204.localdomain podman[86648]: 2026-02-20 08:24:59.244461574 +0000 UTC m=+0.177979056 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public)
Feb 20 08:24:59 np0005625204.localdomain podman[86655]: 2026-02-20 08:24:59.193282059 +0000 UTC m=+0.123795798 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5)
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:24:59 np0005625204.localdomain podman[86649]: 2026-02-20 08:24:59.274045303 +0000 UTC m=+0.205012086 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible)
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:24:59 np0005625204.localdomain podman[86647]: 2026-02-20 08:24:59.321876794 +0000 UTC m=+0.261138262 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, 
name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z)
Feb 20 08:24:59 np0005625204.localdomain podman[86655]: 2026-02-20 08:24:59.329149139 +0000 UTC m=+0.259662888 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:24:59 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:25:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:25:03 np0005625204.localdomain podman[86736]: 2026-02-20 08:25:03.145351957 +0000 UTC m=+0.083750817 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public)
Feb 20 08:25:03 np0005625204.localdomain podman[86736]: 2026-02-20 08:25:03.526102369 +0000 UTC m=+0.464501229 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:25:03 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:25:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:25:21 np0005625204.localdomain podman[86760]: 2026-02-20 08:25:21.148212389 +0000 UTC m=+0.087755680 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64)
Feb 20 08:25:21 np0005625204.localdomain podman[86760]: 2026-02-20 08:25:21.31986725 +0000 UTC m=+0.259410551 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, 
container_name=metrics_qdr, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 20 08:25:21 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:25:22 np0005625204.localdomain sshd[86789]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:25:23 np0005625204.localdomain sshd[86789]: Received disconnect from 83.235.16.111 port 42474:11: Bye Bye [preauth]
Feb 20 08:25:23 np0005625204.localdomain sshd[86789]: Disconnected from authenticating user root 83.235.16.111 port 42474 [preauth]
Feb 20 08:25:23 np0005625204.localdomain podman[86791]: 2026-02-20 08:25:23.166068755 +0000 UTC m=+0.095076485 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:25:23 np0005625204.localdomain podman[86794]: 2026-02-20 08:25:23.220017885 +0000 UTC m=+0.140326448 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_id=tripleo_step4, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:25:23 np0005625204.localdomain podman[86791]: 2026-02-20 08:25:23.246975634 +0000 UTC m=+0.175983324 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T23:07:30Z, architecture=x86_64, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:25:23 np0005625204.localdomain podman[86794]: 2026-02-20 08:25:23.275275674 +0000 UTC m=+0.195584257 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public)
Feb 20 08:25:23 np0005625204.localdomain podman[86793]: 2026-02-20 08:25:23.185970608 +0000 UTC m=+0.108052765 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd)
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:25:23 np0005625204.localdomain podman[86792]: 2026-02-20 08:25:23.319930848 +0000 UTC m=+0.245534124 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=logrotate_crond, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, release=1766032510, managed_by=tripleo_ansible)
Feb 20 08:25:23 np0005625204.localdomain podman[86792]: 2026-02-20 08:25:23.355248524 +0000 UTC m=+0.280851850 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:25:23 np0005625204.localdomain podman[86793]: 2026-02-20 08:25:23.372097542 +0000 UTC m=+0.294179669 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible)
Feb 20 08:25:23 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:25:24 np0005625204.localdomain systemd[1]: tmp-crun.4aadbn.mount: Deactivated successfully.
Feb 20 08:25:25 np0005625204.localdomain sshd[86885]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:25:25 np0005625204.localdomain sshd[86885]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:25:30 np0005625204.localdomain recover_tripleo_nova_virtqemud[86913]: 63005
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: tmp-crun.f8rzCM.mount: Deactivated successfully.
Feb 20 08:25:30 np0005625204.localdomain podman[86889]: 2026-02-20 08:25:30.134562153 +0000 UTC m=+0.072491511 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1)
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: tmp-crun.r3UlS6.mount: Deactivated successfully.
Feb 20 08:25:30 np0005625204.localdomain podman[86890]: 2026-02-20 08:25:30.156731745 +0000 UTC m=+0.087872244 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, container_name=nova_compute, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step5, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:25:30 np0005625204.localdomain podman[86889]: 2026-02-20 08:25:30.183018034 +0000 UTC m=+0.120947452 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:25:30 np0005625204.localdomain podman[86890]: 2026-02-20 08:25:30.213211532 +0000 UTC m=+0.144351981 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:25:30 np0005625204.localdomain podman[86888]: 2026-02-20 08:25:30.300374962 +0000 UTC m=+0.237202126 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, release=1766032510, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:25:30 np0005625204.localdomain podman[86888]: 2026-02-20 08:25:30.307887604 +0000 UTC m=+0.244714778 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, container_name=iscsid, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public)
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:25:30 np0005625204.localdomain podman[86887]: 2026-02-20 08:25:30.397471779 +0000 UTC m=+0.337666816 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, build-date=2026-01-12T22:36:40Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:25:30 np0005625204.localdomain podman[86887]: 2026-02-20 08:25:30.446111926 +0000 UTC m=+0.386307003 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com)
Feb 20 08:25:30 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:25:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:25:34 np0005625204.localdomain podman[86980]: 2026-02-20 08:25:34.154159118 +0000 UTC m=+0.084382717 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, release=1766032510, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git)
Feb 20 08:25:34 np0005625204.localdomain podman[86980]: 2026-02-20 08:25:34.525046806 +0000 UTC m=+0.455270385 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:25:34 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:25:35 np0005625204.localdomain sudo[87004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:25:35 np0005625204.localdomain sudo[87004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:25:35 np0005625204.localdomain sudo[87004]: pam_unix(sudo:session): session closed for user root
Feb 20 08:25:35 np0005625204.localdomain sudo[87019]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:25:35 np0005625204.localdomain sudo[87019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:25:35 np0005625204.localdomain sudo[87019]: pam_unix(sudo:session): session closed for user root
Feb 20 08:25:36 np0005625204.localdomain sudo[87067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:25:36 np0005625204.localdomain sudo[87067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:25:36 np0005625204.localdomain sudo[87067]: pam_unix(sudo:session): session closed for user root
Feb 20 08:25:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:25:52 np0005625204.localdomain podman[87083]: 2026-02-20 08:25:52.137849111 +0000 UTC m=+0.077397192 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., 
release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, distribution-scope=public, version=17.1.13)
Feb 20 08:25:52 np0005625204.localdomain podman[87083]: 2026-02-20 08:25:52.296667765 +0000 UTC m=+0.236215816 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public)
Feb 20 08:25:52 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: tmp-crun.P4AdEu.mount: Deactivated successfully.
Feb 20 08:25:54 np0005625204.localdomain podman[87113]: 2026-02-20 08:25:54.149690621 +0000 UTC m=+0.091336111 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, 
vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public)
Feb 20 08:25:54 np0005625204.localdomain podman[87113]: 2026-02-20 08:25:54.197558483 +0000 UTC m=+0.139203963 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true)
Feb 20 08:25:54 np0005625204.localdomain podman[87121]: 2026-02-20 08:25:54.196708237 +0000 UTC m=+0.129277127 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5)
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:25:54 np0005625204.localdomain podman[87114]: 2026-02-20 08:25:54.243069453 +0000 UTC m=+0.181227985 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, architecture=x86_64, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=)
Feb 20 08:25:54 np0005625204.localdomain podman[87114]: 2026-02-20 08:25:54.257932501 +0000 UTC m=+0.196091043 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git)
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:25:54 np0005625204.localdomain podman[87115]: 2026-02-20 08:25:54.3021543 +0000 UTC m=+0.236681600 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, container_name=collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 20 08:25:54 np0005625204.localdomain podman[87115]: 2026-02-20 08:25:54.309982081 +0000 UTC m=+0.244509421 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=)
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:25:54 np0005625204.localdomain podman[87121]: 2026-02-20 08:25:54.327006885 +0000 UTC m=+0.259575775 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:25:54 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:26:00 np0005625204.localdomain sshd[87252]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:26:01 np0005625204.localdomain podman[87257]: 2026-02-20 08:26:01.164666538 +0000 UTC m=+0.091743413 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: tmp-crun.gPcAlS.mount: Deactivated successfully.
Feb 20 08:26:01 np0005625204.localdomain podman[87254]: 2026-02-20 08:26:01.216237634 +0000 UTC m=+0.149894571 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:26:01 np0005625204.localdomain podman[87257]: 2026-02-20 08:26:01.222214218 +0000 UTC m=+0.149291073 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13)
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:26:01 np0005625204.localdomain podman[87256]: 2026-02-20 08:26:01.262240069 +0000 UTC m=+0.193018738 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, distribution-scope=public, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git)
Feb 20 08:26:01 np0005625204.localdomain podman[87256]: 2026-02-20 08:26:01.303757246 +0000 UTC m=+0.234535905 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:26:01 np0005625204.localdomain podman[87255]: 2026-02-20 08:26:01.319753578 +0000 UTC m=+0.248598427 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team)
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:26:01 np0005625204.localdomain podman[87255]: 2026-02-20 08:26:01.329840228 +0000 UTC m=+0.258685087 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid)
Feb 20 08:26:01 np0005625204.localdomain podman[87254]: 2026-02-20 08:26:01.344228851 +0000 UTC m=+0.277885788 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13)
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:26:01 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:26:02 np0005625204.localdomain sshd[87252]: Invalid user royal from 103.157.25.4 port 52498
Feb 20 08:26:02 np0005625204.localdomain sshd[87252]: Received disconnect from 103.157.25.4 port 52498:11: Bye Bye [preauth]
Feb 20 08:26:02 np0005625204.localdomain sshd[87252]: Disconnected from invalid user royal 103.157.25.4 port 52498 [preauth]
Feb 20 08:26:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:26:05 np0005625204.localdomain podman[87344]: 2026-02-20 08:26:05.142710264 +0000 UTC m=+0.077326509 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 20 08:26:05 np0005625204.localdomain podman[87344]: 2026-02-20 08:26:05.5190699 +0000 UTC m=+0.453686225 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, summary=Red 
Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5)
Feb 20 08:26:05 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:26:11 np0005625204.localdomain sshd[87367]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:26:11 np0005625204.localdomain sshd[87369]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:26:11 np0005625204.localdomain sshd[87369]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:26:11 np0005625204.localdomain sshd[87367]: Received disconnect from 77.232.138.190 port 36052:11: Bye Bye [preauth]
Feb 20 08:26:11 np0005625204.localdomain sshd[87367]: Disconnected from authenticating user root 77.232.138.190 port 36052 [preauth]
Feb 20 08:26:19 np0005625204.localdomain sshd[87371]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:26:20 np0005625204.localdomain sshd[87371]: Received disconnect from 152.32.189.21 port 57934:11: Bye Bye [preauth]
Feb 20 08:26:20 np0005625204.localdomain sshd[87371]: Disconnected from authenticating user root 152.32.189.21 port 57934 [preauth]
Feb 20 08:26:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:26:23 np0005625204.localdomain systemd[1]: tmp-crun.dPYgr3.mount: Deactivated successfully.
Feb 20 08:26:23 np0005625204.localdomain podman[87373]: 2026-02-20 08:26:23.142411739 +0000 UTC m=+0.087671077 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 20 08:26:23 np0005625204.localdomain podman[87373]: 2026-02-20 08:26:23.356842625 +0000 UTC m=+0.302101953 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:26:23 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: tmp-crun.TGOfTC.mount: Deactivated successfully.
Feb 20 08:26:25 np0005625204.localdomain podman[87403]: 2026-02-20 08:26:25.183120058 +0000 UTC m=+0.104574828 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-cron, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, config_id=tripleo_step4, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 20 08:26:25 np0005625204.localdomain podman[87410]: 2026-02-20 08:26:25.203558706 +0000 UTC m=+0.121410295 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, distribution-scope=public, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:26:25 np0005625204.localdomain podman[87404]: 2026-02-20 08:26:25.159352937 +0000 UTC m=+0.081597751 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git)
Feb 20 08:26:25 np0005625204.localdomain podman[87410]: 2026-02-20 08:26:25.236052786 +0000 UTC m=+0.153904445 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute)
Feb 20 08:26:25 np0005625204.localdomain podman[87404]: 2026-02-20 08:26:25.244044461 +0000 UTC m=+0.166289225 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z)
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:26:25 np0005625204.localdomain podman[87402]: 2026-02-20 08:26:25.304104619 +0000 UTC m=+0.232889544 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, version=17.1.13, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:26:25 np0005625204.localdomain podman[87403]: 2026-02-20 08:26:25.315978894 +0000 UTC m=+0.237433664 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:26:25 np0005625204.localdomain podman[87402]: 2026-02-20 08:26:25.354998004 +0000 UTC m=+0.283782929 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, 
batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:26:25 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:26:26 np0005625204.localdomain systemd[1]: tmp-crun.3amH1q.mount: Deactivated successfully.
Feb 20 08:26:28 np0005625204.localdomain sshd[87492]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:26:29 np0005625204.localdomain sshd[87492]: Received disconnect from 178.217.173.50 port 60458:11: Bye Bye [preauth]
Feb 20 08:26:29 np0005625204.localdomain sshd[87492]: Disconnected from authenticating user root 178.217.173.50 port 60458 [preauth]
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: tmp-crun.nlNCLm.mount: Deactivated successfully.
Feb 20 08:26:32 np0005625204.localdomain podman[87496]: 2026-02-20 08:26:32.160770286 +0000 UTC m=+0.087839512 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:26:32 np0005625204.localdomain podman[87499]: 2026-02-20 08:26:32.177952926 +0000 UTC m=+0.096993135 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, distribution-scope=public, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64)
Feb 20 08:26:32 np0005625204.localdomain podman[87495]: 2026-02-20 08:26:32.20570989 +0000 UTC m=+0.135197181 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, container_name=iscsid, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13)
Feb 20 08:26:32 np0005625204.localdomain podman[87495]: 2026-02-20 08:26:32.216982686 +0000 UTC m=+0.146469957 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1)
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:26:32 np0005625204.localdomain podman[87496]: 2026-02-20 08:26:32.237417395 +0000 UTC m=+0.164486631 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64)
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:26:32 np0005625204.localdomain podman[87499]: 2026-02-20 08:26:32.261163124 +0000 UTC m=+0.180203383 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute)
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:26:32 np0005625204.localdomain podman[87494]: 2026-02-20 08:26:32.303986892 +0000 UTC m=+0.236780014 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vendor=Red Hat, 
Inc., url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:26:32 np0005625204.localdomain podman[87494]: 2026-02-20 08:26:32.35202439 +0000 UTC m=+0.284817492 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1)
Feb 20 08:26:32 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:26:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:26:36 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:26:36 np0005625204.localdomain recover_tripleo_nova_virtqemud[87585]: 63005
Feb 20 08:26:36 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:26:36 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:26:36 np0005625204.localdomain systemd[1]: tmp-crun.tD16SX.mount: Deactivated successfully.
Feb 20 08:26:36 np0005625204.localdomain podman[87583]: 2026-02-20 08:26:36.15039633 +0000 UTC m=+0.088853684 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, container_name=nova_migration_target, batch=17.1_20260112.1, 
url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510)
Feb 20 08:26:36 np0005625204.localdomain podman[87583]: 2026-02-20 08:26:36.520782813 +0000 UTC m=+0.459240147 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:26:36 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:26:36 np0005625204.localdomain sudo[87607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:26:36 np0005625204.localdomain sudo[87607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:26:36 np0005625204.localdomain sudo[87607]: pam_unix(sudo:session): session closed for user root
Feb 20 08:26:36 np0005625204.localdomain sudo[87622]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:26:36 np0005625204.localdomain sudo[87622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:26:37 np0005625204.localdomain sudo[87622]: pam_unix(sudo:session): session closed for user root
Feb 20 08:26:38 np0005625204.localdomain sudo[87669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:26:38 np0005625204.localdomain sudo[87669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:26:38 np0005625204.localdomain sudo[87669]: pam_unix(sudo:session): session closed for user root
Feb 20 08:26:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:26:54 np0005625204.localdomain podman[87685]: 2026-02-20 08:26:54.143686378 +0000 UTC m=+0.082230491 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:26:54 np0005625204.localdomain podman[87685]: 2026-02-20 08:26:54.321886599 +0000 UTC m=+0.260430682 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, 
io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:26:54 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:26:56 np0005625204.localdomain podman[87717]: 2026-02-20 08:26:56.152479655 +0000 UTC m=+0.079209388 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc.)
Feb 20 08:26:56 np0005625204.localdomain podman[87717]: 2026-02-20 08:26:56.192043941 +0000 UTC m=+0.118773634 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: tmp-crun.Oz76mY.mount: Deactivated successfully.
Feb 20 08:26:56 np0005625204.localdomain podman[87716]: 2026-02-20 08:26:56.326988892 +0000 UTC m=+0.254911322 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible)
Feb 20 08:26:56 np0005625204.localdomain podman[87716]: 2026-02-20 08:26:56.334142512 +0000 UTC m=+0.262064932 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:26:56 np0005625204.localdomain podman[87715]: 2026-02-20 08:26:56.29183719 +0000 UTC m=+0.223424982 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:26:56 np0005625204.localdomain podman[87715]: 2026-02-20 08:26:56.375212685 +0000 UTC m=+0.306800427 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-type=git)
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:26:56 np0005625204.localdomain podman[87719]: 2026-02-20 08:26:56.376888057 +0000 UTC m=+0.299210574 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510)
Feb 20 08:26:56 np0005625204.localdomain podman[87719]: 2026-02-20 08:26:56.460382665 +0000 UTC m=+0.382705182 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:26:56 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:26:57 np0005625204.localdomain systemd[1]: tmp-crun.EJJfhA.mount: Deactivated successfully.
Feb 20 08:26:57 np0005625204.localdomain sshd[87828]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:26:57 np0005625204.localdomain sshd[87828]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:27:03 np0005625204.localdomain podman[87854]: 2026-02-20 08:27:03.145817876 +0000 UTC m=+0.081378804 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, container_name=nova_compute, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:27:03 np0005625204.localdomain podman[87854]: 2026-02-20 08:27:03.181509344 +0000 UTC m=+0.117070332 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: tmp-crun.UZ2vVc.mount: Deactivated successfully.
Feb 20 08:27:03 np0005625204.localdomain podman[87853]: 2026-02-20 08:27:03.197830706 +0000 UTC m=+0.136622763 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z)
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:27:03 np0005625204.localdomain podman[87851]: 2026-02-20 08:27:03.250146405 +0000 UTC m=+0.190124959 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:27:03 np0005625204.localdomain podman[87852]: 2026-02-20 08:27:03.302304659 +0000 UTC m=+0.241511489 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, container_name=iscsid, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:27:03 np0005625204.localdomain podman[87853]: 2026-02-20 08:27:03.325709169 +0000 UTC m=+0.264501276 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, container_name=ovn_metadata_agent, architecture=x86_64)
Feb 20 08:27:03 np0005625204.localdomain podman[87851]: 2026-02-20 08:27:03.326105891 +0000 UTC m=+0.266084425 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, architecture=x86_64)
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:27:03 np0005625204.localdomain podman[87852]: 2026-02-20 08:27:03.341042891 +0000 UTC m=+0.280249721 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:27:03 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:27:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:27:07 np0005625204.localdomain podman[87950]: 2026-02-20 08:27:07.143103195 +0000 UTC m=+0.081856358 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:27:07 np0005625204.localdomain podman[87950]: 2026-02-20 08:27:07.511533167 +0000 UTC m=+0.450286340 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-type=git, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:27:07 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:27:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:27:25 np0005625204.localdomain systemd[1]: tmp-crun.OOV9RU.mount: Deactivated successfully.
Feb 20 08:27:25 np0005625204.localdomain podman[87973]: 2026-02-20 08:27:25.163115565 +0000 UTC m=+0.096568231 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, release=1766032510, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:27:25 np0005625204.localdomain podman[87973]: 2026-02-20 08:27:25.391617063 +0000 UTC m=+0.325069689 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:14Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:27:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: tmp-crun.y0M8Bn.mount: Deactivated successfully.
Feb 20 08:27:27 np0005625204.localdomain podman[88003]: 2026-02-20 08:27:27.150795932 +0000 UTC m=+0.088394250 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 20 08:27:27 np0005625204.localdomain podman[88005]: 2026-02-20 08:27:27.190858364 +0000 UTC m=+0.121973292 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=)
Feb 20 08:27:27 np0005625204.localdomain podman[88003]: 2026-02-20 08:27:27.198922012 +0000 UTC m=+0.136520360 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:27:27 np0005625204.localdomain podman[88005]: 2026-02-20 08:27:27.252559662 +0000 UTC m=+0.183674600 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, 
version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:27:27 np0005625204.localdomain podman[88009]: 2026-02-20 08:27:27.304885032 +0000 UTC m=+0.232797082 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:47Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:27:27 np0005625204.localdomain podman[88004]: 2026-02-20 08:27:27.259381462 +0000 UTC m=+0.189990455 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., architecture=x86_64)
Feb 20 08:27:27 np0005625204.localdomain podman[88004]: 2026-02-20 08:27:27.347984187 +0000 UTC m=+0.278593120 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510)
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:27:27 np0005625204.localdomain podman[88009]: 2026-02-20 08:27:27.402794393 +0000 UTC m=+0.330706483 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:27:27 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:27:34 np0005625204.localdomain podman[88097]: 2026-02-20 08:27:34.143291817 +0000 UTC m=+0.079718982 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:27:34 np0005625204.localdomain podman[88097]: 2026-02-20 08:27:34.165987955 +0000 UTC m=+0.102415100 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: tmp-crun.tTiZnP.mount: Deactivated successfully.
Feb 20 08:27:34 np0005625204.localdomain podman[88098]: 2026-02-20 08:27:34.256883042 +0000 UTC m=+0.190474460 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, vendor=Red Hat, 
Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, container_name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:27:34 np0005625204.localdomain podman[88098]: 2026-02-20 08:27:34.289340839 +0000 UTC m=+0.222932197 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, release=1766032510, architecture=x86_64)
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:27:34 np0005625204.localdomain podman[88100]: 2026-02-20 08:27:34.30560636 +0000 UTC m=+0.234637568 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:27:34 np0005625204.localdomain podman[88099]: 2026-02-20 08:27:34.356734402 +0000 UTC m=+0.288593437 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 20 08:27:34 np0005625204.localdomain podman[88099]: 2026-02-20 08:27:34.391920715 +0000 UTC m=+0.323779660 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:27:34 np0005625204.localdomain podman[88100]: 2026-02-20 08:27:34.40966567 +0000 UTC m=+0.338696888 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 20 08:27:34 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:27:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:27:38 np0005625204.localdomain podman[88189]: 2026-02-20 08:27:38.148730617 +0000 UTC m=+0.088337239 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:27:38 np0005625204.localdomain sudo[88212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:27:38 np0005625204.localdomain sudo[88212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:27:38 np0005625204.localdomain sudo[88212]: pam_unix(sudo:session): session closed for user root
Feb 20 08:27:38 np0005625204.localdomain sudo[88228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:27:38 np0005625204.localdomain sudo[88228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:27:38 np0005625204.localdomain podman[88189]: 2026-02-20 08:27:38.508206263 +0000 UTC m=+0.447812855 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:27:38 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:27:39 np0005625204.localdomain sudo[88228]: pam_unix(sudo:session): session closed for user root
Feb 20 08:27:39 np0005625204.localdomain sudo[88275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:27:39 np0005625204.localdomain sudo[88275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:27:39 np0005625204.localdomain sudo[88275]: pam_unix(sudo:session): session closed for user root
Feb 20 08:27:43 np0005625204.localdomain sshd[88290]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:27:43 np0005625204.localdomain sshd[88290]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:27:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:27:56 np0005625204.localdomain podman[88292]: 2026-02-20 08:27:56.15584944 +0000 UTC m=+0.091027732 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:27:56 np0005625204.localdomain podman[88292]: 2026-02-20 08:27:56.378060274 +0000 UTC m=+0.313238496 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 20 08:27:56 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:27:58 np0005625204.localdomain podman[88367]: 2026-02-20 08:27:58.159381654 +0000 UTC m=+0.090211816 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:27:58 np0005625204.localdomain podman[88370]: 2026-02-20 08:27:58.207016889 +0000 UTC m=+0.129032020 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 20 08:27:58 np0005625204.localdomain podman[88367]: 2026-02-20 08:27:58.211962681 +0000 UTC m=+0.142792773 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git)
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: tmp-crun.V20X3Z.mount: Deactivated successfully.
Feb 20 08:27:58 np0005625204.localdomain podman[88368]: 2026-02-20 08:27:58.257937645 +0000 UTC m=+0.187483928 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:27:58 np0005625204.localdomain podman[88370]: 2026-02-20 08:27:58.267936153 +0000 UTC m=+0.189951244 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:27:58 np0005625204.localdomain podman[88368]: 2026-02-20 08:27:58.295125209 +0000 UTC m=+0.224671482 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, architecture=x86_64, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, distribution-scope=public, container_name=logrotate_crond, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:27:58 np0005625204.localdomain podman[88369]: 2026-02-20 08:27:58.312324528 +0000 UTC m=+0.236846356 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd)
Feb 20 08:27:58 np0005625204.localdomain podman[88369]: 2026-02-20 08:27:58.323853983 +0000 UTC m=+0.248375811 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5)
Feb 20 08:27:58 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: tmp-crun.ACF79h.mount: Deactivated successfully.
Feb 20 08:28:05 np0005625204.localdomain podman[88459]: 2026-02-20 08:28:05.146838934 +0000 UTC m=+0.086140540 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, container_name=iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:28:05 np0005625204.localdomain podman[88459]: 2026-02-20 08:28:05.150700584 +0000 UTC m=+0.090002180 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:28:05 np0005625204.localdomain podman[88460]: 2026-02-20 08:28:05.195943045 +0000 UTC m=+0.133886529 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, vcs-type=git, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 20 08:28:05 np0005625204.localdomain podman[88458]: 2026-02-20 08:28:05.242578159 +0000 UTC m=+0.183362091 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, build-date=2026-01-12T22:36:40Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, version=17.1.13)
Feb 20 08:28:05 np0005625204.localdomain podman[88461]: 2026-02-20 08:28:05.163070174 +0000 UTC m=+0.101063850 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step5, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git)
Feb 20 08:28:05 np0005625204.localdomain podman[88458]: 2026-02-20 08:28:05.295068673 +0000 UTC m=+0.235852625 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, version=17.1.13, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=ovn_controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git)
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:28:05 np0005625204.localdomain podman[88460]: 2026-02-20 08:28:05.31640942 +0000 UTC m=+0.254352964 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, config_id=tripleo_step4, release=1766032510, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:28:05 np0005625204.localdomain podman[88461]: 2026-02-20 08:28:05.346035531 +0000 UTC m=+0.284029237 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:28:05 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:28:06 np0005625204.localdomain systemd[1]: tmp-crun.SmFgiz.mount: Deactivated successfully.
Feb 20 08:28:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:28:09 np0005625204.localdomain systemd[1]: tmp-crun.aWNM79.mount: Deactivated successfully.
Feb 20 08:28:09 np0005625204.localdomain podman[88551]: 2026-02-20 08:28:09.139889133 +0000 UTC m=+0.082284432 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:28:09 np0005625204.localdomain podman[88551]: 2026-02-20 08:28:09.506097456 +0000 UTC m=+0.448492745 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4)
Feb 20 08:28:09 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:28:14 np0005625204.localdomain sshd[88575]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:28:15 np0005625204.localdomain sshd[88575]: Invalid user n8n from 101.36.109.176 port 53646
Feb 20 08:28:15 np0005625204.localdomain sshd[88575]: Received disconnect from 101.36.109.176 port 53646:11: Bye Bye [preauth]
Feb 20 08:28:15 np0005625204.localdomain sshd[88575]: Disconnected from invalid user n8n 101.36.109.176 port 53646 [preauth]
Feb 20 08:28:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:28:27 np0005625204.localdomain systemd[1]: tmp-crun.0b9wfA.mount: Deactivated successfully.
Feb 20 08:28:27 np0005625204.localdomain podman[88577]: 2026-02-20 08:28:27.147569923 +0000 UTC m=+0.086204022 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, container_name=metrics_qdr, distribution-scope=public, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:28:27 np0005625204.localdomain podman[88577]: 2026-02-20 08:28:27.346347667 +0000 UTC m=+0.284981756 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, distribution-scope=public, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:28:27 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:28:28 np0005625204.localdomain sshd[88606]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:28:28 np0005625204.localdomain sshd[88606]: Invalid user claude from 83.235.16.111 port 48440
Feb 20 08:28:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:28:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:28:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:28:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: tmp-crun.RBtN3v.mount: Deactivated successfully.
Feb 20 08:28:29 np0005625204.localdomain podman[88608]: 2026-02-20 08:28:29.028791975 +0000 UTC m=+0.080786806 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 20 08:28:29 np0005625204.localdomain podman[88616]: 2026-02-20 08:28:29.078179895 +0000 UTC m=+0.118124665 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, container_name=ceilometer_agent_compute)
Feb 20 08:28:29 np0005625204.localdomain sshd[88606]: Received disconnect from 83.235.16.111 port 48440:11: Bye Bye [preauth]
Feb 20 08:28:29 np0005625204.localdomain sshd[88606]: Disconnected from invalid user claude 83.235.16.111 port 48440 [preauth]
Feb 20 08:28:29 np0005625204.localdomain podman[88609]: 2026-02-20 08:28:29.086404058 +0000 UTC m=+0.129061741 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron)
Feb 20 08:28:29 np0005625204.localdomain podman[88616]: 2026-02-20 08:28:29.101243274 +0000 UTC m=+0.141188064 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1)
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:28:29 np0005625204.localdomain podman[88608]: 2026-02-20 08:28:29.11702965 +0000 UTC m=+0.169024481 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:28:29 np0005625204.localdomain podman[88609]: 2026-02-20 08:28:29.121420324 +0000 UTC m=+0.164078017 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:28:29 np0005625204.localdomain podman[88615]: 2026-02-20 08:28:29.054449414 +0000 UTC m=+0.095644143 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, distribution-scope=public, container_name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:28:29 np0005625204.localdomain podman[88615]: 2026-02-20 08:28:29.183211535 +0000 UTC m=+0.224406264 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:28:29 np0005625204.localdomain sshd[88697]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:28:29 np0005625204.localdomain recover_tripleo_nova_virtqemud[88698]: 63005
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:28:29 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:28:29 np0005625204.localdomain sshd[88697]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:28:30 np0005625204.localdomain systemd[1]: tmp-crun.rywsdY.mount: Deactivated successfully.
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:28:36 np0005625204.localdomain systemd[85653]: Created slice User Background Tasks Slice.
Feb 20 08:28:36 np0005625204.localdomain systemd[85653]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 08:28:36 np0005625204.localdomain podman[88701]: 2026-02-20 08:28:36.173654807 +0000 UTC m=+0.099648356 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, 
name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 20 08:28:36 np0005625204.localdomain systemd[85653]: Finished Cleanup of User's Temporary Files and Directories.
Feb 20 08:28:36 np0005625204.localdomain podman[88701]: 2026-02-20 08:28:36.189119152 +0000 UTC m=+0.115112711 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:28:36 np0005625204.localdomain podman[88702]: 2026-02-20 08:28:36.268310459 +0000 UTC m=+0.188295023 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4)
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: tmp-crun.G0pL8Z.mount: Deactivated successfully.
Feb 20 08:28:36 np0005625204.localdomain podman[88703]: 2026-02-20 08:28:36.325704444 +0000 UTC m=+0.245388119 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team)
Feb 20 08:28:36 np0005625204.localdomain podman[88700]: 2026-02-20 08:28:36.364098895 +0000 UTC m=+0.293311793 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1766032510, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, 
io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:28:36 np0005625204.localdomain podman[88703]: 2026-02-20 08:28:36.383064227 +0000 UTC m=+0.302747893 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true)
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:28:36 np0005625204.localdomain podman[88702]: 2026-02-20 08:28:36.394013334 +0000 UTC m=+0.313997868 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:28:36 np0005625204.localdomain podman[88700]: 2026-02-20 08:28:36.437820552 +0000 UTC m=+0.367033490 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:28:36 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:28:39 np0005625204.localdomain sudo[88792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:28:39 np0005625204.localdomain sudo[88792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:28:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:28:39 np0005625204.localdomain sudo[88792]: pam_unix(sudo:session): session closed for user root
Feb 20 08:28:40 np0005625204.localdomain sudo[88813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:28:40 np0005625204.localdomain sudo[88813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:28:40 np0005625204.localdomain podman[88806]: 2026-02-20 08:28:40.060088526 +0000 UTC m=+0.096639173 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:28:40 np0005625204.localdomain podman[88806]: 2026-02-20 08:28:40.444483859 +0000 UTC m=+0.481034576 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:28:40 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:28:40 np0005625204.localdomain sudo[88813]: pam_unix(sudo:session): session closed for user root
Feb 20 08:28:41 np0005625204.localdomain sudo[88876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:28:41 np0005625204.localdomain sudo[88876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:28:41 np0005625204.localdomain sudo[88876]: pam_unix(sudo:session): session closed for user root
Feb 20 08:28:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:28:58 np0005625204.localdomain podman[88936]: 2026-02-20 08:28:58.126544703 +0000 UTC m=+0.063476734 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z)
Feb 20 08:28:58 np0005625204.localdomain podman[88936]: 2026-02-20 08:28:58.340141943 +0000 UTC m=+0.277073974 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64)
Feb 20 08:28:58 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: tmp-crun.B21YD0.mount: Deactivated successfully.
Feb 20 08:29:00 np0005625204.localdomain podman[88967]: 2026-02-20 08:29:00.157859862 +0000 UTC m=+0.089471273 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510)
Feb 20 08:29:00 np0005625204.localdomain podman[88967]: 2026-02-20 08:29:00.17144704 +0000 UTC m=+0.103058451 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true)
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:29:00 np0005625204.localdomain podman[88973]: 2026-02-20 08:29:00.260760357 +0000 UTC m=+0.187147207 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 20 08:29:00 np0005625204.localdomain podman[88965]: 2026-02-20 08:29:00.310656502 +0000 UTC m=+0.248685440 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible)
Feb 20 08:29:00 np0005625204.localdomain podman[88973]: 2026-02-20 08:29:00.314427628 +0000 UTC m=+0.240814448 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:29:00 np0005625204.localdomain podman[88966]: 2026-02-20 08:29:00.359501604 +0000 UTC m=+0.293629942 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:29:00 np0005625204.localdomain podman[88965]: 2026-02-20 08:29:00.365939103 +0000 UTC m=+0.303968051 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:29:00 np0005625204.localdomain podman[88966]: 2026-02-20 08:29:00.420445459 +0000 UTC m=+0.354573797 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, 
distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.13, com.redhat.component=openstack-cron-container, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc.)
Feb 20 08:29:00 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: tmp-crun.L3zaiK.mount: Deactivated successfully.
Feb 20 08:29:07 np0005625204.localdomain podman[89056]: 2026-02-20 08:29:07.15304747 +0000 UTC m=+0.090747261 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:29:07 np0005625204.localdomain podman[89056]: 2026-02-20 08:29:07.235941561 +0000 UTC m=+0.173641342 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, architecture=x86_64)
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:29:07 np0005625204.localdomain podman[89058]: 2026-02-20 08:29:07.247236818 +0000 UTC m=+0.135628822 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 20 08:29:07 np0005625204.localdomain podman[89057]: 2026-02-20 08:29:07.217534384 +0000 UTC m=+0.152748339 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:29:07 np0005625204.localdomain podman[89059]: 2026-02-20 08:29:07.187927224 +0000 UTC m=+0.113636457 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:29:07 np0005625204.localdomain podman[89057]: 2026-02-20 08:29:07.29769688 +0000 UTC m=+0.232910835 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:29:07 np0005625204.localdomain podman[89059]: 2026-02-20 08:29:07.322161273 +0000 UTC m=+0.247870526 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:29:07 np0005625204.localdomain podman[89058]: 2026-02-20 08:29:07.354712653 +0000 UTC m=+0.243104637 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:29:07 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:29:08 np0005625204.localdomain systemd[1]: tmp-crun.cuT8lv.mount: Deactivated successfully.
Feb 20 08:29:09 np0005625204.localdomain sshd[89147]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:29:10 np0005625204.localdomain sshd[89147]: Invalid user n8n from 77.232.138.190 port 55412
Feb 20 08:29:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:29:10 np0005625204.localdomain podman[89149]: 2026-02-20 08:29:10.594059802 +0000 UTC m=+0.083669825 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:29:10 np0005625204.localdomain sshd[89147]: Received disconnect from 77.232.138.190 port 55412:11: Bye Bye [preauth]
Feb 20 08:29:10 np0005625204.localdomain sshd[89147]: Disconnected from invalid user n8n 77.232.138.190 port 55412 [preauth]
Feb 20 08:29:11 np0005625204.localdomain podman[89149]: 2026-02-20 08:29:11.038155377 +0000 UTC m=+0.527765370 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:29:11 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:29:14 np0005625204.localdomain sshd[89172]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:29:15 np0005625204.localdomain sshd[89172]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:29:27 np0005625204.localdomain sshd[89174]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:29:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:29:29 np0005625204.localdomain systemd[1]: tmp-crun.Yjsavt.mount: Deactivated successfully.
Feb 20 08:29:29 np0005625204.localdomain podman[89176]: 2026-02-20 08:29:29.162954571 +0000 UTC m=+0.100988589 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, io.openshift.expose-services=, 
konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 20 08:29:29 np0005625204.localdomain podman[89176]: 2026-02-20 08:29:29.355922548 +0000 UTC m=+0.293956486 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:29:29 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:29:29 np0005625204.localdomain sshd[89174]: Received disconnect from 152.32.189.21 port 43462:11: Bye Bye [preauth]
Feb 20 08:29:29 np0005625204.localdomain sshd[89174]: Disconnected from authenticating user root 152.32.189.21 port 43462 [preauth]
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:29:31 np0005625204.localdomain podman[89208]: 2026-02-20 08:29:31.144167674 +0000 UTC m=+0.078125635 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:29:31 np0005625204.localdomain podman[89206]: 2026-02-20 08:29:31.153514643 +0000 UTC m=+0.088632919 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, container_name=ceilometer_agent_ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:07:30Z, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 20 08:29:31 np0005625204.localdomain podman[89207]: 2026-02-20 08:29:31.208818264 +0000 UTC m=+0.142572969 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, 
managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5)
Feb 20 08:29:31 np0005625204.localdomain podman[89206]: 2026-02-20 08:29:31.220008788 +0000 UTC m=+0.155127074 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:29:31 np0005625204.localdomain podman[89208]: 2026-02-20 08:29:31.234158973 +0000 UTC m=+0.168116934 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, 
distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-collectd-container, release=1766032510, tcib_managed=true, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:29:31 np0005625204.localdomain podman[89207]: 2026-02-20 08:29:31.245866814 +0000 UTC m=+0.179621579 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:29:31 np0005625204.localdomain podman[89209]: 2026-02-20 08:29:31.222825935 +0000 UTC m=+0.151209464 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:29:31 np0005625204.localdomain podman[89209]: 2026-02-20 08:29:31.308027747 +0000 UTC m=+0.236411286 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5)
Feb 20 08:29:31 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:29:32 np0005625204.localdomain systemd[1]: tmp-crun.jOJB1X.mount: Deactivated successfully.
Feb 20 08:29:36 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:29:36 np0005625204.localdomain recover_tripleo_nova_virtqemud[89298]: 63005
Feb 20 08:29:36 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:29:36 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: tmp-crun.0tu98f.mount: Deactivated successfully.
Feb 20 08:29:38 np0005625204.localdomain podman[89299]: 2026-02-20 08:29:38.161569548 +0000 UTC m=+0.092244080 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, 
config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:29:38 np0005625204.localdomain podman[89301]: 2026-02-20 08:29:38.199965189 +0000 UTC m=+0.121877571 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=)
Feb 20 08:29:38 np0005625204.localdomain podman[89299]: 2026-02-20 08:29:38.207969026 +0000 UTC m=+0.138643558 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1)
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:29:38 np0005625204.localdomain podman[89301]: 2026-02-20 08:29:38.267533668 +0000 UTC m=+0.189446040 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_id=tripleo_step4)
Feb 20 08:29:38 np0005625204.localdomain podman[89303]: 2026-02-20 08:29:38.275727481 +0000 UTC m=+0.197819919 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.expose-services=, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:29:38 np0005625204.localdomain podman[89300]: 2026-02-20 08:29:38.326919146 +0000 UTC m=+0.250120668 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, 
build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public)
Feb 20 08:29:38 np0005625204.localdomain podman[89300]: 2026-02-20 08:29:38.360118887 +0000 UTC m=+0.283320369 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, 
tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510)
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:29:38 np0005625204.localdomain podman[89303]: 2026-02-20 08:29:38.380806744 +0000 UTC m=+0.302899182 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:29:38 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:29:41 np0005625204.localdomain sudo[89393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:29:41 np0005625204.localdomain sudo[89393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:29:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:29:41 np0005625204.localdomain sudo[89393]: pam_unix(sudo:session): session closed for user root
Feb 20 08:29:41 np0005625204.localdomain sudo[89409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:29:41 np0005625204.localdomain sudo[89409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:29:41 np0005625204.localdomain systemd[1]: tmp-crun.bt2Z3k.mount: Deactivated successfully.
Feb 20 08:29:41 np0005625204.localdomain podman[89408]: 2026-02-20 08:29:41.696198152 +0000 UTC m=+0.088862656 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, distribution-scope=public, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:29:42 np0005625204.localdomain podman[89408]: 2026-02-20 08:29:42.072290144 +0000 UTC m=+0.464954638 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, release=1766032510, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:29:42 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:29:42 np0005625204.localdomain sudo[89409]: pam_unix(sudo:session): session closed for user root
Feb 20 08:29:45 np0005625204.localdomain sudo[89477]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:29:45 np0005625204.localdomain sudo[89477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:29:45 np0005625204.localdomain sudo[89477]: pam_unix(sudo:session): session closed for user root
Feb 20 08:29:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:29:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 476 writes, 1864 keys, 476 commit groups, 1.0 writes per commit group, ingest: 2.57 MB, 0.00 MB/s
                                                          Interval WAL: 476 writes, 169 syncs, 2.82 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:29:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:29:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 522 writes, 1999 keys, 522 commit groups, 1.0 writes per commit group, ingest: 2.25 MB, 0.00 MB/s
                                                          Interval WAL: 522 writes, 182 syncs, 2.87 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:29:53 np0005625204.localdomain sshd[89492]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:29:54 np0005625204.localdomain sshd[89492]: Invalid user ts2 from 178.217.173.50 port 38758
Feb 20 08:29:54 np0005625204.localdomain sshd[89492]: Received disconnect from 178.217.173.50 port 38758:11: Bye Bye [preauth]
Feb 20 08:29:54 np0005625204.localdomain sshd[89492]: Disconnected from invalid user ts2 178.217.173.50 port 38758 [preauth]
Feb 20 08:29:58 np0005625204.localdomain sshd[89539]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:29:58 np0005625204.localdomain sshd[89539]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:30:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:30:00 np0005625204.localdomain podman[89541]: 2026-02-20 08:30:00.149213697 +0000 UTC m=+0.087754891 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:30:00 np0005625204.localdomain podman[89541]: 2026-02-20 08:30:00.36802533 +0000 UTC m=+0.306566474 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:30:00 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: tmp-crun.Fb208A.mount: Deactivated successfully.
Feb 20 08:30:02 np0005625204.localdomain podman[89571]: 2026-02-20 08:30:02.203847611 +0000 UTC m=+0.143543649 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:30:02 np0005625204.localdomain podman[89573]: 2026-02-20 08:30:02.212436816 +0000 UTC m=+0.143301173 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 
ceilometer-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5)
Feb 20 08:30:02 np0005625204.localdomain podman[89572]: 2026-02-20 08:30:02.251832008 +0000 UTC m=+0.187668547 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:30:02 np0005625204.localdomain podman[89571]: 2026-02-20 08:30:02.265336153 +0000 UTC m=+0.205032191 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 20 08:30:02 np0005625204.localdomain podman[89570]: 2026-02-20 08:30:02.170536326 +0000 UTC m=+0.110959996 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.)
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:30:02 np0005625204.localdomain podman[89572]: 2026-02-20 08:30:02.293092807 +0000 UTC m=+0.228929326 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1766032510, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:30:02 np0005625204.localdomain podman[89570]: 2026-02-20 08:30:02.305449518 +0000 UTC m=+0.245873168 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-type=git)
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:30:02 np0005625204.localdomain podman[89573]: 2026-02-20 08:30:02.31951224 +0000 UTC m=+0.250376607 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 
17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:30:02 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:30:03 np0005625204.localdomain systemd[1]: tmp-crun.wtxT2t.mount: Deactivated successfully.
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: tmp-crun.vCYBVE.mount: Deactivated successfully.
Feb 20 08:30:09 np0005625204.localdomain podman[89661]: 2026-02-20 08:30:09.203118234 +0000 UTC m=+0.139622418 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 
ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:30:09 np0005625204.localdomain podman[89661]: 2026-02-20 08:30:09.224072768 +0000 UTC m=+0.160576952 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, tcib_managed=true, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:30:09 np0005625204.localdomain podman[89663]: 2026-02-20 08:30:09.267820955 +0000 UTC m=+0.194340561 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:30:09 np0005625204.localdomain podman[89662]: 2026-02-20 08:30:09.184052977 +0000 UTC m=+0.115551116 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, container_name=iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:30:09 np0005625204.localdomain podman[89669]: 2026-02-20 08:30:09.312953803 +0000 UTC m=+0.236759955 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:30:09 np0005625204.localdomain podman[89669]: 2026-02-20 08:30:09.342884845 +0000 UTC m=+0.266690997 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:30:09 np0005625204.localdomain podman[89662]: 2026-02-20 08:30:09.36680797 +0000 UTC m=+0.298306129 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team)
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:30:09 np0005625204.localdomain podman[89663]: 2026-02-20 08:30:09.393352028 +0000 UTC m=+0.319871634 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:30:09 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:30:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:30:13 np0005625204.localdomain podman[89757]: 2026-02-20 08:30:13.145306798 +0000 UTC m=+0.085296015 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Feb 20 08:30:13 np0005625204.localdomain podman[89757]: 2026-02-20 08:30:13.509386331 +0000 UTC m=+0.449375638 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.buildah.version=1.41.5, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:30:13 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:30:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:30:31 np0005625204.localdomain podman[89780]: 2026-02-20 08:30:31.127527396 +0000 UTC m=+0.068014784 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 20 08:30:31 np0005625204.localdomain podman[89780]: 2026-02-20 08:30:31.319966837 +0000 UTC m=+0.260454285 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:30:31 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:30:33 np0005625204.localdomain podman[89811]: 2026-02-20 08:30:33.134158592 +0000 UTC m=+0.068224371 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git)
Feb 20 08:30:33 np0005625204.localdomain podman[89811]: 2026-02-20 08:30:33.142892571 +0000 UTC m=+0.076958390 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, architecture=x86_64)
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:30:33 np0005625204.localdomain podman[89809]: 2026-02-20 08:30:33.156158018 +0000 UTC m=+0.093886909 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:30:33 np0005625204.localdomain podman[89809]: 2026-02-20 08:30:33.207173499 +0000 UTC m=+0.144902360 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, 
io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:30:33 np0005625204.localdomain podman[89810]: 2026-02-20 08:30:33.20952931 +0000 UTC m=+0.142741483 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.component=openstack-cron-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:30:33 np0005625204.localdomain podman[89810]: 2026-02-20 08:30:33.295409123 +0000 UTC m=+0.228621246 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, container_name=logrotate_crond, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:30:33 np0005625204.localdomain podman[89815]: 2026-02-20 08:30:33.266892976 +0000 UTC m=+0.195173587 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 20 08:30:33 np0005625204.localdomain podman[89815]: 2026-02-20 08:30:33.349939462 +0000 UTC m=+0.278220143 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute)
Feb 20 08:30:33 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:30:40 np0005625204.localdomain podman[89903]: 2026-02-20 08:30:40.157045391 +0000 UTC m=+0.086225884 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:30:40 np0005625204.localdomain podman[89903]: 2026-02-20 08:30:40.169007359 +0000 UTC m=+0.098187912 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5)
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: tmp-crun.MoTSp5.mount: Deactivated successfully.
Feb 20 08:30:40 np0005625204.localdomain podman[89905]: 2026-02-20 08:30:40.226006873 +0000 UTC m=+0.147767188 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Feb 20 08:30:40 np0005625204.localdomain podman[89905]: 2026-02-20 08:30:40.252982523 +0000 UTC m=+0.174742898 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true)
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:30:40 np0005625204.localdomain podman[89902]: 2026-02-20 08:30:40.301463145 +0000 UTC m=+0.232458714 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, container_name=ovn_controller, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:30:40 np0005625204.localdomain podman[89902]: 2026-02-20 08:30:40.352094293 +0000 UTC m=+0.283089872 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com)
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:30:40 np0005625204.localdomain podman[89904]: 2026-02-20 08:30:40.362912296 +0000 UTC m=+0.289124618 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_metadata_agent, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4)
Feb 20 08:30:40 np0005625204.localdomain podman[89904]: 2026-02-20 08:30:40.406976292 +0000 UTC m=+0.333188604 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:30:40 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:30:43 np0005625204.localdomain sshd[89996]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:30:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:30:44 np0005625204.localdomain systemd[1]: tmp-crun.Tr43b0.mount: Deactivated successfully.
Feb 20 08:30:44 np0005625204.localdomain podman[89998]: 2026-02-20 08:30:44.150698999 +0000 UTC m=+0.090137104 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:30:44 np0005625204.localdomain sshd[89996]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:30:44 np0005625204.localdomain podman[89998]: 2026-02-20 08:30:44.511861853 +0000 UTC m=+0.451299998 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, container_name=nova_migration_target, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:30:44 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:30:45 np0005625204.localdomain sudo[90022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:30:45 np0005625204.localdomain sudo[90022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:30:45 np0005625204.localdomain sudo[90022]: pam_unix(sudo:session): session closed for user root
Feb 20 08:30:45 np0005625204.localdomain sudo[90037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 08:30:45 np0005625204.localdomain sudo[90037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:30:46 np0005625204.localdomain sudo[90037]: pam_unix(sudo:session): session closed for user root
Feb 20 08:30:46 np0005625204.localdomain sudo[90074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:30:46 np0005625204.localdomain sudo[90074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:30:46 np0005625204.localdomain sudo[90074]: pam_unix(sudo:session): session closed for user root
Feb 20 08:30:46 np0005625204.localdomain sudo[90089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:30:46 np0005625204.localdomain sudo[90089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:30:46 np0005625204.localdomain sudo[90089]: pam_unix(sudo:session): session closed for user root
Feb 20 08:30:47 np0005625204.localdomain sudo[90136]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:30:47 np0005625204.localdomain sudo[90136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:30:47 np0005625204.localdomain sudo[90136]: pam_unix(sudo:session): session closed for user root
Feb 20 08:30:56 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:30:56 np0005625204.localdomain recover_tripleo_nova_virtqemud[90152]: 63005
Feb 20 08:30:56 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:30:56 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:31:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:31:02 np0005625204.localdomain sshd[90210]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:31:02 np0005625204.localdomain systemd[1]: tmp-crun.4j52RB.mount: Deactivated successfully.
Feb 20 08:31:02 np0005625204.localdomain podman[90198]: 2026-02-20 08:31:02.153022177 +0000 UTC m=+0.090037461 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
build-date=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 20 08:31:02 np0005625204.localdomain podman[90198]: 2026-02-20 08:31:02.360993407 +0000 UTC m=+0.298008641 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:31:02 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:31:02 np0005625204.localdomain sshd[90210]: Invalid user  from 194.187.176.243 port 28892
Feb 20 08:31:02 np0005625204.localdomain sshd[90210]: Connection closed by invalid user  194.187.176.243 port 28892 [preauth]
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: tmp-crun.PZpOP7.mount: Deactivated successfully.
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: tmp-crun.4gQc0J.mount: Deactivated successfully.
Feb 20 08:31:04 np0005625204.localdomain podman[90241]: 2026-02-20 08:31:04.195209168 +0000 UTC m=+0.120997075 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:31:04 np0005625204.localdomain podman[90229]: 2026-02-20 08:31:04.149284064 +0000 UTC m=+0.087769041 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible)
Feb 20 08:31:04 np0005625204.localdomain podman[90228]: 2026-02-20 08:31:04.215891965 +0000 UTC m=+0.153359421 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:31:04 np0005625204.localdomain podman[90229]: 2026-02-20 08:31:04.231990749 +0000 UTC m=+0.170475686 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, release=1766032510, vendor=Red Hat, Inc., 
io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=logrotate_crond)
Feb 20 08:31:04 np0005625204.localdomain podman[90228]: 2026-02-20 08:31:04.241834983 +0000 UTC m=+0.179302389 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:31:04 np0005625204.localdomain podman[90241]: 2026-02-20 08:31:04.242373169 +0000 UTC m=+0.168161066 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:31:04 np0005625204.localdomain podman[90230]: 2026-02-20 08:31:04.183967353 +0000 UTC m=+0.115073614 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, 
container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:31:04 np0005625204.localdomain podman[90230]: 2026-02-20 08:31:04.313517017 +0000 UTC m=+0.244623328 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_id=tripleo_step3)
Feb 20 08:31:04 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:31:10 np0005625204.localdomain sshd[90323]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: tmp-crun.utdfF3.mount: Deactivated successfully.
Feb 20 08:31:11 np0005625204.localdomain podman[90328]: 2026-02-20 08:31:11.169718777 +0000 UTC m=+0.100732691 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_compute)
Feb 20 08:31:11 np0005625204.localdomain podman[90326]: 2026-02-20 08:31:11.211027798 +0000 UTC m=+0.147277483 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1766032510, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 20 08:31:11 np0005625204.localdomain podman[90326]: 2026-02-20 08:31:11.219846139 +0000 UTC m=+0.156095844 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:31:11 np0005625204.localdomain podman[90328]: 2026-02-20 08:31:11.305203635 +0000 UTC m=+0.236217489 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com)
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:31:11 np0005625204.localdomain podman[90325]: 2026-02-20 08:31:11.35735175 +0000 UTC m=+0.295116231 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, 
build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1)
Feb 20 08:31:11 np0005625204.localdomain podman[90327]: 2026-02-20 08:31:11.408134613 +0000 UTC m=+0.338769515 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:31:11 np0005625204.localdomain podman[90325]: 2026-02-20 08:31:11.437144955 +0000 UTC m=+0.374909396 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64)
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:31:11 np0005625204.localdomain podman[90327]: 2026-02-20 08:31:11.483330747 +0000 UTC m=+0.413965659 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 20 08:31:11 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:31:13 np0005625204.localdomain sshd[90323]: Invalid user iksi from 103.157.25.4 port 59230
Feb 20 08:31:13 np0005625204.localdomain sshd[90323]: Received disconnect from 103.157.25.4 port 59230:11: Bye Bye [preauth]
Feb 20 08:31:13 np0005625204.localdomain sshd[90323]: Disconnected from invalid user iksi 103.157.25.4 port 59230 [preauth]
Feb 20 08:31:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:31:15 np0005625204.localdomain podman[90414]: 2026-02-20 08:31:15.141962958 +0000 UTC m=+0.077731243 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5)
Feb 20 08:31:15 np0005625204.localdomain podman[90414]: 2026-02-20 08:31:15.488022956 +0000 UTC m=+0.423791211 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target)
Feb 20 08:31:15 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:31:25 np0005625204.localdomain sshd[90437]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:31:26 np0005625204.localdomain sshd[90437]: Invalid user ts2 from 101.36.109.176 port 48462
Feb 20 08:31:26 np0005625204.localdomain sshd[90437]: Received disconnect from 101.36.109.176 port 48462:11: Bye Bye [preauth]
Feb 20 08:31:26 np0005625204.localdomain sshd[90437]: Disconnected from invalid user ts2 101.36.109.176 port 48462 [preauth]
Feb 20 08:31:27 np0005625204.localdomain sshd[90439]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:31:28 np0005625204.localdomain sshd[90441]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:31:28 np0005625204.localdomain sshd[90441]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:31:28 np0005625204.localdomain sshd[90439]: Invalid user viewtinet from 83.235.16.111 port 54396
Feb 20 08:31:28 np0005625204.localdomain sshd[90439]: Received disconnect from 83.235.16.111 port 54396:11: Bye Bye [preauth]
Feb 20 08:31:28 np0005625204.localdomain sshd[90439]: Disconnected from invalid user viewtinet 83.235.16.111 port 54396 [preauth]
Feb 20 08:31:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:31:33 np0005625204.localdomain systemd[1]: tmp-crun.Z6uuEL.mount: Deactivated successfully.
Feb 20 08:31:33 np0005625204.localdomain podman[90443]: 2026-02-20 08:31:33.144305097 +0000 UTC m=+0.083823820 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 20 08:31:33 np0005625204.localdomain podman[90443]: 2026-02-20 08:31:33.363014607 +0000 UTC m=+0.302533260 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13)
Feb 20 08:31:33 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:31:35 np0005625204.localdomain podman[90473]: 2026-02-20 08:31:35.129321118 +0000 UTC m=+0.070012525 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:31:35 np0005625204.localdomain podman[90474]: 2026-02-20 08:31:35.152046577 +0000 UTC m=+0.086350078 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond)
Feb 20 08:31:35 np0005625204.localdomain podman[90475]: 2026-02-20 08:31:35.210794545 +0000 UTC m=+0.141573278 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Feb 20 08:31:35 np0005625204.localdomain podman[90473]: 2026-02-20 08:31:35.258817873 +0000 UTC m=+0.199509290 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:31:35 np0005625204.localdomain podman[90481]: 2026-02-20 08:31:35.268322295 +0000 UTC m=+0.195005981 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:31:35 np0005625204.localdomain podman[90474]: 2026-02-20 08:31:35.290390044 +0000 UTC m=+0.224693575 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, 
build-date=2026-01-12T22:10:15Z, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:31:35 np0005625204.localdomain podman[90475]: 2026-02-20 08:31:35.298213945 +0000 UTC m=+0.228992718 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, container_name=collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, build-date=2026-01-12T22:10:15Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:31:35 np0005625204.localdomain podman[90481]: 2026-02-20 08:31:35.34742996 +0000 UTC m=+0.274113686 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:31:35 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: tmp-crun.EoiROg.mount: Deactivated successfully.
Feb 20 08:31:42 np0005625204.localdomain podman[90566]: 2026-02-20 08:31:42.510090198 +0000 UTC m=+0.091398683 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:31:42 np0005625204.localdomain podman[90564]: 2026-02-20 08:31:42.5725371 +0000 UTC m=+0.160173660 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:31:42 np0005625204.localdomain podman[90564]: 2026-02-20 08:31:42.596164267 +0000 UTC m=+0.183800847 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:31:42 np0005625204.localdomain podman[90567]: 2026-02-20 08:31:42.602858193 +0000 UTC m=+0.181289299 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:31:42 np0005625204.localdomain podman[90567]: 2026-02-20 08:31:42.65800604 +0000 UTC m=+0.236437146 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:31:42 np0005625204.localdomain podman[90565]: 2026-02-20 08:31:42.673389804 +0000 UTC m=+0.256772113 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:31:42 np0005625204.localdomain podman[90566]: 2026-02-20 08:31:42.687719525 +0000 UTC m=+0.269027970 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64)
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:31:42 np0005625204.localdomain podman[90565]: 2026-02-20 08:31:42.711113524 +0000 UTC m=+0.294495813 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 
17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1)
Feb 20 08:31:42 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:31:43 np0005625204.localdomain systemd[1]: tmp-crun.Gvb4hb.mount: Deactivated successfully.
Feb 20 08:31:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:31:46 np0005625204.localdomain systemd[1]: tmp-crun.n31or9.mount: Deactivated successfully.
Feb 20 08:31:46 np0005625204.localdomain podman[90659]: 2026-02-20 08:31:46.155233434 +0000 UTC m=+0.085191393 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:31:46 np0005625204.localdomain podman[90659]: 2026-02-20 08:31:46.513459316 +0000 UTC m=+0.443417315 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:31:46 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:31:47 np0005625204.localdomain sudo[90681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:31:47 np0005625204.localdomain sudo[90681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:31:47 np0005625204.localdomain sudo[90681]: pam_unix(sudo:session): session closed for user root
Feb 20 08:31:47 np0005625204.localdomain sudo[90696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:31:47 np0005625204.localdomain sudo[90696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:31:48 np0005625204.localdomain sudo[90696]: pam_unix(sudo:session): session closed for user root
Feb 20 08:31:49 np0005625204.localdomain sudo[90743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:31:49 np0005625204.localdomain sudo[90743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:31:49 np0005625204.localdomain sudo[90743]: pam_unix(sudo:session): session closed for user root
Feb 20 08:32:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:32:04 np0005625204.localdomain systemd[1]: tmp-crun.nRlWSG.mount: Deactivated successfully.
Feb 20 08:32:04 np0005625204.localdomain podman[90781]: 2026-02-20 08:32:04.167803395 +0000 UTC m=+0.095815339 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 08:32:04 np0005625204.localdomain podman[90781]: 2026-02-20 08:32:04.366582712 +0000 UTC m=+0.294594676 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1)
Feb 20 08:32:04 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:32:05 np0005625204.localdomain sshd[90812]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:32:06 np0005625204.localdomain recover_tripleo_nova_virtqemud[90834]: 63005
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: tmp-crun.8evlXO.mount: Deactivated successfully.
Feb 20 08:32:06 np0005625204.localdomain podman[90814]: 2026-02-20 08:32:06.168428807 +0000 UTC m=+0.102394963 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4)
Feb 20 08:32:06 np0005625204.localdomain podman[90817]: 2026-02-20 08:32:06.16823964 +0000 UTC m=+0.093275291 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:32:06 np0005625204.localdomain podman[90816]: 2026-02-20 08:32:06.255824636 +0000 UTC m=+0.183609252 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, tcib_managed=true, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:32:06 np0005625204.localdomain podman[90816]: 2026-02-20 08:32:06.266937708 +0000 UTC m=+0.194722304 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:32:06 np0005625204.localdomain podman[90815]: 2026-02-20 08:32:06.309130756 +0000 UTC m=+0.241621817 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:32:06 np0005625204.localdomain podman[90814]: 2026-02-20 08:32:06.321671681 +0000 UTC m=+0.255637897 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:32:06 np0005625204.localdomain podman[90815]: 2026-02-20 08:32:06.344211045 +0000 UTC m=+0.276702126 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=logrotate_crond, 
maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, release=1766032510)
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:32:06 np0005625204.localdomain podman[90817]: 2026-02-20 08:32:06.403019515 +0000 UTC m=+0.328055226 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:32:06 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:32:06 np0005625204.localdomain sshd[90812]: Received disconnect from 77.232.138.190 port 44264:11: Bye Bye [preauth]
Feb 20 08:32:06 np0005625204.localdomain sshd[90812]: Disconnected from authenticating user root 77.232.138.190 port 44264 [preauth]
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:32:13 np0005625204.localdomain podman[90908]: 2026-02-20 08:32:13.149170349 +0000 UTC m=+0.076948618 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=)
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: tmp-crun.F66HKU.mount: Deactivated successfully.
Feb 20 08:32:13 np0005625204.localdomain podman[90908]: 2026-02-20 08:32:13.216042447 +0000 UTC m=+0.143820736 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, vcs-type=git, 
com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=ovn_metadata_agent, distribution-scope=public)
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:32:13 np0005625204.localdomain podman[90909]: 2026-02-20 08:32:13.269461121 +0000 UTC m=+0.190987008 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:32:13 np0005625204.localdomain podman[90907]: 2026-02-20 08:32:13.218442321 +0000 UTC m=+0.146735426 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:32:13 np0005625204.localdomain podman[90906]: 2026-02-20 08:32:13.321457681 +0000 UTC m=+0.250162958 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:32:13 np0005625204.localdomain podman[90907]: 2026-02-20 08:32:13.358126689 +0000 UTC m=+0.286419794 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:32:13 np0005625204.localdomain podman[90906]: 2026-02-20 08:32:13.373998778 +0000 UTC m=+0.302704105 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, version=17.1.13, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com)
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:32:13 np0005625204.localdomain podman[90909]: 2026-02-20 08:32:13.426584236 +0000 UTC m=+0.348110133 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:32:13 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:32:13 np0005625204.localdomain sshd[90997]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:32:14 np0005625204.localdomain sshd[90997]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:32:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:32:17 np0005625204.localdomain podman[90999]: 2026-02-20 08:32:17.141051993 +0000 UTC m=+0.079421244 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:32:17 np0005625204.localdomain podman[90999]: 2026-02-20 08:32:17.513792803 +0000 UTC m=+0.452162084 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4)
Feb 20 08:32:17 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:32:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:32:35 np0005625204.localdomain systemd[1]: tmp-crun.lF7ev8.mount: Deactivated successfully.
Feb 20 08:32:35 np0005625204.localdomain podman[91022]: 2026-02-20 08:32:35.154753253 +0000 UTC m=+0.096502381 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, batch=17.1_20260112.1, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Feb 20 08:32:35 np0005625204.localdomain podman[91022]: 2026-02-20 08:32:35.350091283 +0000 UTC m=+0.291840401 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat 
OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:32:35 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: tmp-crun.gyw7w9.mount: Deactivated successfully.
Feb 20 08:32:37 np0005625204.localdomain podman[91053]: 2026-02-20 08:32:37.166883827 +0000 UTC m=+0.099433611 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, version=17.1.13)
Feb 20 08:32:37 np0005625204.localdomain podman[91052]: 2026-02-20 08:32:37.218529816 +0000 UTC m=+0.148182891 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, 
description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond)
Feb 20 08:32:37 np0005625204.localdomain podman[91052]: 2026-02-20 08:32:37.226841452 +0000 UTC m=+0.156494537 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, 
release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:32:37 np0005625204.localdomain podman[91054]: 2026-02-20 08:32:37.269904168 +0000 UTC m=+0.198452518 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, tcib_managed=true, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:32:37 np0005625204.localdomain podman[91053]: 2026-02-20 08:32:37.281872126 +0000 UTC m=+0.214421990 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:32:37 np0005625204.localdomain podman[91054]: 2026-02-20 08:32:37.327122478 +0000 UTC m=+0.255670788 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5)
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:32:37 np0005625204.localdomain sshd[91121]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:32:37 np0005625204.localdomain podman[91051]: 2026-02-20 08:32:37.378292962 +0000 UTC m=+0.310108982 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:32:37 np0005625204.localdomain podman[91051]: 2026-02-20 08:32:37.439173166 +0000 UTC m=+0.370989196 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 20 08:32:37 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:32:38 np0005625204.localdomain sshd[91121]: Received disconnect from 152.32.189.21 port 53394:11: Bye Bye [preauth]
Feb 20 08:32:38 np0005625204.localdomain sshd[91121]: Disconnected from authenticating user root 152.32.189.21 port 53394 [preauth]
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: tmp-crun.aBNPpw.mount: Deactivated successfully.
Feb 20 08:32:44 np0005625204.localdomain podman[91139]: 2026-02-20 08:32:44.205876875 +0000 UTC m=+0.133402727 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 20 08:32:44 np0005625204.localdomain podman[91138]: 2026-02-20 08:32:44.176016925 +0000 UTC m=+0.106186088 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510)
Feb 20 08:32:44 np0005625204.localdomain podman[91138]: 2026-02-20 08:32:44.262019232 +0000 UTC m=+0.192188345 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1)
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:32:44 np0005625204.localdomain podman[91145]: 2026-02-20 08:32:44.273917978 +0000 UTC m=+0.197866510 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:32:44 np0005625204.localdomain podman[91145]: 2026-02-20 08:32:44.307057858 +0000 UTC m=+0.231006430 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:32:44 np0005625204.localdomain podman[91137]: 2026-02-20 08:32:44.316416335 +0000 UTC m=+0.251642364 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., vcs-type=git)
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:32:44 np0005625204.localdomain podman[91139]: 2026-02-20 08:32:44.335182973 +0000 UTC m=+0.262708825 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:32:44 np0005625204.localdomain podman[91137]: 2026-02-20 08:32:44.369152158 +0000 UTC m=+0.304378187 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:32:44 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:32:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:32:48 np0005625204.localdomain systemd[1]: tmp-crun.njZnIe.mount: Deactivated successfully.
Feb 20 08:32:48 np0005625204.localdomain podman[91227]: 2026-02-20 08:32:48.144566671 +0000 UTC m=+0.088985980 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64)
Feb 20 08:32:48 np0005625204.localdomain podman[91227]: 2026-02-20 08:32:48.517086164 +0000 UTC m=+0.461505413 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, 
batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:32:48 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:32:49 np0005625204.localdomain sudo[91250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:32:49 np0005625204.localdomain sudo[91250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:32:49 np0005625204.localdomain sudo[91250]: pam_unix(sudo:session): session closed for user root
Feb 20 08:32:49 np0005625204.localdomain sudo[91265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 08:32:49 np0005625204.localdomain sudo[91265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:32:50 np0005625204.localdomain podman[91350]: 2026-02-20 08:32:50.285358295 +0000 UTC m=+0.095870882 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_BRANCH=main, io.buildah.version=1.42.2, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc.)
Feb 20 08:32:50 np0005625204.localdomain podman[91350]: 2026-02-20 08:32:50.418255344 +0000 UTC m=+0.228767931 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, vcs-type=git, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 20 08:32:50 np0005625204.localdomain sudo[91265]: pam_unix(sudo:session): session closed for user root
Feb 20 08:32:50 np0005625204.localdomain sudo[91415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:32:50 np0005625204.localdomain sudo[91415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:32:50 np0005625204.localdomain sudo[91415]: pam_unix(sudo:session): session closed for user root
Feb 20 08:32:50 np0005625204.localdomain sudo[91430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:32:50 np0005625204.localdomain sudo[91430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:32:51 np0005625204.localdomain sudo[91430]: pam_unix(sudo:session): session closed for user root
Feb 20 08:32:52 np0005625204.localdomain sudo[91476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:32:52 np0005625204.localdomain sudo[91476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:32:52 np0005625204.localdomain sudo[91476]: pam_unix(sudo:session): session closed for user root
Feb 20 08:32:58 np0005625204.localdomain sshd[91491]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:32:59 np0005625204.localdomain sshd[91491]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:33:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:33:06 np0005625204.localdomain podman[91516]: 2026-02-20 08:33:06.160441225 +0000 UTC m=+0.094430527 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, version=17.1.13)
Feb 20 08:33:06 np0005625204.localdomain podman[91516]: 2026-02-20 08:33:06.371133938 +0000 UTC m=+0.305123200 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, 
io.buildah.version=1.41.5, container_name=metrics_qdr, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:33:06 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:33:08 np0005625204.localdomain podman[91547]: 2026-02-20 08:33:08.156717721 +0000 UTC m=+0.089157074 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:33:08 np0005625204.localdomain podman[91547]: 2026-02-20 08:33:08.191323267 +0000 UTC m=+0.123762670 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:33:08 np0005625204.localdomain podman[91546]: 2026-02-20 08:33:08.210511616 +0000 UTC m=+0.145661933 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 20 08:33:08 np0005625204.localdomain podman[91546]: 2026-02-20 08:33:08.216595664 +0000 UTC m=+0.151745991 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond)
Feb 20 08:33:08 np0005625204.localdomain podman[91548]: 2026-02-20 08:33:08.176742777 +0000 UTC m=+0.102413441 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:33:08 np0005625204.localdomain podman[91548]: 2026-02-20 08:33:08.256902104 +0000 UTC m=+0.182572718 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, 
container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:33:08 np0005625204.localdomain podman[91545]: 2026-02-20 08:33:08.3097378 +0000 UTC m=+0.244561307 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=)
Feb 20 08:33:08 np0005625204.localdomain podman[91545]: 2026-02-20 08:33:08.341050574 +0000 UTC m=+0.275874101 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, 
com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:33:08 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: tmp-crun.NbN9Yy.mount: Deactivated successfully.
Feb 20 08:33:15 np0005625204.localdomain podman[91639]: 2026-02-20 08:33:15.218615522 +0000 UTC m=+0.145729696 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, batch=17.1_20260112.1, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:33:15 np0005625204.localdomain podman[91639]: 2026-02-20 08:33:15.250935637 +0000 UTC m=+0.178049791 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5)
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:33:15 np0005625204.localdomain podman[91638]: 2026-02-20 08:33:15.265534026 +0000 UTC m=+0.196819937 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, tcib_managed=true, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:33:15 np0005625204.localdomain podman[91637]: 2026-02-20 08:33:15.183262093 +0000 UTC m=+0.114376189 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, tcib_managed=true, 
batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:33:15 np0005625204.localdomain podman[91637]: 2026-02-20 08:33:15.317910587 +0000 UTC m=+0.249024633 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:33:15 np0005625204.localdomain podman[91636]: 2026-02-20 08:33:15.374665194 +0000 UTC m=+0.306848084 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, release=1766032510)
Feb 20 08:33:15 np0005625204.localdomain podman[91638]: 2026-02-20 08:33:15.389571682 +0000 UTC m=+0.320857623 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:33:15 np0005625204.localdomain podman[91636]: 2026-02-20 08:33:15.428106738 +0000 UTC m=+0.360289628 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-type=git, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_controller)
Feb 20 08:33:15 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:33:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:33:19 np0005625204.localdomain podman[91731]: 2026-02-20 08:33:19.139961806 +0000 UTC m=+0.079632482 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:33:19 np0005625204.localdomain podman[91731]: 2026-02-20 08:33:19.522465605 +0000 UTC m=+0.462136251 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:33:19 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:33:23 np0005625204.localdomain sshd[91753]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:33:24 np0005625204.localdomain sshd[91753]: Invalid user oracle from 178.217.173.50 port 45294
Feb 20 08:33:24 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:33:24 np0005625204.localdomain recover_tripleo_nova_virtqemud[91756]: 63005
Feb 20 08:33:24 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:33:24 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:33:24 np0005625204.localdomain sshd[91753]: Received disconnect from 178.217.173.50 port 45294:11: Bye Bye [preauth]
Feb 20 08:33:24 np0005625204.localdomain sshd[91753]: Disconnected from invalid user oracle 178.217.173.50 port 45294 [preauth]
Feb 20 08:33:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:33:37 np0005625204.localdomain podman[91757]: 2026-02-20 08:33:37.128774586 +0000 UTC m=+0.073317257 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=)
Feb 20 08:33:37 np0005625204.localdomain podman[91757]: 2026-02-20 08:33:37.369784162 +0000 UTC m=+0.314326853 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:33:37 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: tmp-crun.j607v8.mount: Deactivated successfully.
Feb 20 08:33:39 np0005625204.localdomain podman[91787]: 2026-02-20 08:33:39.156599654 +0000 UTC m=+0.090539296 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public)
Feb 20 08:33:39 np0005625204.localdomain podman[91788]: 2026-02-20 08:33:39.208174451 +0000 UTC m=+0.133623692 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, vendor=Red Hat, Inc.)
Feb 20 08:33:39 np0005625204.localdomain podman[91788]: 2026-02-20 08:33:39.218404916 +0000 UTC m=+0.143854177 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, 
managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:33:39 np0005625204.localdomain podman[91787]: 2026-02-20 08:33:39.271230001 +0000 UTC m=+0.205169673 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-cron, 
cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, vendor=Red Hat, Inc.)
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:33:39 np0005625204.localdomain podman[91786]: 2026-02-20 08:33:39.316916347 +0000 UTC m=+0.253087128 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 20 08:33:39 np0005625204.localdomain podman[91790]: 2026-02-20 08:33:39.275460201 +0000 UTC m=+0.199649924 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.buildah.version=1.41.5)
Feb 20 08:33:39 np0005625204.localdomain podman[91790]: 2026-02-20 08:33:39.358261849 +0000 UTC m=+0.282451532 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:33:39 np0005625204.localdomain podman[91786]: 2026-02-20 08:33:39.375094497 +0000 UTC m=+0.311265258 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 20 08:33:39 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:33:40 np0005625204.localdomain systemd[1]: tmp-crun.zJ7Cw5.mount: Deactivated successfully.
Feb 20 08:33:44 np0005625204.localdomain sshd[91877]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:33:45 np0005625204.localdomain sshd[91877]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: tmp-crun.zxpavL.mount: Deactivated successfully.
Feb 20 08:33:46 np0005625204.localdomain podman[91879]: 2026-02-20 08:33:46.174591774 +0000 UTC m=+0.101276417 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: tmp-crun.kB9T6H.mount: Deactivated successfully.
Feb 20 08:33:46 np0005625204.localdomain podman[91880]: 2026-02-20 08:33:46.230843475 +0000 UTC m=+0.157499067 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 20 08:33:46 np0005625204.localdomain podman[91881]: 2026-02-20 08:33:46.271670401 +0000 UTC m=+0.197252341 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=)
Feb 20 08:33:46 np0005625204.localdomain podman[91880]: 2026-02-20 08:33:46.277285164 +0000 UTC m=+0.203940766 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:33:46 np0005625204.localdomain podman[91881]: 2026-02-20 08:33:46.322604888 +0000 UTC m=+0.248186828 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510)
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:33:46 np0005625204.localdomain podman[91882]: 2026-02-20 08:33:46.337979322 +0000 UTC m=+0.257957049 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, tcib_managed=true)
Feb 20 08:33:46 np0005625204.localdomain podman[91879]: 2026-02-20 08:33:46.3639262 +0000 UTC m=+0.290610843 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, summary=Red 
Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, version=17.1.13)
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:33:46 np0005625204.localdomain podman[91882]: 2026-02-20 08:33:46.393266803 +0000 UTC m=+0.313244560 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc.)
Feb 20 08:33:46 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:33:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:33:50 np0005625204.localdomain podman[91970]: 2026-02-20 08:33:50.148491294 +0000 UTC m=+0.083496760 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vendor=Red Hat, Inc., container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-type=git)
Feb 20 08:33:50 np0005625204.localdomain podman[91970]: 2026-02-20 08:33:50.550336789 +0000 UTC m=+0.485342235 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=nova_migration_target, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:33:50 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:33:52 np0005625204.localdomain sudo[91993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:33:52 np0005625204.localdomain sudo[91993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:33:52 np0005625204.localdomain sudo[91993]: pam_unix(sudo:session): session closed for user root
Feb 20 08:33:52 np0005625204.localdomain sudo[92008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:33:52 np0005625204.localdomain sudo[92008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:33:53 np0005625204.localdomain sudo[92008]: pam_unix(sudo:session): session closed for user root
Feb 20 08:33:53 np0005625204.localdomain sudo[92055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:33:53 np0005625204.localdomain sudo[92055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:33:53 np0005625204.localdomain sudo[92055]: pam_unix(sudo:session): session closed for user root
Feb 20 08:34:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:34:08 np0005625204.localdomain podman[92093]: 2026-02-20 08:34:08.183549069 +0000 UTC m=+0.117869558 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:34:08 np0005625204.localdomain podman[92093]: 2026-02-20 08:34:08.352053244 +0000 UTC m=+0.286373743 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:34:08 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: tmp-crun.Uhh0EO.mount: Deactivated successfully.
Feb 20 08:34:10 np0005625204.localdomain podman[92123]: 2026-02-20 08:34:10.165958959 +0000 UTC m=+0.096027266 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true)
Feb 20 08:34:10 np0005625204.localdomain podman[92122]: 2026-02-20 08:34:10.142675903 +0000 UTC m=+0.079284970 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, architecture=x86_64, summary=Red 
Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., tcib_managed=true)
Feb 20 08:34:10 np0005625204.localdomain podman[92121]: 2026-02-20 08:34:10.19719955 +0000 UTC m=+0.135990305 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z)
Feb 20 08:34:10 np0005625204.localdomain podman[92123]: 2026-02-20 08:34:10.223695986 +0000 UTC m=+0.153764323 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, version=17.1.13)
Feb 20 08:34:10 np0005625204.localdomain podman[92121]: 2026-02-20 08:34:10.233064394 +0000 UTC m=+0.171855189 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 20 08:34:10 np0005625204.localdomain podman[92129]: 2026-02-20 08:34:10.178824045 +0000 UTC m=+0.102957269 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:34:10 np0005625204.localdomain podman[92122]: 2026-02-20 08:34:10.276431239 +0000 UTC m=+0.213040376 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:34:10 np0005625204.localdomain podman[92129]: 2026-02-20 08:34:10.318240125 +0000 UTC m=+0.242373399 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible)
Feb 20 08:34:10 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:34:17 np0005625204.localdomain podman[92208]: 2026-02-20 08:34:17.142708049 +0000 UTC m=+0.082488349 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, batch=17.1_20260112.1)
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: tmp-crun.YmfCqC.mount: Deactivated successfully.
Feb 20 08:34:17 np0005625204.localdomain podman[92210]: 2026-02-20 08:34:17.161522298 +0000 UTC m=+0.092515788 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 20 08:34:17 np0005625204.localdomain podman[92208]: 2026-02-20 08:34:17.203137008 +0000 UTC m=+0.142917288 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:34:17 np0005625204.localdomain podman[92209]: 2026-02-20 08:34:17.203717427 +0000 UTC m=+0.137942727 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., container_name=iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64)
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:34:17 np0005625204.localdomain podman[92210]: 2026-02-20 08:34:17.226063924 +0000 UTC m=+0.157057404 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13)
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:34:17 np0005625204.localdomain podman[92216]: 2026-02-20 08:34:17.316289281 +0000 UTC m=+0.244169825 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step5, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, version=17.1.13, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git)
Feb 20 08:34:17 np0005625204.localdomain podman[92209]: 2026-02-20 08:34:17.338527765 +0000 UTC m=+0.272753105 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:34:17 np0005625204.localdomain podman[92216]: 2026-02-20 08:34:17.403800483 +0000 UTC m=+0.331680997 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:34:17 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:34:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:34:21 np0005625204.localdomain podman[92301]: 2026-02-20 08:34:21.136990596 +0000 UTC m=+0.075619918 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 20 08:34:21 np0005625204.localdomain podman[92301]: 2026-02-20 08:34:21.520273269 +0000 UTC m=+0.458902571 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:34:21 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:34:30 np0005625204.localdomain sshd[92324]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:34:30 np0005625204.localdomain sshd[92324]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:34:35 np0005625204.localdomain sshd[92326]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:34:36 np0005625204.localdomain sshd[92326]: Invalid user oracle from 83.235.16.111 port 60372
Feb 20 08:34:36 np0005625204.localdomain sshd[92326]: Received disconnect from 83.235.16.111 port 60372:11: Bye Bye [preauth]
Feb 20 08:34:36 np0005625204.localdomain sshd[92326]: Disconnected from invalid user oracle 83.235.16.111 port 60372 [preauth]
Feb 20 08:34:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:34:39 np0005625204.localdomain systemd[1]: tmp-crun.sVaaUm.mount: Deactivated successfully.
Feb 20 08:34:39 np0005625204.localdomain podman[92328]: 2026-02-20 08:34:39.154975706 +0000 UTC m=+0.089294959 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:34:39 np0005625204.localdomain podman[92328]: 2026-02-20 08:34:39.354470136 +0000 UTC m=+0.288789369 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1)
Feb 20 08:34:39 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:34:41 np0005625204.localdomain podman[92356]: 2026-02-20 08:34:41.162774039 +0000 UTC m=+0.098591935 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, 
tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, build-date=2026-01-12T23:07:30Z, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com)
Feb 20 08:34:41 np0005625204.localdomain podman[92356]: 2026-02-20 08:34:41.195153806 +0000 UTC m=+0.130971702 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, distribution-scope=public, 
managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: tmp-crun.88M1hr.mount: Deactivated successfully.
Feb 20 08:34:41 np0005625204.localdomain podman[92357]: 2026-02-20 08:34:41.256774972 +0000 UTC m=+0.190202424 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, version=17.1.13, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-cron-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z)
Feb 20 08:34:41 np0005625204.localdomain podman[92359]: 2026-02-20 08:34:41.235489166 +0000 UTC m=+0.163757289 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git)
Feb 20 08:34:41 np0005625204.localdomain podman[92357]: 2026-02-20 08:34:41.290148909 +0000 UTC m=+0.223576341 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:34:41 np0005625204.localdomain podman[92359]: 2026-02-20 08:34:41.320247904 +0000 UTC m=+0.248515937 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:34:41 np0005625204.localdomain podman[92358]: 2026-02-20 08:34:41.375909417 +0000 UTC m=+0.305616795 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible)
Feb 20 08:34:41 np0005625204.localdomain podman[92358]: 2026-02-20 08:34:41.415102843 +0000 UTC m=+0.344810181 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, 
release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:34:41 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: tmp-crun.dse0Vf.mount: Deactivated successfully.
Feb 20 08:34:48 np0005625204.localdomain podman[92448]: 2026-02-20 08:34:48.18957127 +0000 UTC m=+0.129317420 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:34:48 np0005625204.localdomain podman[92449]: 2026-02-20 08:34:48.145803294 +0000 UTC m=+0.085845073 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 20 08:34:48 np0005625204.localdomain podman[92448]: 2026-02-20 08:34:48.234964798 +0000 UTC m=+0.174710978 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, 
io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13)
Feb 20 08:34:48 np0005625204.localdomain podman[92450]: 2026-02-20 08:34:48.24188439 +0000 UTC m=+0.174058817 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:34:48 np0005625204.localdomain podman[92460]: 2026-02-20 08:34:48.16843465 +0000 UTC m=+0.092969761 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step5, managed_by=tripleo_ansible, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, vcs-type=git)
Feb 20 08:34:48 np0005625204.localdomain podman[92449]: 2026-02-20 08:34:48.283058028 +0000 UTC m=+0.223099857 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64)
Feb 20 08:34:48 np0005625204.localdomain podman[92450]: 2026-02-20 08:34:48.291519928 +0000 UTC m=+0.223694415 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=)
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:34:48 np0005625204.localdomain podman[92460]: 2026-02-20 08:34:48.299357499 +0000 UTC m=+0.223892630 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vcs-type=git, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64)
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:34:48 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:34:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:34:52 np0005625204.localdomain podman[92541]: 2026-02-20 08:34:52.112381159 +0000 UTC m=+0.057497171 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:34:52 np0005625204.localdomain podman[92541]: 2026-02-20 08:34:52.512107848 +0000 UTC m=+0.457223920 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z)
Feb 20 08:34:52 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:34:53 np0005625204.localdomain sudo[92565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:34:53 np0005625204.localdomain sudo[92565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:34:53 np0005625204.localdomain sudo[92565]: pam_unix(sudo:session): session closed for user root
Feb 20 08:34:53 np0005625204.localdomain sudo[92580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:34:53 np0005625204.localdomain sudo[92580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:34:54 np0005625204.localdomain sudo[92580]: pam_unix(sudo:session): session closed for user root
Feb 20 08:34:55 np0005625204.localdomain sudo[92628]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:34:55 np0005625204.localdomain sudo[92628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:34:55 np0005625204.localdomain sudo[92628]: pam_unix(sudo:session): session closed for user root
Feb 20 08:35:06 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:35:06 np0005625204.localdomain recover_tripleo_nova_virtqemud[92644]: 63005
Feb 20 08:35:06 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:35:06 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:35:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:35:10 np0005625204.localdomain podman[92645]: 2026-02-20 08:35:10.157892876 +0000 UTC m=+0.090607709 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 20 08:35:10 np0005625204.localdomain podman[92645]: 2026-02-20 08:35:10.341923869 +0000 UTC m=+0.274638672 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=)
Feb 20 08:35:10 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:35:12 np0005625204.localdomain podman[92681]: 2026-02-20 08:35:12.157447224 +0000 UTC m=+0.082007204 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: tmp-crun.whPCoY.mount: Deactivated successfully.
Feb 20 08:35:12 np0005625204.localdomain podman[92682]: 2026-02-20 08:35:12.224811237 +0000 UTC m=+0.146130888 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, 
version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:35:12 np0005625204.localdomain podman[92674]: 2026-02-20 08:35:12.19827652 +0000 UTC m=+0.135093598 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:35:12 np0005625204.localdomain podman[92675]: 2026-02-20 08:35:12.259878256 +0000 UTC m=+0.188866393 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:35:12 np0005625204.localdomain podman[92675]: 2026-02-20 08:35:12.266921662 +0000 UTC m=+0.195909729 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510)
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:35:12 np0005625204.localdomain podman[92682]: 2026-02-20 08:35:12.278064535 +0000 UTC m=+0.199384186 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:35:12 np0005625204.localdomain podman[92681]: 2026-02-20 08:35:12.276353103 +0000 UTC m=+0.200913073 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5)
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:35:12 np0005625204.localdomain podman[92674]: 2026-02-20 08:35:12.382950963 +0000 UTC m=+0.319768071 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:35:12 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:35:13 np0005625204.localdomain sshd[92765]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:35:14 np0005625204.localdomain sshd[92765]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:35:15 np0005625204.localdomain sshd[92767]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:35:15 np0005625204.localdomain sshd[92767]: Invalid user common from 77.232.138.190 port 37672
Feb 20 08:35:15 np0005625204.localdomain sshd[92767]: Received disconnect from 77.232.138.190 port 37672:11: Bye Bye [preauth]
Feb 20 08:35:15 np0005625204.localdomain sshd[92767]: Disconnected from invalid user common 77.232.138.190 port 37672 [preauth]
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: tmp-crun.fptsZk.mount: Deactivated successfully.
Feb 20 08:35:19 np0005625204.localdomain podman[92771]: 2026-02-20 08:35:19.163462665 +0000 UTC m=+0.097553423 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:35:19 np0005625204.localdomain podman[92769]: 2026-02-20 08:35:19.197825922 +0000 UTC m=+0.132887659 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com)
Feb 20 08:35:19 np0005625204.localdomain podman[92769]: 2026-02-20 08:35:19.221412919 +0000 UTC m=+0.156474656 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, release=1766032510, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:35:19 np0005625204.localdomain podman[92772]: 2026-02-20 08:35:19.298758998 +0000 UTC m=+0.227666776 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, distribution-scope=public)
Feb 20 08:35:19 np0005625204.localdomain podman[92771]: 2026-02-20 08:35:19.323243272 +0000 UTC m=+0.257333980 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:35:19 np0005625204.localdomain podman[92770]: 2026-02-20 08:35:19.409215817 +0000 UTC m=+0.340142068 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 20 08:35:19 np0005625204.localdomain podman[92770]: 2026-02-20 08:35:19.420915667 +0000 UTC m=+0.351841898 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 
iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, build-date=2026-01-12T22:34:43Z)
Feb 20 08:35:19 np0005625204.localdomain podman[92772]: 2026-02-20 08:35:19.428331316 +0000 UTC m=+0.357239154 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5)
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:35:19 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:35:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:35:23 np0005625204.localdomain podman[92859]: 2026-02-20 08:35:23.129850443 +0000 UTC m=+0.069003134 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, container_name=nova_migration_target, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510)
Feb 20 08:35:23 np0005625204.localdomain podman[92859]: 2026-02-20 08:35:23.532227414 +0000 UTC m=+0.471380135 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, version=17.1.13)
Feb 20 08:35:23 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:35:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:35:41 np0005625204.localdomain systemd[1]: tmp-crun.EVh6S3.mount: Deactivated successfully.
Feb 20 08:35:41 np0005625204.localdomain podman[92882]: 2026-02-20 08:35:41.163987563 +0000 UTC m=+0.096956874 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Feb 20 08:35:41 np0005625204.localdomain podman[92882]: 2026-02-20 08:35:41.357005432 +0000 UTC m=+0.289974693 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, container_name=metrics_qdr, io.buildah.version=1.41.5)
Feb 20 08:35:41 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:35:43 np0005625204.localdomain podman[92911]: 2026-02-20 08:35:43.165701948 +0000 UTC m=+0.099600696 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi)
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: tmp-crun.FE5lAj.mount: Deactivated successfully.
Feb 20 08:35:43 np0005625204.localdomain podman[92912]: 2026-02-20 08:35:43.229281865 +0000 UTC m=+0.161277644 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, version=17.1.13, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team)
Feb 20 08:35:43 np0005625204.localdomain podman[92912]: 2026-02-20 08:35:43.268200443 +0000 UTC m=+0.200196202 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:35:43 np0005625204.localdomain podman[92913]: 2026-02-20 08:35:43.322880945 +0000 UTC m=+0.251343775 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, container_name=collectd, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 20 08:35:43 np0005625204.localdomain podman[92913]: 2026-02-20 08:35:43.337055971 +0000 UTC m=+0.265518841 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, distribution-scope=public)
Feb 20 08:35:43 np0005625204.localdomain podman[92911]: 2026-02-20 08:35:43.345669866 +0000 UTC m=+0.279568644 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, distribution-scope=public, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510)
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:35:43 np0005625204.localdomain podman[92914]: 2026-02-20 08:35:43.273751464 +0000 UTC m=+0.199133810 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:35:43 np0005625204.localdomain podman[92914]: 2026-02-20 08:35:43.411188183 +0000 UTC m=+0.336570489 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:35:43 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:35:50 np0005625204.localdomain podman[93002]: 2026-02-20 08:35:50.14549376 +0000 UTC m=+0.079148366 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:35:50 np0005625204.localdomain podman[93002]: 2026-02-20 08:35:50.190160654 +0000 UTC m=+0.123815270 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:35:50 np0005625204.localdomain podman[93001]: 2026-02-20 08:35:50.190208306 +0000 UTC m=+0.124143411 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:35:50 np0005625204.localdomain podman[93000]: 2026-02-20 08:35:50.258853318 +0000 UTC m=+0.196501097 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git)
Feb 20 08:35:50 np0005625204.localdomain podman[93001]: 2026-02-20 08:35:50.275102479 +0000 UTC m=+0.209037584 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510)
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:35:50 np0005625204.localdomain podman[93003]: 2026-02-20 08:35:50.359773914 +0000 UTC m=+0.288022584 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:35:50 np0005625204.localdomain podman[93000]: 2026-02-20 08:35:50.380734268 +0000 UTC m=+0.318382057 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:35:50 np0005625204.localdomain podman[93003]: 2026-02-20 08:35:50.392072978 +0000 UTC m=+0.320321658 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true)
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:35:50 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:35:51 np0005625204.localdomain systemd[1]: tmp-crun.t9BRVD.mount: Deactivated successfully.
Feb 20 08:35:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:35:54 np0005625204.localdomain podman[93087]: 2026-02-20 08:35:54.12624439 +0000 UTC m=+0.071743648 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510)
Feb 20 08:35:54 np0005625204.localdomain podman[93087]: 2026-02-20 08:35:54.527182778 +0000 UTC m=+0.472682046 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:35:54 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:35:55 np0005625204.localdomain sudo[93110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:35:55 np0005625204.localdomain sudo[93110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:35:55 np0005625204.localdomain sudo[93110]: pam_unix(sudo:session): session closed for user root
Feb 20 08:35:55 np0005625204.localdomain sudo[93125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:35:55 np0005625204.localdomain sudo[93125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:35:56 np0005625204.localdomain sudo[93125]: pam_unix(sudo:session): session closed for user root
Feb 20 08:35:56 np0005625204.localdomain sudo[93171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:35:56 np0005625204.localdomain sudo[93171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:35:56 np0005625204.localdomain sudo[93171]: pam_unix(sudo:session): session closed for user root
Feb 20 08:35:57 np0005625204.localdomain sshd[93186]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:35:57 np0005625204.localdomain sshd[93186]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:35:58 np0005625204.localdomain sshd[93188]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:35:59 np0005625204.localdomain sshd[93190]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:35:59 np0005625204.localdomain sshd[93188]: Invalid user n8n from 152.32.189.21 port 51106
Feb 20 08:36:00 np0005625204.localdomain sshd[93188]: Received disconnect from 152.32.189.21 port 51106:11: Bye Bye [preauth]
Feb 20 08:36:00 np0005625204.localdomain sshd[93188]: Disconnected from invalid user n8n 152.32.189.21 port 51106 [preauth]
Feb 20 08:36:01 np0005625204.localdomain sshd[93190]: Invalid user dixi from 103.157.25.4 port 37724
Feb 20 08:36:01 np0005625204.localdomain sshd[93190]: Received disconnect from 103.157.25.4 port 37724:11: Bye Bye [preauth]
Feb 20 08:36:01 np0005625204.localdomain sshd[93190]: Disconnected from invalid user dixi 103.157.25.4 port 37724 [preauth]
Feb 20 08:36:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:36:12 np0005625204.localdomain systemd[1]: tmp-crun.q3Qwka.mount: Deactivated successfully.
Feb 20 08:36:12 np0005625204.localdomain podman[93192]: 2026-02-20 08:36:12.15781451 +0000 UTC m=+0.095028525 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:36:12 np0005625204.localdomain podman[93192]: 2026-02-20 08:36:12.339849031 +0000 UTC m=+0.277063056 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 20 08:36:12 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: tmp-crun.dPVgkz.mount: Deactivated successfully.
Feb 20 08:36:14 np0005625204.localdomain podman[93222]: 2026-02-20 08:36:14.21431452 +0000 UTC m=+0.148351016 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: tmp-crun.zhzBXK.mount: Deactivated successfully.
Feb 20 08:36:14 np0005625204.localdomain podman[93223]: 2026-02-20 08:36:14.226587209 +0000 UTC m=+0.155567039 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:36:14 np0005625204.localdomain podman[93222]: 2026-02-20 08:36:14.273087399 +0000 UTC m=+0.207123895 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=)
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:36:14 np0005625204.localdomain podman[93223]: 2026-02-20 08:36:14.287174972 +0000 UTC m=+0.216154862 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=logrotate_crond, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:36:14 np0005625204.localdomain podman[93224]: 2026-02-20 08:36:14.275821883 +0000 UTC m=+0.201046048 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 20 08:36:14 np0005625204.localdomain podman[93224]: 2026-02-20 08:36:14.359270261 +0000 UTC m=+0.284494406 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, container_name=collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:36:14 np0005625204.localdomain podman[93225]: 2026-02-20 08:36:14.382734373 +0000 UTC m=+0.306375428 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:36:14 np0005625204.localdomain podman[93225]: 2026-02-20 08:36:14.439376856 +0000 UTC m=+0.363017901 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, 
build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:36:14 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:36:15 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:36:15 np0005625204.localdomain recover_tripleo_nova_virtqemud[93315]: 63005
Feb 20 08:36:15 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:36:15 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: tmp-crun.YyP2J4.mount: Deactivated successfully.
Feb 20 08:36:21 np0005625204.localdomain podman[93317]: 2026-02-20 08:36:21.169421395 +0000 UTC m=+0.097279854 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, tcib_managed=true, container_name=iscsid, release=1766032510, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3)
Feb 20 08:36:21 np0005625204.localdomain podman[93317]: 2026-02-20 08:36:21.207422495 +0000 UTC m=+0.135280904 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:36:21 np0005625204.localdomain podman[93319]: 2026-02-20 08:36:21.260609841 +0000 UTC m=+0.181709092 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, container_name=nova_compute, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:36:21 np0005625204.localdomain podman[93316]: 2026-02-20 08:36:21.225898373 +0000 UTC m=+0.155822436 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, 
architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:36:21 np0005625204.localdomain podman[93319]: 2026-02-20 08:36:21.289091728 +0000 UTC m=+0.210190969 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, config_id=tripleo_step5, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:36:21 np0005625204.localdomain podman[93316]: 2026-02-20 08:36:21.305597076 +0000 UTC m=+0.235521069 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public)
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:36:21 np0005625204.localdomain podman[93318]: 2026-02-20 08:36:21.322202006 +0000 UTC m=+0.244518275 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510)
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:36:21 np0005625204.localdomain podman[93318]: 2026-02-20 08:36:21.367890603 +0000 UTC m=+0.290206862 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:56:19Z, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64)
Feb 20 08:36:21 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:36:22 np0005625204.localdomain systemd[1]: tmp-crun.26zMlR.mount: Deactivated successfully.
Feb 20 08:36:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:36:25 np0005625204.localdomain podman[93412]: 2026-02-20 08:36:25.144193122 +0000 UTC m=+0.082290544 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:36:25 np0005625204.localdomain podman[93412]: 2026-02-20 08:36:25.516395965 +0000 UTC m=+0.454493387 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
batch=17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:36:25 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:36:40 np0005625204.localdomain sshd[93436]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:36:40 np0005625204.localdomain sshd[93436]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 08:36:40 np0005625204.localdomain sshd[93436]: Connection closed by 45.148.10.240 port 45538
Feb 20 08:36:41 np0005625204.localdomain sshd[93437]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:36:41 np0005625204.localdomain sshd[93437]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:36:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:36:43 np0005625204.localdomain systemd[1]: tmp-crun.th9NNV.mount: Deactivated successfully.
Feb 20 08:36:43 np0005625204.localdomain podman[93439]: 2026-02-20 08:36:43.283358373 +0000 UTC m=+0.107736297 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step1, release=1766032510, batch=17.1_20260112.1, vcs-type=git, version=17.1.13, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:36:43 np0005625204.localdomain podman[93439]: 2026-02-20 08:36:43.484180873 +0000 UTC m=+0.308558797 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Feb 20 08:36:43 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:36:45 np0005625204.localdomain podman[93472]: 2026-02-20 08:36:45.127572492 +0000 UTC m=+0.068114477 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64)
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: tmp-crun.oRLO6e.mount: Deactivated successfully.
Feb 20 08:36:45 np0005625204.localdomain podman[93469]: 2026-02-20 08:36:45.144711879 +0000 UTC m=+0.086553375 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.buildah.version=1.41.5)
Feb 20 08:36:45 np0005625204.localdomain podman[93472]: 2026-02-20 08:36:45.153151278 +0000 UTC m=+0.093693343 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc.)
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:36:45 np0005625204.localdomain podman[93470]: 2026-02-20 08:36:45.186531226 +0000 UTC m=+0.126226055 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, 
io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:36:45 np0005625204.localdomain podman[93470]: 2026-02-20 08:36:45.196887035 +0000 UTC m=+0.136581834 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:36:45 np0005625204.localdomain podman[93471]: 2026-02-20 08:36:45.233714848 +0000 UTC m=+0.172702996 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 20 08:36:45 np0005625204.localdomain podman[93471]: 2026-02-20 08:36:45.244806909 +0000 UTC m=+0.183795097 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:36:45 np0005625204.localdomain podman[93469]: 2026-02-20 08:36:45.298726948 +0000 UTC m=+0.240568494 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:36:45 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:36:50 np0005625204.localdomain sshd[93559]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:36:51 np0005625204.localdomain sshd[93559]: Invalid user vpn from 178.217.173.50 port 51830
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: tmp-crun.w4YXm1.mount: Deactivated successfully.
Feb 20 08:36:51 np0005625204.localdomain podman[93561]: 2026-02-20 08:36:51.578582807 +0000 UTC m=+0.074449903 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, container_name=ovn_controller, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:36:51 np0005625204.localdomain podman[93561]: 2026-02-20 08:36:51.607130175 +0000 UTC m=+0.102997281 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=)
Feb 20 08:36:51 np0005625204.localdomain podman[93564]: 2026-02-20 08:36:51.632202797 +0000 UTC m=+0.124377858 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:36:51 np0005625204.localdomain sshd[93559]: Received disconnect from 178.217.173.50 port 51830:11: Bye Bye [preauth]
Feb 20 08:36:51 np0005625204.localdomain sshd[93559]: Disconnected from invalid user vpn 178.217.173.50 port 51830 [preauth]
Feb 20 08:36:51 np0005625204.localdomain podman[93563]: 2026-02-20 08:36:51.682842535 +0000 UTC m=+0.175879413 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:36:51 np0005625204.localdomain podman[93564]: 2026-02-20 08:36:51.691981756 +0000 UTC m=+0.184156787 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, distribution-scope=public, tcib_managed=true, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:36:51 np0005625204.localdomain podman[93563]: 2026-02-20 08:36:51.733015299 +0000 UTC m=+0.226052157 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=)
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:36:51 np0005625204.localdomain podman[93562]: 2026-02-20 08:36:51.745510414 +0000 UTC m=+0.239041937 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, container_name=iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1)
Feb 20 08:36:51 np0005625204.localdomain podman[93562]: 2026-02-20 08:36:51.753142118 +0000 UTC m=+0.246673631 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:36:51 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:36:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:36:56 np0005625204.localdomain systemd[1]: tmp-crun.yoKNgr.mount: Deactivated successfully.
Feb 20 08:36:56 np0005625204.localdomain podman[93654]: 2026-02-20 08:36:56.147885059 +0000 UTC m=+0.088862836 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target)
Feb 20 08:36:56 np0005625204.localdomain podman[93654]: 2026-02-20 08:36:56.480972248 +0000 UTC m=+0.421949965 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, version=17.1.13, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true)
Feb 20 08:36:56 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:36:57 np0005625204.localdomain sudo[93679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:36:57 np0005625204.localdomain sudo[93679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:36:57 np0005625204.localdomain sudo[93679]: pam_unix(sudo:session): session closed for user root
Feb 20 08:36:57 np0005625204.localdomain sudo[93694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:36:57 np0005625204.localdomain sudo[93694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:36:57 np0005625204.localdomain sudo[93694]: pam_unix(sudo:session): session closed for user root
Feb 20 08:36:58 np0005625204.localdomain sudo[93740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:36:58 np0005625204.localdomain sudo[93740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:36:58 np0005625204.localdomain sudo[93740]: pam_unix(sudo:session): session closed for user root
Feb 20 08:37:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:37:14 np0005625204.localdomain podman[93755]: 2026-02-20 08:37:14.146336747 +0000 UTC m=+0.082752688 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, batch=17.1_20260112.1)
Feb 20 08:37:14 np0005625204.localdomain podman[93755]: 2026-02-20 08:37:14.340148671 +0000 UTC m=+0.276564622 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5)
Feb 20 08:37:14 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: tmp-crun.P7LJh3.mount: Deactivated successfully.
Feb 20 08:37:16 np0005625204.localdomain podman[93786]: 2026-02-20 08:37:16.192763178 +0000 UTC m=+0.125221894 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:37:16 np0005625204.localdomain podman[93785]: 2026-02-20 08:37:16.203259141 +0000 UTC m=+0.136558733 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:37:16 np0005625204.localdomain podman[93786]: 2026-02-20 08:37:16.204296943 +0000 UTC m=+0.136755739 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 20 08:37:16 np0005625204.localdomain podman[93793]: 2026-02-20 08:37:16.162717444 +0000 UTC m=+0.086384399 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:47Z)
Feb 20 08:37:16 np0005625204.localdomain podman[93793]: 2026-02-20 08:37:16.250117193 +0000 UTC m=+0.173784108 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 20 08:37:16 np0005625204.localdomain podman[93785]: 2026-02-20 08:37:16.260152061 +0000 UTC m=+0.193451653 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, 
io.openshift.expose-services=, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:37:16 np0005625204.localdomain podman[93787]: 2026-02-20 08:37:16.290371951 +0000 UTC m=+0.221017322 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:37:16 np0005625204.localdomain podman[93787]: 2026-02-20 08:37:16.33124682 +0000 UTC m=+0.261892181 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:37:16 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: tmp-crun.ERRnGS.mount: Deactivated successfully.
Feb 20 08:37:22 np0005625204.localdomain podman[93876]: 2026-02-20 08:37:22.154547657 +0000 UTC m=+0.081587701 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:37:22 np0005625204.localdomain podman[93875]: 2026-02-20 08:37:22.174054368 +0000 UTC m=+0.099686858 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, release=1766032510, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible)
Feb 20 08:37:22 np0005625204.localdomain podman[93874]: 2026-02-20 08:37:22.212065018 +0000 UTC m=+0.145174619 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 20 08:37:22 np0005625204.localdomain podman[93876]: 2026-02-20 08:37:22.219071524 +0000 UTC m=+0.146111578 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:37:22 np0005625204.localdomain podman[93874]: 2026-02-20 08:37:22.232463605 +0000 UTC m=+0.165573236 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1766032510, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:37:22 np0005625204.localdomain podman[93875]: 2026-02-20 08:37:22.308196435 +0000 UTC m=+0.233828925 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid)
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:37:22 np0005625204.localdomain podman[93882]: 2026-02-20 08:37:22.321606698 +0000 UTC m=+0.242518823 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5)
Feb 20 08:37:22 np0005625204.localdomain podman[93882]: 2026-02-20 08:37:22.38014368 +0000 UTC m=+0.301055795 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13)
Feb 20 08:37:22 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:37:23 np0005625204.localdomain systemd[1]: tmp-crun.J5bfoq.mount: Deactivated successfully.
Feb 20 08:37:24 np0005625204.localdomain sshd[93965]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:37:25 np0005625204.localdomain sshd[93965]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:37:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:37:27 np0005625204.localdomain podman[93967]: 2026-02-20 08:37:27.123700473 +0000 UTC m=+0.065198907 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1)
Feb 20 08:37:27 np0005625204.localdomain podman[93967]: 2026-02-20 08:37:27.500054574 +0000 UTC m=+0.441553058 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Feb 20 08:37:27 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:37:36 np0005625204.localdomain sshd[93991]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:37:37 np0005625204.localdomain sshd[93991]: Invalid user bitrix from 83.235.16.111 port 38116
Feb 20 08:37:37 np0005625204.localdomain sshd[93991]: Received disconnect from 83.235.16.111 port 38116:11: Bye Bye [preauth]
Feb 20 08:37:37 np0005625204.localdomain sshd[93991]: Disconnected from invalid user bitrix 83.235.16.111 port 38116 [preauth]
Feb 20 08:37:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:37:45 np0005625204.localdomain systemd[1]: tmp-crun.N2fX2x.mount: Deactivated successfully.
Feb 20 08:37:45 np0005625204.localdomain podman[93993]: 2026-02-20 08:37:45.156817166 +0000 UTC m=+0.090122844 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:37:45 np0005625204.localdomain podman[93993]: 2026-02-20 08:37:45.332068549 +0000 UTC m=+0.265374157 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 
17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 20 08:37:45 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: tmp-crun.HUPehq.mount: Deactivated successfully.
Feb 20 08:37:47 np0005625204.localdomain podman[94024]: 2026-02-20 08:37:47.17117045 +0000 UTC m=+0.104695283 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 08:37:47 np0005625204.localdomain podman[94026]: 2026-02-20 08:37:47.217878936 +0000 UTC m=+0.145421965 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 
collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, vcs-type=git, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:37:47 np0005625204.localdomain podman[94024]: 2026-02-20 08:37:47.225598904 +0000 UTC m=+0.159123737 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:37:47 np0005625204.localdomain podman[94026]: 2026-02-20 08:37:47.259330062 +0000 UTC m=+0.186873061 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=collectd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:37:47 np0005625204.localdomain podman[94027]: 2026-02-20 08:37:47.316333046 +0000 UTC m=+0.240830202 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:37:47 np0005625204.localdomain podman[94025]: 2026-02-20 08:37:47.367201222 +0000 UTC m=+0.297997761 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1766032510, 
summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:37:47 np0005625204.localdomain podman[94025]: 2026-02-20 08:37:47.380083158 +0000 UTC m=+0.310879637 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:37:47 np0005625204.localdomain podman[94027]: 2026-02-20 08:37:47.431326074 +0000 UTC m=+0.355823230 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T23:07:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:37:47 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:37:53 np0005625204.localdomain podman[94116]: 2026-02-20 08:37:53.147992711 +0000 UTC m=+0.082457428 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, version=17.1.13, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: tmp-crun.RlVZvF.mount: Deactivated successfully.
Feb 20 08:37:53 np0005625204.localdomain podman[94116]: 2026-02-20 08:37:53.19713202 +0000 UTC m=+0.131596747 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, container_name=ovn_controller, 
cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container)
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:37:53 np0005625204.localdomain podman[94119]: 2026-02-20 08:37:53.176990097 +0000 UTC m=+0.101609400 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true)
Feb 20 08:37:53 np0005625204.localdomain podman[94119]: 2026-02-20 08:37:53.311421121 +0000 UTC m=+0.236040434 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git)
Feb 20 08:37:53 np0005625204.localdomain podman[94117]: 2026-02-20 08:37:53.325714924 +0000 UTC m=+0.257536719 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, architecture=x86_64, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:37:53 np0005625204.localdomain podman[94117]: 2026-02-20 08:37:53.364897764 +0000 UTC m=+0.296719569 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:37:53 np0005625204.localdomain podman[94118]: 2026-02-20 08:37:53.199157622 +0000 UTC m=+0.129558224 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com)
Feb 20 08:37:53 np0005625204.localdomain podman[94118]: 2026-02-20 08:37:53.449249381 +0000 UTC m=+0.379650003 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:37:53 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:37:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:37:58 np0005625204.localdomain systemd[1]: tmp-crun.LkwIx4.mount: Deactivated successfully.
Feb 20 08:37:58 np0005625204.localdomain podman[94208]: 2026-02-20 08:37:58.17031129 +0000 UTC m=+0.105922625 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, config_id=tripleo_step4, container_name=nova_migration_target, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:37:58 np0005625204.localdomain podman[94208]: 2026-02-20 08:37:58.570207907 +0000 UTC m=+0.505819212 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, 
description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:37:58 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:37:58 np0005625204.localdomain sudo[94231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:37:58 np0005625204.localdomain sudo[94231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:37:58 np0005625204.localdomain sudo[94231]: pam_unix(sudo:session): session closed for user root
Feb 20 08:37:58 np0005625204.localdomain sudo[94246]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:37:58 np0005625204.localdomain sudo[94246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:37:59 np0005625204.localdomain sudo[94246]: pam_unix(sudo:session): session closed for user root
Feb 20 08:38:03 np0005625204.localdomain sudo[94292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:38:03 np0005625204.localdomain sudo[94292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:38:03 np0005625204.localdomain sudo[94292]: pam_unix(sudo:session): session closed for user root
Feb 20 08:38:10 np0005625204.localdomain sshd[94307]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:38:10 np0005625204.localdomain sshd[94307]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:38:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:38:16 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:38:16 np0005625204.localdomain recover_tripleo_nova_virtqemud[94311]: 63005
Feb 20 08:38:16 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:38:16 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:38:16 np0005625204.localdomain podman[94309]: 2026-02-20 08:38:16.151014577 +0000 UTC m=+0.087637819 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, architecture=x86_64)
Feb 20 08:38:16 np0005625204.localdomain podman[94309]: 2026-02-20 08:38:16.344785815 +0000 UTC m=+0.281408997 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, release=1766032510, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z)
Feb 20 08:38:16 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: tmp-crun.faAJTs.mount: Deactivated successfully.
Feb 20 08:38:18 np0005625204.localdomain podman[94343]: 2026-02-20 08:38:18.218968331 +0000 UTC m=+0.141049760 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, build-date=2026-01-12T22:10:15Z)
Feb 20 08:38:18 np0005625204.localdomain podman[94343]: 2026-02-20 08:38:18.229438384 +0000 UTC m=+0.151519813 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, managed_by=tripleo_ansible)
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:38:18 np0005625204.localdomain podman[94342]: 2026-02-20 08:38:18.185959251 +0000 UTC m=+0.115663696 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
distribution-scope=public, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z)
Feb 20 08:38:18 np0005625204.localdomain podman[94344]: 2026-02-20 08:38:18.289775229 +0000 UTC m=+0.210153645 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 20 08:38:18 np0005625204.localdomain podman[94342]: 2026-02-20 08:38:18.315060481 +0000 UTC m=+0.244764936 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:38:18 np0005625204.localdomain podman[94341]: 2026-02-20 08:38:18.378828521 +0000 UTC m=+0.310967940 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:38:18 np0005625204.localdomain podman[94344]: 2026-02-20 08:38:18.399389776 +0000 UTC m=+0.319768222 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64)
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:38:18 np0005625204.localdomain podman[94341]: 2026-02-20 08:38:18.412883314 +0000 UTC m=+0.345022753 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git)
Feb 20 08:38:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:38:23 np0005625204.localdomain sshd[94433]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:38:23 np0005625204.localdomain sshd[94433]: Invalid user claude from 77.232.138.190 port 34356
Feb 20 08:38:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:38:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:38:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:38:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:38:24 np0005625204.localdomain systemd[1]: tmp-crun.gkH0eJ.mount: Deactivated successfully.
Feb 20 08:38:24 np0005625204.localdomain podman[94435]: 2026-02-20 08:38:24.067787109 +0000 UTC m=+0.082661995 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, version=17.1.13, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5)
Feb 20 08:38:24 np0005625204.localdomain systemd[1]: tmp-crun.3ZoIUs.mount: Deactivated successfully.
Feb 20 08:38:24 np0005625204.localdomain sshd[94433]: Received disconnect from 77.232.138.190 port 34356:11: Bye Bye [preauth]
Feb 20 08:38:24 np0005625204.localdomain sshd[94433]: Disconnected from invalid user claude 77.232.138.190 port 34356 [preauth]
Feb 20 08:38:24 np0005625204.localdomain podman[94436]: 2026-02-20 08:38:24.086844868 +0000 UTC m=+0.097034339 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64)
Feb 20 08:38:24 np0005625204.localdomain podman[94436]: 2026-02-20 08:38:24.099948963 +0000 UTC m=+0.110138464 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.openshift.expose-services=, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:38:24 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:38:24 np0005625204.localdomain podman[94435]: 2026-02-20 08:38:24.120166878 +0000 UTC m=+0.135041764 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:38:24 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Deactivated successfully.
Feb 20 08:38:24 np0005625204.localdomain podman[94438]: 2026-02-20 08:38:24.19172917 +0000 UTC m=+0.194988747 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 20 08:38:24 np0005625204.localdomain podman[94437]: 2026-02-20 08:38:24.168831901 +0000 UTC m=+0.178848467 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:38:24 np0005625204.localdomain podman[94438]: 2026-02-20 08:38:24.247009697 +0000 UTC m=+0.250269234 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, architecture=x86_64, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 20 08:38:24 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:38:24 np0005625204.localdomain podman[94437]: 2026-02-20 08:38:24.30341045 +0000 UTC m=+0.313427006 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, release=1766032510, distribution-scope=public, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64)
Feb 20 08:38:24 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:38:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:38:29 np0005625204.localdomain systemd[1]: tmp-crun.cDLPeY.mount: Deactivated successfully.
Feb 20 08:38:29 np0005625204.localdomain podman[94527]: 2026-02-20 08:38:29.147033787 +0000 UTC m=+0.085740359 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:38:29 np0005625204.localdomain podman[94527]: 2026-02-20 08:38:29.525126901 +0000 UTC m=+0.463833493 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, release=1766032510, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:38:29 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:38:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:38:47 np0005625204.localdomain podman[94551]: 2026-02-20 08:38:47.13766635 +0000 UTC m=+0.074913435 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, version=17.1.13, managed_by=tripleo_ansible)
Feb 20 08:38:47 np0005625204.localdomain podman[94551]: 2026-02-20 08:38:47.36802444 +0000 UTC m=+0.305271525 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, release=1766032510, container_name=metrics_qdr, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:38:47 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: tmp-crun.jaQU2P.mount: Deactivated successfully.
Feb 20 08:38:49 np0005625204.localdomain podman[94582]: 2026-02-20 08:38:49.165322557 +0000 UTC m=+0.099289108 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red 
Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team)
Feb 20 08:38:49 np0005625204.localdomain podman[94582]: 2026-02-20 08:38:49.205222821 +0000 UTC m=+0.139189372 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:38:49 np0005625204.localdomain podman[94581]: 2026-02-20 08:38:49.205488289 +0000 UTC m=+0.141367040 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.buildah.version=1.41.5)
Feb 20 08:38:49 np0005625204.localdomain podman[94583]: 2026-02-20 08:38:49.264763091 +0000 UTC m=+0.194424600 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-collectd-container, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:38:49 np0005625204.localdomain podman[94583]: 2026-02-20 08:38:49.276007588 +0000 UTC m=+0.205669107 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.13, container_name=collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:38:49 np0005625204.localdomain podman[94581]: 2026-02-20 08:38:49.311383671 +0000 UTC m=+0.247262422 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, tcib_managed=true, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:38:49 np0005625204.localdomain podman[94585]: 2026-02-20 08:38:49.369756275 +0000 UTC m=+0.295076349 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, architecture=x86_64, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z)
Feb 20 08:38:49 np0005625204.localdomain podman[94585]: 2026-02-20 08:38:49.427040516 +0000 UTC m=+0.352360580 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4)
Feb 20 08:38:49 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:38:50 np0005625204.localdomain systemd[1]: tmp-crun.dHmQTI.mount: Deactivated successfully.
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: tmp-crun.1IBQlj.mount: Deactivated successfully.
Feb 20 08:38:55 np0005625204.localdomain podman[94681]: 2026-02-20 08:38:55.180936181 +0000 UTC m=+0.110703392 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step5, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:38:55 np0005625204.localdomain podman[94676]: 2026-02-20 08:38:55.145606379 +0000 UTC m=+0.085818903 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, io.openshift.expose-services=, container_name=ovn_controller, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:38:55 np0005625204.localdomain podman[94678]: 2026-02-20 08:38:55.201070573 +0000 UTC m=+0.133931799 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_metadata_agent, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:38:55 np0005625204.localdomain podman[94676]: 2026-02-20 08:38:55.224308261 +0000 UTC m=+0.164520775 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, 
version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 20 08:38:55 np0005625204.localdomain podman[94676]: unhealthy
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:38:55 np0005625204.localdomain podman[94681]: 2026-02-20 08:38:55.266043201 +0000 UTC m=+0.195810402 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5)
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:38:55 np0005625204.localdomain podman[94677]: 2026-02-20 08:38:55.317516261 +0000 UTC m=+0.255682692 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:38:55 np0005625204.localdomain podman[94678]: 2026-02-20 08:38:55.327900042 +0000 UTC m=+0.260761278 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:38:55 np0005625204.localdomain podman[94677]: 2026-02-20 08:38:55.357235789 +0000 UTC m=+0.295402250 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, vendor=Red Hat, Inc., container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, 
com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, tcib_managed=true, url=https://www.redhat.com)
Feb 20 08:38:55 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:38:57 np0005625204.localdomain sshd[94771]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:38:57 np0005625204.localdomain sshd[94771]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:39:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:39:00 np0005625204.localdomain systemd[1]: tmp-crun.EI8Tkp.mount: Deactivated successfully.
Feb 20 08:39:00 np0005625204.localdomain podman[94773]: 2026-02-20 08:39:00.141004545 +0000 UTC m=+0.083510752 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 20 08:39:00 np0005625204.localdomain podman[94773]: 2026-02-20 08:39:00.513058042 +0000 UTC m=+0.455564289 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510)
Feb 20 08:39:00 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:39:04 np0005625204.localdomain sudo[94794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:39:04 np0005625204.localdomain sudo[94794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:39:04 np0005625204.localdomain sudo[94794]: pam_unix(sudo:session): session closed for user root
Feb 20 08:39:04 np0005625204.localdomain sudo[94809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:39:04 np0005625204.localdomain sudo[94809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:39:04 np0005625204.localdomain sudo[94809]: pam_unix(sudo:session): session closed for user root
Feb 20 08:39:05 np0005625204.localdomain sudo[94855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:39:05 np0005625204.localdomain sudo[94855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:39:05 np0005625204.localdomain sudo[94855]: pam_unix(sudo:session): session closed for user root
Feb 20 08:39:05 np0005625204.localdomain sudo[94870]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 08:39:05 np0005625204.localdomain sudo[94870]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:39:05 np0005625204.localdomain sudo[94870]: pam_unix(sudo:session): session closed for user root
Feb 20 08:39:08 np0005625204.localdomain sudo[94903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:39:08 np0005625204.localdomain sudo[94903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:39:08 np0005625204.localdomain sudo[94903]: pam_unix(sudo:session): session closed for user root
Feb 20 08:39:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:39:18 np0005625204.localdomain systemd[1]: tmp-crun.oe2DZI.mount: Deactivated successfully.
Feb 20 08:39:18 np0005625204.localdomain podman[94918]: 2026-02-20 08:39:18.159045102 +0000 UTC m=+0.101023023 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:39:18 np0005625204.localdomain podman[94918]: 2026-02-20 08:39:18.353440369 +0000 UTC m=+0.295418300 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64)
Feb 20 08:39:18 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:39:19 np0005625204.localdomain sshd[94946]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:39:20 np0005625204.localdomain podman[94949]: 2026-02-20 08:39:20.168492047 +0000 UTC m=+0.093017175 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, container_name=logrotate_crond)
Feb 20 08:39:20 np0005625204.localdomain podman[94949]: 2026-02-20 08:39:20.208952318 +0000 UTC m=+0.133477446 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, 
vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container)
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:39:20 np0005625204.localdomain podman[94948]: 2026-02-20 08:39:20.210070982 +0000 UTC m=+0.139127930 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:39:20 np0005625204.localdomain podman[94956]: 2026-02-20 08:39:20.263003018 +0000 UTC m=+0.177958010 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:39:20 np0005625204.localdomain podman[94956]: 2026-02-20 08:39:20.315887522 +0000 UTC m=+0.230842504 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:39:20 np0005625204.localdomain podman[94950]: 2026-02-20 08:39:20.330110372 +0000 UTC m=+0.250920736 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, 
description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:39:20 np0005625204.localdomain podman[94948]: 2026-02-20 08:39:20.342341439 +0000 UTC m=+0.271398437 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:39:20 np0005625204.localdomain podman[94950]: 2026-02-20 08:39:20.363096991 +0000 UTC m=+0.283907355 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git)
Feb 20 08:39:20 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:39:20 np0005625204.localdomain sshd[94946]: Invalid user bitrix from 152.32.189.21 port 48014
Feb 20 08:39:21 np0005625204.localdomain sshd[94946]: Received disconnect from 152.32.189.21 port 48014:11: Bye Bye [preauth]
Feb 20 08:39:21 np0005625204.localdomain sshd[94946]: Disconnected from invalid user bitrix 152.32.189.21 port 48014 [preauth]
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: tmp-crun.80yMm2.mount: Deactivated successfully.
Feb 20 08:39:26 np0005625204.localdomain podman[95043]: 2026-02-20 08:39:26.153731441 +0000 UTC m=+0.089255249 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: tmp-crun.xw4tMN.mount: Deactivated successfully.
Feb 20 08:39:26 np0005625204.localdomain podman[95044]: 2026-02-20 08:39:26.215294693 +0000 UTC m=+0.147861700 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, config_id=tripleo_step5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 20 08:39:26 np0005625204.localdomain podman[95043]: 2026-02-20 08:39:26.241849533 +0000 UTC m=+0.177373311 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, distribution-scope=public, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:39:26 np0005625204.localdomain podman[95044]: 2026-02-20 08:39:26.242246166 +0000 UTC m=+0.174813173 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:39:26 np0005625204.localdomain podman[95042]: 2026-02-20 08:39:26.253250296 +0000 UTC m=+0.190978173 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:39:26 np0005625204.localdomain podman[95042]: 2026-02-20 08:39:26.261890933 +0000 UTC m=+0.199618800 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, container_name=iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Deactivated successfully.
Feb 20 08:39:26 np0005625204.localdomain podman[95041]: 2026-02-20 08:39:26.307522093 +0000 UTC m=+0.248393777 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_controller, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, 
io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:39:26 np0005625204.localdomain podman[95041]: 2026-02-20 08:39:26.328850512 +0000 UTC m=+0.269722236 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Feb 20 08:39:26 np0005625204.localdomain podman[95041]: unhealthy
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:39:26 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:39:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:39:31 np0005625204.localdomain podman[95135]: 2026-02-20 08:39:31.14398142 +0000 UTC m=+0.078503667 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64)
Feb 20 08:39:31 np0005625204.localdomain podman[95135]: 2026-02-20 08:39:31.504938974 +0000 UTC m=+0.439461231 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, version=17.1.13)
Feb 20 08:39:31 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:39:44 np0005625204.localdomain sshd[95158]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:39:44 np0005625204.localdomain sshd[95158]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:39:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:39:49 np0005625204.localdomain podman[95160]: 2026-02-20 08:39:49.140764695 +0000 UTC m=+0.082858181 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_id=tripleo_step1, tcib_managed=true, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, release=1766032510)
Feb 20 08:39:49 np0005625204.localdomain podman[95160]: 2026-02-20 08:39:49.319835019 +0000 UTC m=+0.261928445 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, 
Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, distribution-scope=public, release=1766032510, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 20 08:39:49 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: tmp-crun.jd7FuK.mount: Deactivated successfully.
Feb 20 08:39:51 np0005625204.localdomain podman[95189]: 2026-02-20 08:39:51.165257637 +0000 UTC m=+0.099301299 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:39:51 np0005625204.localdomain podman[95189]: 2026-02-20 08:39:51.203958883 +0000 UTC m=+0.138002485 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: tmp-crun.Y0Y2pd.mount: Deactivated successfully.
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:39:51 np0005625204.localdomain podman[95190]: 2026-02-20 08:39:51.208219264 +0000 UTC m=+0.138660345 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3)
Feb 20 08:39:51 np0005625204.localdomain podman[95190]: 2026-02-20 08:39:51.294073397 +0000 UTC m=+0.224514488 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack 
TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 08:39:51 np0005625204.localdomain podman[95188]: 2026-02-20 08:39:51.305595853 +0000 UTC m=+0.241367109 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:39:51 np0005625204.localdomain podman[95191]: 2026-02-20 08:39:51.259992934 +0000 UTC m=+0.187643029 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:39:51 np0005625204.localdomain podman[95191]: 2026-02-20 08:39:51.343894197 +0000 UTC m=+0.271544222 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:39:51 np0005625204.localdomain podman[95188]: 2026-02-20 08:39:51.361901753 +0000 UTC m=+0.297673009 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:39:51 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:39:57 np0005625204.localdomain podman[95277]: 2026-02-20 08:39:57.13812844 +0000 UTC m=+0.075901126 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, vcs-type=git)
Feb 20 08:39:57 np0005625204.localdomain podman[95277]: 2026-02-20 08:39:57.150896245 +0000 UTC m=+0.088668941 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:39:57 np0005625204.localdomain podman[95277]: unhealthy
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: tmp-crun.vAJkNJ.mount: Deactivated successfully.
Feb 20 08:39:57 np0005625204.localdomain podman[95278]: 2026-02-20 08:39:57.204797931 +0000 UTC m=+0.139488712 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:39:57 np0005625204.localdomain podman[95278]: 2026-02-20 08:39:57.21187166 +0000 UTC m=+0.146562431 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git)
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:39:57 np0005625204.localdomain podman[95279]: 2026-02-20 08:39:57.251717521 +0000 UTC m=+0.183529093 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, container_name=ovn_metadata_agent)
Feb 20 08:39:57 np0005625204.localdomain podman[95279]: 2026-02-20 08:39:57.271360458 +0000 UTC m=+0.203172100 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:39:57 np0005625204.localdomain podman[95280]: 2026-02-20 08:39:57.307163734 +0000 UTC m=+0.235547131 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 20 08:39:57 np0005625204.localdomain podman[95279]: unhealthy
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:39:57 np0005625204.localdomain podman[95280]: 2026-02-20 08:39:57.343986002 +0000 UTC m=+0.272369409 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, release=1766032510, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com)
Feb 20 08:39:57 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:40:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:40:02 np0005625204.localdomain systemd[1]: tmp-crun.o3o9JG.mount: Deactivated successfully.
Feb 20 08:40:02 np0005625204.localdomain podman[95361]: 2026-02-20 08:40:02.168512938 +0000 UTC m=+0.096429712 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:40:02 np0005625204.localdomain podman[95361]: 2026-02-20 08:40:02.52099134 +0000 UTC m=+0.448908064 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, release=1766032510, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team)
Feb 20 08:40:02 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:40:09 np0005625204.localdomain sudo[95385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:40:09 np0005625204.localdomain sudo[95385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:40:09 np0005625204.localdomain sudo[95385]: pam_unix(sudo:session): session closed for user root
Feb 20 08:40:09 np0005625204.localdomain sudo[95400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:40:09 np0005625204.localdomain sudo[95400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:40:09 np0005625204.localdomain sudo[95400]: pam_unix(sudo:session): session closed for user root
Feb 20 08:40:10 np0005625204.localdomain sudo[95447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:40:10 np0005625204.localdomain sudo[95447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:40:10 np0005625204.localdomain sudo[95447]: pam_unix(sudo:session): session closed for user root
Feb 20 08:40:10 np0005625204.localdomain sudo[95462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- inventory --format=json-pretty --filter-for-batch
Feb 20 08:40:10 np0005625204.localdomain sudo[95462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 2026-02-20 08:40:10.781207863 +0000 UTC m=+0.085749900 container create 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, RELEASE=main, vendor=Red Hat, Inc., release=1770267347, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, version=7)
Feb 20 08:40:10 np0005625204.localdomain systemd[1]: Started libpod-conmon-81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6.scope.
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 2026-02-20 08:40:10.743021503 +0000 UTC m=+0.047563580 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 08:40:10 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 2026-02-20 08:40:10.865182388 +0000 UTC m=+0.169724425 container init 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, release=1770267347)
Feb 20 08:40:10 np0005625204.localdomain systemd[1]: tmp-crun.p4Vysq.mount: Deactivated successfully.
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 2026-02-20 08:40:10.882134902 +0000 UTC m=+0.186676939 container start 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, vcs-type=git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.buildah.version=1.42.2, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, build-date=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 2026-02-20 08:40:10.88241478 +0000 UTC m=+0.186956817 container attach 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, architecture=x86_64, ceph=True, version=7, GIT_CLEAN=True, RELEASE=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph)
Feb 20 08:40:10 np0005625204.localdomain systemd[1]: libpod-81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6.scope: Deactivated successfully.
Feb 20 08:40:10 np0005625204.localdomain affectionate_carson[95532]: 167 167
Feb 20 08:40:10 np0005625204.localdomain podman[95517]: 2026-02-20 08:40:10.886950151 +0000 UTC m=+0.191492198 container died 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.42.2, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 20 08:40:10 np0005625204.localdomain podman[95537]: 2026-02-20 08:40:10.97948537 +0000 UTC m=+0.081248021 container remove 81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_carson, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, RELEASE=main, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_CLEAN=True)
Feb 20 08:40:10 np0005625204.localdomain systemd[1]: libpod-conmon-81d3cb686ca13243b90b060fad232cd422fe06e3dcc770d41576815ed233bef6.scope: Deactivated successfully.
Feb 20 08:40:11 np0005625204.localdomain podman[95559]: 
Feb 20 08:40:11 np0005625204.localdomain podman[95559]: 2026-02-20 08:40:11.20503135 +0000 UTC m=+0.070721837 container create 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, version=7, GIT_BRANCH=main, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main)
Feb 20 08:40:11 np0005625204.localdomain systemd[1]: Started libpod-conmon-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope.
Feb 20 08:40:11 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 08:40:11 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 08:40:11 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 08:40:11 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 08:40:11 np0005625204.localdomain podman[95559]: 2026-02-20 08:40:11.179846012 +0000 UTC m=+0.045536599 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 08:40:11 np0005625204.localdomain podman[95559]: 2026-02-20 08:40:11.28237393 +0000 UTC m=+0.148064397 container init 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:40:11 np0005625204.localdomain podman[95559]: 2026-02-20 08:40:11.289368406 +0000 UTC m=+0.155058883 container start 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, build-date=2026-02-09T10:25:24Z, vcs-type=git, RELEASE=main, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.42.2, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph)
Feb 20 08:40:11 np0005625204.localdomain podman[95559]: 2026-02-20 08:40:11.289537012 +0000 UTC m=+0.155227509 container attach 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=)
Feb 20 08:40:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba5b980df86d8244b5e3c3b4fdfad53b098ba29b6a857bea57aa5794c7e56fd7-merged.mount: Deactivated successfully.
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]: [
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:     {
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "available": false,
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "ceph_device": false,
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "lsm_data": {},
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "lvs": [],
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "path": "/dev/sr0",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "rejected_reasons": [
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "Insufficient space (<5GB)",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "Has a FileSystem"
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         ],
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         "sys_api": {
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "actuators": null,
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "device_nodes": "sr0",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "human_readable_size": "482.00 KB",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "id_bus": "ata",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "model": "QEMU DVD-ROM",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "nr_requests": "2",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "partitions": {},
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "path": "/dev/sr0",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "removable": "1",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "rev": "2.5+",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "ro": "0",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "rotational": "1",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "sas_address": "",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "sas_device_handle": "",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "scheduler_mode": "mq-deadline",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "sectors": 0,
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "sectorsize": "2048",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "size": 493568.0,
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "support_discard": "0",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "type": "disk",
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:             "vendor": "QEMU"
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:         }
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]:     }
Feb 20 08:40:12 np0005625204.localdomain amazing_clarke[95575]: ]
Feb 20 08:40:12 np0005625204.localdomain systemd[1]: libpod-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope: Deactivated successfully.
Feb 20 08:40:12 np0005625204.localdomain systemd[1]: libpod-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope: Consumed 1.007s CPU time.
Feb 20 08:40:12 np0005625204.localdomain podman[97688]: 2026-02-20 08:40:12.307386045 +0000 UTC m=+0.035235590 container died 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container)
Feb 20 08:40:12 np0005625204.localdomain systemd[1]: tmp-crun.Bn8LDt.mount: Deactivated successfully.
Feb 20 08:40:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-6ec2e015be892dc0cd8bf4eaf88422a2929d5af981474516572c1812a94c0617-merged.mount: Deactivated successfully.
Feb 20 08:40:12 np0005625204.localdomain podman[97688]: 2026-02-20 08:40:12.342692086 +0000 UTC m=+0.070541611 container remove 8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=amazing_clarke, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, vendor=Red Hat, Inc., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 08:40:12 np0005625204.localdomain systemd[1]: libpod-conmon-8ba590cc051fcc46ed70e4253cddd2fddb471c8dbe0d60ec1ff6e3e413c6228b.scope: Deactivated successfully.
Feb 20 08:40:12 np0005625204.localdomain sudo[95462]: pam_unix(sudo:session): session closed for user root
Feb 20 08:40:12 np0005625204.localdomain sudo[97702]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:40:12 np0005625204.localdomain sudo[97702]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:40:12 np0005625204.localdomain sudo[97702]: pam_unix(sudo:session): session closed for user root
Feb 20 08:40:16 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:40:16 np0005625204.localdomain recover_tripleo_nova_virtqemud[97718]: 63005
Feb 20 08:40:16 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:40:16 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:40:18 np0005625204.localdomain sshd[97719]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:40:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:40:20 np0005625204.localdomain podman[97721]: 2026-02-20 08:40:20.160565221 +0000 UTC m=+0.090835388 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Feb 20 08:40:20 np0005625204.localdomain podman[97721]: 2026-02-20 08:40:20.373714737 +0000 UTC m=+0.303984894 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, batch=17.1_20260112.1)
Feb 20 08:40:20 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:40:21 np0005625204.localdomain sshd[97719]: Invalid user sol from 45.148.10.240 port 50094
Feb 20 08:40:21 np0005625204.localdomain sshd[97719]: Connection closed by invalid user sol 45.148.10.240 port 50094 [preauth]
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: tmp-crun.iBgIBg.mount: Deactivated successfully.
Feb 20 08:40:21 np0005625204.localdomain podman[97751]: 2026-02-20 08:40:21.510214918 +0000 UTC m=+0.104230232 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, distribution-scope=public, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:40:21 np0005625204.localdomain podman[97751]: 2026-02-20 08:40:21.547016285 +0000 UTC m=+0.141031629 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:40:21 np0005625204.localdomain podman[97753]: 2026-02-20 08:40:21.593499862 +0000 UTC m=+0.182907753 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:40:21 np0005625204.localdomain podman[97756]: 2026-02-20 08:40:21.603705067 +0000 UTC m=+0.186694590 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi)
Feb 20 08:40:21 np0005625204.localdomain podman[97752]: 2026-02-20 08:40:21.570093729 +0000 UTC m=+0.162107411 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, container_name=collectd, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, build-date=2026-01-12T22:10:15Z)
Feb 20 08:40:21 np0005625204.localdomain podman[97752]: 2026-02-20 08:40:21.653064672 +0000 UTC m=+0.245078334 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510)
Feb 20 08:40:21 np0005625204.localdomain podman[97756]: 2026-02-20 08:40:21.660171062 +0000 UTC m=+0.243160585 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, build-date=2026-01-12T23:07:30Z, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:40:21 np0005625204.localdomain podman[97753]: 2026-02-20 08:40:21.672888975 +0000 UTC m=+0.262296856 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510)
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:40:21 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:40:28 np0005625204.localdomain podman[97849]: 2026-02-20 08:40:28.15971977 +0000 UTC m=+0.085642598 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step5, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:40:28 np0005625204.localdomain podman[97848]: 2026-02-20 08:40:28.211063286 +0000 UTC m=+0.137585843 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, version=17.1.13, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:40:28 np0005625204.localdomain podman[97849]: 2026-02-20 08:40:28.21734624 +0000 UTC m=+0.143269148 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:40:28 np0005625204.localdomain podman[97846]: 2026-02-20 08:40:28.134355916 +0000 UTC m=+0.070770659 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4)
Feb 20 08:40:28 np0005625204.localdomain podman[97848]: 2026-02-20 08:40:28.258004077 +0000 UTC m=+0.184526644 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, 
maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:40:28 np0005625204.localdomain podman[97848]: unhealthy
Feb 20 08:40:28 np0005625204.localdomain podman[97846]: 2026-02-20 08:40:28.264134696 +0000 UTC m=+0.200549479 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:40:28 np0005625204.localdomain podman[97846]: unhealthy
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:40:28 np0005625204.localdomain podman[97847]: 2026-02-20 08:40:28.355918792 +0000 UTC m=+0.287650740 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=iscsid, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:40:28 np0005625204.localdomain podman[97847]: 2026-02-20 08:40:28.370242645 +0000 UTC m=+0.301974623 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:40:28 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:40:32 np0005625204.localdomain sshd[97932]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:40:32 np0005625204.localdomain sshd[97932]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:40:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:40:32 np0005625204.localdomain systemd[1]: tmp-crun.AxbY6a.mount: Deactivated successfully.
Feb 20 08:40:32 np0005625204.localdomain podman[97934]: 2026-02-20 08:40:32.657333714 +0000 UTC m=+0.094764760 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:40:33 np0005625204.localdomain podman[97934]: 2026-02-20 08:40:33.02771548 +0000 UTC m=+0.465146586 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, release=1766032510)
Feb 20 08:40:33 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:40:50 np0005625204.localdomain sshd[97958]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:40:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:40:51 np0005625204.localdomain podman[97960]: 2026-02-20 08:40:51.135027331 +0000 UTC m=+0.076700731 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., 
url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:40:51 np0005625204.localdomain podman[97960]: 2026-02-20 08:40:51.356985199 +0000 UTC m=+0.298658529 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13)
Feb 20 08:40:51 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:40:52 np0005625204.localdomain podman[97988]: 2026-02-20 08:40:52.16537745 +0000 UTC m=+0.098533915 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:30Z, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 20 08:40:52 np0005625204.localdomain sshd[98040]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:40:52 np0005625204.localdomain podman[97988]: 2026-02-20 08:40:52.198123193 +0000 UTC m=+0.131279618 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, 
build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:40:52 np0005625204.localdomain podman[97996]: 2026-02-20 08:40:52.226586602 +0000 UTC m=+0.148457889 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true)
Feb 20 08:40:52 np0005625204.localdomain podman[97990]: 2026-02-20 08:40:52.272205261 +0000 UTC m=+0.197543685 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 20 08:40:52 np0005625204.localdomain podman[97990]: 2026-02-20 08:40:52.310056472 +0000 UTC m=+0.235394886 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:40:52 np0005625204.localdomain podman[97989]: 2026-02-20 08:40:52.322726513 +0000 UTC m=+0.252943568 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5)
Feb 20 08:40:52 np0005625204.localdomain podman[97989]: 2026-02-20 08:40:52.337951864 +0000 UTC m=+0.268168909 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, 
vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container)
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:40:52 np0005625204.localdomain podman[97996]: 2026-02-20 08:40:52.39478945 +0000 UTC m=+0.316660737 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:40:52 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:40:52 np0005625204.localdomain sshd[97958]: Invalid user sham from 103.157.25.4 port 44438
Feb 20 08:40:53 np0005625204.localdomain sshd[98040]: Received disconnect from 83.235.16.111 port 44106:11: Bye Bye [preauth]
Feb 20 08:40:53 np0005625204.localdomain sshd[98040]: Disconnected from authenticating user root 83.235.16.111 port 44106 [preauth]
Feb 20 08:40:53 np0005625204.localdomain sshd[97958]: Received disconnect from 103.157.25.4 port 44438:11: Bye Bye [preauth]
Feb 20 08:40:53 np0005625204.localdomain sshd[97958]: Disconnected from invalid user sham 103.157.25.4 port 44438 [preauth]
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:40:59 np0005625204.localdomain podman[98088]: 2026-02-20 08:40:59.162519726 +0000 UTC m=+0.092417677 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z)
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: tmp-crun.pnf8TQ.mount: Deactivated successfully.
Feb 20 08:40:59 np0005625204.localdomain podman[98085]: 2026-02-20 08:40:59.221851279 +0000 UTC m=+0.153400972 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, batch=17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 20 08:40:59 np0005625204.localdomain podman[98088]: 2026-02-20 08:40:59.241662211 +0000 UTC m=+0.171560202 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:40:59 np0005625204.localdomain podman[98085]: 2026-02-20 08:40:59.262880297 +0000 UTC m=+0.194430020 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:40:59 np0005625204.localdomain podman[98085]: unhealthy
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:40:59 np0005625204.localdomain podman[98084]: 2026-02-20 08:40:59.312129768 +0000 UTC m=+0.247030754 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat 
OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:40:59 np0005625204.localdomain podman[98084]: 2026-02-20 08:40:59.349014218 +0000 UTC m=+0.283915264 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1)
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:40:59 np0005625204.localdomain podman[98083]: 2026-02-20 08:40:59.361938718 +0000 UTC m=+0.300964492 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510)
Feb 20 08:40:59 np0005625204.localdomain podman[98083]: 2026-02-20 08:40:59.376997333 +0000 UTC m=+0.316023137 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, 
release=1766032510, version=17.1.13, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:40:59 np0005625204.localdomain podman[98083]: unhealthy
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:40:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:41:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:41:04 np0005625204.localdomain podman[98167]: 2026-02-20 08:41:04.139777182 +0000 UTC m=+0.078816897 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:41:04 np0005625204.localdomain podman[98167]: 2026-02-20 08:41:04.532093865 +0000 UTC m=+0.471133560 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510)
Feb 20 08:41:04 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:41:13 np0005625204.localdomain sudo[98190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:41:13 np0005625204.localdomain sudo[98190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:41:13 np0005625204.localdomain sudo[98190]: pam_unix(sudo:session): session closed for user root
Feb 20 08:41:13 np0005625204.localdomain sudo[98205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 08:41:13 np0005625204.localdomain sudo[98205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:41:13 np0005625204.localdomain sudo[98205]: pam_unix(sudo:session): session closed for user root
Feb 20 08:41:13 np0005625204.localdomain sudo[98241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:41:13 np0005625204.localdomain sudo[98241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:41:13 np0005625204.localdomain sudo[98241]: pam_unix(sudo:session): session closed for user root
Feb 20 08:41:13 np0005625204.localdomain sudo[98256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:41:13 np0005625204.localdomain sudo[98256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:41:14 np0005625204.localdomain sudo[98256]: pam_unix(sudo:session): session closed for user root
Feb 20 08:41:15 np0005625204.localdomain sudo[98304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:41:15 np0005625204.localdomain sudo[98304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:41:15 np0005625204.localdomain sudo[98304]: pam_unix(sudo:session): session closed for user root
Feb 20 08:41:17 np0005625204.localdomain sshd[98319]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:41:17 np0005625204.localdomain sshd[98319]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: tmp-crun.POIA64.mount: Deactivated successfully.
Feb 20 08:41:22 np0005625204.localdomain podman[98321]: 2026-02-20 08:41:22.155360438 +0000 UTC m=+0.088181096 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, 
com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:41:22 np0005625204.localdomain podman[98321]: 2026-02-20 08:41:22.354185142 +0000 UTC m=+0.287005840 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: tmp-crun.6SUXfL.mount: Deactivated successfully.
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:41:22 np0005625204.localdomain podman[98350]: 2026-02-20 08:41:22.509998727 +0000 UTC m=+0.127551923 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T23:07:30Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13)
Feb 20 08:41:22 np0005625204.localdomain podman[98351]: 2026-02-20 08:41:22.556397971 +0000 UTC m=+0.166469295 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step4, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond)
Feb 20 08:41:22 np0005625204.localdomain podman[98350]: 2026-02-20 08:41:22.58906526 +0000 UTC m=+0.206618426 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z)
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:41:22 np0005625204.localdomain podman[98357]: 2026-02-20 08:41:22.601724272 +0000 UTC m=+0.209494205 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., tcib_managed=true)
Feb 20 08:41:22 np0005625204.localdomain podman[98357]: 2026-02-20 08:41:22.613800935 +0000 UTC m=+0.221570888 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, container_name=collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:41:22 np0005625204.localdomain podman[98351]: 2026-02-20 08:41:22.641021346 +0000 UTC m=+0.251092690 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:41:22 np0005625204.localdomain podman[98388]: 2026-02-20 08:41:22.660442966 +0000 UTC m=+0.143677751 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute)
Feb 20 08:41:22 np0005625204.localdomain podman[98388]: 2026-02-20 08:41:22.692999032 +0000 UTC m=+0.176233847 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
tcib_managed=true, container_name=ceilometer_agent_compute, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc.)
Feb 20 08:41:22 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: tmp-crun.8l4GVS.mount: Deactivated successfully.
Feb 20 08:41:30 np0005625204.localdomain podman[98445]: 2026-02-20 08:41:30.151073471 +0000 UTC m=+0.087744403 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red 
Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public)
Feb 20 08:41:30 np0005625204.localdomain podman[98445]: 2026-02-20 08:41:30.195602257 +0000 UTC m=+0.132273139 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510)
Feb 20 08:41:30 np0005625204.localdomain podman[98445]: unhealthy
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:41:30 np0005625204.localdomain podman[98446]: 2026-02-20 08:41:30.208615959 +0000 UTC m=+0.139169032 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, 
distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:41:30 np0005625204.localdomain podman[98447]: 2026-02-20 08:41:30.17758087 +0000 UTC m=+0.105249874 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z)
Feb 20 08:41:30 np0005625204.localdomain podman[98446]: 2026-02-20 08:41:30.245004924 +0000 UTC m=+0.175558037 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:41:30 np0005625204.localdomain podman[98447]: 2026-02-20 08:41:30.266994423 +0000 UTC m=+0.194663427 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, batch=17.1_20260112.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git)
Feb 20 08:41:30 np0005625204.localdomain podman[98447]: unhealthy
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:41:30 np0005625204.localdomain podman[98448]: 2026-02-20 08:41:30.313739608 +0000 UTC m=+0.238300525 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:41:30 np0005625204.localdomain podman[98448]: 2026-02-20 08:41:30.366095755 +0000 UTC m=+0.290656642 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:41:30 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:41:30 np0005625204.localdomain sshd[98527]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:41:30 np0005625204.localdomain sshd[98528]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:41:30 np0005625204.localdomain sshd[98528]: error: kex_exchange_identification: read: Connection reset by peer
Feb 20 08:41:30 np0005625204.localdomain sshd[98528]: Connection reset by 176.120.22.52 port 10213
Feb 20 08:41:31 np0005625204.localdomain systemd[1]: tmp-crun.juDpTM.mount: Deactivated successfully.
Feb 20 08:41:32 np0005625204.localdomain sshd[98529]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:41:33 np0005625204.localdomain sshd[98529]: Invalid user viewtinet from 77.232.138.190 port 43874
Feb 20 08:41:33 np0005625204.localdomain sshd[98529]: Received disconnect from 77.232.138.190 port 43874:11: Bye Bye [preauth]
Feb 20 08:41:33 np0005625204.localdomain sshd[98529]: Disconnected from invalid user viewtinet 77.232.138.190 port 43874 [preauth]
Feb 20 08:41:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:41:35 np0005625204.localdomain podman[98531]: 2026-02-20 08:41:35.155879848 +0000 UTC m=+0.094400978 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, version=17.1.13, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:41:35 np0005625204.localdomain podman[98531]: 2026-02-20 08:41:35.524999435 +0000 UTC m=+0.463520575 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, 
url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible)
Feb 20 08:41:35 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: tmp-crun.aOqJYB.mount: Deactivated successfully.
Feb 20 08:41:53 np0005625204.localdomain podman[98554]: 2026-02-20 08:41:53.175611975 +0000 UTC m=+0.100479636 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, io.buildah.version=1.41.5, container_name=logrotate_crond, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, config_id=tripleo_step4)
Feb 20 08:41:53 np0005625204.localdomain podman[98554]: 2026-02-20 08:41:53.209895364 +0000 UTC m=+0.134763025 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:41:53 np0005625204.localdomain podman[98553]: 2026-02-20 08:41:53.221776831 +0000 UTC m=+0.150032308 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:41:53 np0005625204.localdomain podman[98555]: 2026-02-20 08:41:53.314804946 +0000 UTC m=+0.239372868 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:41:53 np0005625204.localdomain podman[98556]: 2026-02-20 08:41:53.156823934 +0000 UTC m=+0.082100679 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Feb 20 08:41:53 np0005625204.localdomain podman[98553]: 2026-02-20 08:41:53.321310327 +0000 UTC m=+0.249565774 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64)
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:41:53 np0005625204.localdomain podman[98555]: 2026-02-20 08:41:53.375028027 +0000 UTC m=+0.299595999 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step3, container_name=collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:41:53 np0005625204.localdomain podman[98557]: 2026-02-20 08:41:53.290200135 +0000 UTC m=+0.211148336 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, 
url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:41:53 np0005625204.localdomain podman[98556]: 2026-02-20 08:41:53.397884243 +0000 UTC m=+0.323160998 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z)
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:41:53 np0005625204.localdomain podman[98557]: 2026-02-20 08:41:53.508898043 +0000 UTC m=+0.429846224 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible)
Feb 20 08:41:53 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: tmp-crun.ux8NCJ.mount: Deactivated successfully.
Feb 20 08:42:01 np0005625204.localdomain podman[98670]: 2026-02-20 08:42:01.174591126 +0000 UTC m=+0.104988576 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Feb 20 08:42:01 np0005625204.localdomain podman[98669]: 2026-02-20 08:42:01.14072456 +0000 UTC m=+0.077905199 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, version=17.1.13, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Feb 20 08:42:01 np0005625204.localdomain podman[98668]: 2026-02-20 08:42:01.197007499 +0000 UTC m=+0.134007092 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 20 08:42:01 np0005625204.localdomain podman[98668]: 2026-02-20 08:42:01.213894151 +0000 UTC m=+0.150893744 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 08:42:01 np0005625204.localdomain podman[98668]: unhealthy
Feb 20 08:42:01 np0005625204.localdomain podman[98669]: 2026-02-20 08:42:01.220253397 +0000 UTC m=+0.157434016 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, vcs-type=git, version=17.1.13, 
name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com)
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:42:01 np0005625204.localdomain podman[98670]: 2026-02-20 08:42:01.311316191 +0000 UTC m=+0.241713591 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
build-date=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:42:01 np0005625204.localdomain podman[98670]: unhealthy
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:42:01 np0005625204.localdomain podman[98671]: 2026-02-20 08:42:01.36176552 +0000 UTC m=+0.290309672 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step5)
Feb 20 08:42:01 np0005625204.localdomain podman[98671]: 2026-02-20 08:42:01.389685823 +0000 UTC m=+0.318229965 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Feb 20 08:42:01 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:42:01 np0005625204.localdomain sshd[98754]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:42:01 np0005625204.localdomain sshd[98754]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:42:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:42:06 np0005625204.localdomain podman[98756]: 2026-02-20 08:42:06.148698785 +0000 UTC m=+0.088130604 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:42:06 np0005625204.localdomain podman[98756]: 2026-02-20 08:42:06.523154126 +0000 UTC m=+0.462585995 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true)
Feb 20 08:42:06 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:42:15 np0005625204.localdomain sudo[98779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:42:15 np0005625204.localdomain sudo[98779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:42:15 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:42:15 np0005625204.localdomain sudo[98779]: pam_unix(sudo:session): session closed for user root
Feb 20 08:42:15 np0005625204.localdomain recover_tripleo_nova_virtqemud[98795]: 63005
Feb 20 08:42:15 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:42:15 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:42:15 np0005625204.localdomain sudo[98796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:42:15 np0005625204.localdomain sudo[98796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:42:15 np0005625204.localdomain sudo[98796]: pam_unix(sudo:session): session closed for user root
Feb 20 08:42:16 np0005625204.localdomain sudo[98842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:42:16 np0005625204.localdomain sudo[98842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:42:16 np0005625204.localdomain sudo[98842]: pam_unix(sudo:session): session closed for user root
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:42:24 np0005625204.localdomain podman[98861]: 2026-02-20 08:42:24.146088049 +0000 UTC m=+0.071411659 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 20 08:42:24 np0005625204.localdomain podman[98857]: 2026-02-20 08:42:24.207741284 +0000 UTC m=+0.138276144 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:42:24 np0005625204.localdomain podman[98858]: 2026-02-20 08:42:24.246530423 +0000 UTC m=+0.174024099 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container)
Feb 20 08:42:24 np0005625204.localdomain podman[98858]: 2026-02-20 08:42:24.257057587 +0000 UTC m=+0.184551283 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:42:24 np0005625204.localdomain podman[98860]: 2026-02-20 08:42:24.312496541 +0000 UTC m=+0.237525422 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, 
release=1766032510, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 20 08:42:24 np0005625204.localdomain podman[98857]: 2026-02-20 08:42:24.313024067 +0000 UTC m=+0.243558907 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64)
Feb 20 08:42:24 np0005625204.localdomain podman[98861]: 2026-02-20 08:42:24.350242128 +0000 UTC m=+0.275565758 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.13, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1)
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:42:24 np0005625204.localdomain podman[98859]: 2026-02-20 08:42:24.360083961 +0000 UTC m=+0.287787283 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3)
Feb 20 08:42:24 np0005625204.localdomain podman[98859]: 2026-02-20 08:42:24.45225582 +0000 UTC m=+0.379959252 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git)
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:42:24 np0005625204.localdomain podman[98860]: 2026-02-20 08:42:24.470917387 +0000 UTC m=+0.395946298 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 20 08:42:24 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:42:32 np0005625204.localdomain podman[98976]: 2026-02-20 08:42:32.152120382 +0000 UTC m=+0.084517792 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, 
batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:42:32 np0005625204.localdomain podman[98976]: 2026-02-20 08:42:32.16853853 +0000 UTC m=+0.100935900 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container)
Feb 20 08:42:32 np0005625204.localdomain podman[98976]: unhealthy
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:42:32 np0005625204.localdomain podman[98977]: 2026-02-20 08:42:32.251800462 +0000 UTC m=+0.182580293 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, 
name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13)
Feb 20 08:42:32 np0005625204.localdomain podman[98977]: 2026-02-20 08:42:32.262183533 +0000 UTC m=+0.192963404 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, release=1766032510)
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:42:32 np0005625204.localdomain podman[98978]: 2026-02-20 08:42:32.315751228 +0000 UTC m=+0.244708092 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:42:32 np0005625204.localdomain podman[98978]: 2026-02-20 08:42:32.362148693 +0000 UTC m=+0.291105477 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team)
Feb 20 08:42:32 np0005625204.localdomain podman[98978]: unhealthy
Feb 20 08:42:32 np0005625204.localdomain podman[98979]: 2026-02-20 08:42:32.373358979 +0000 UTC m=+0.300187587 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, vcs-type=git, container_name=nova_compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:42:32 np0005625204.localdomain podman[98979]: 2026-02-20 08:42:32.402157239 +0000 UTC m=+0.328985867 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step5)
Feb 20 08:42:32 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:42:35 np0005625204.localdomain sshd[99053]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:42:36 np0005625204.localdomain sshd[99053]: Invalid user viewtinet from 152.32.189.21 port 50768
Feb 20 08:42:36 np0005625204.localdomain sshd[99053]: Received disconnect from 152.32.189.21 port 50768:11: Bye Bye [preauth]
Feb 20 08:42:36 np0005625204.localdomain sshd[99053]: Disconnected from invalid user viewtinet 152.32.189.21 port 50768 [preauth]
Feb 20 08:42:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:42:36 np0005625204.localdomain podman[99055]: 2026-02-20 08:42:36.819197265 +0000 UTC m=+0.087166215 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:42:37 np0005625204.localdomain podman[99055]: 2026-02-20 08:42:37.193771279 +0000 UTC m=+0.461740249 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:42:37 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:42:44 np0005625204.localdomain sshd[99078]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:42:44 np0005625204.localdomain sshd[99078]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: tmp-crun.QA6ySW.mount: Deactivated successfully.
Feb 20 08:42:55 np0005625204.localdomain podman[99080]: 2026-02-20 08:42:55.13345043 +0000 UTC m=+0.071440588 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64)
Feb 20 08:42:55 np0005625204.localdomain podman[99081]: 2026-02-20 08:42:55.187374347 +0000 UTC m=+0.120674921 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:42:55 np0005625204.localdomain podman[99080]: 2026-02-20 08:42:55.212690639 +0000 UTC m=+0.150680817 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, release=1766032510)
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:42:55 np0005625204.localdomain podman[99094]: 2026-02-20 08:42:55.260857978 +0000 UTC m=+0.181095928 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:42:55 np0005625204.localdomain podman[99081]: 2026-02-20 08:42:55.266586255 +0000 UTC m=+0.199886879 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:42:55 np0005625204.localdomain podman[99086]: 2026-02-20 08:42:55.162690554 +0000 UTC m=+0.089666522 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-collectd-container, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=)
Feb 20 08:42:55 np0005625204.localdomain podman[99088]: 2026-02-20 08:42:55.318212129 +0000 UTC m=+0.242524294 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=)
Feb 20 08:42:55 np0005625204.localdomain podman[99088]: 2026-02-20 08:42:55.346959527 +0000 UTC m=+0.271271703 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, version=17.1.13, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:42:55 np0005625204.localdomain podman[99086]: 2026-02-20 08:42:55.397424478 +0000 UTC m=+0.324400526 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step3, url=https://www.redhat.com)
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:42:55 np0005625204.localdomain podman[99094]: 2026-02-20 08:42:55.463151669 +0000 UTC m=+0.383389669 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:42:55 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:43:01 np0005625204.localdomain sshd[99198]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:43:02 np0005625204.localdomain sshd[99198]: Invalid user albi from 27.112.79.3 port 60480
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: tmp-crun.um9af3.mount: Deactivated successfully.
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: tmp-crun.lO3DaM.mount: Deactivated successfully.
Feb 20 08:43:02 np0005625204.localdomain podman[99201]: 2026-02-20 08:43:02.656221608 +0000 UTC m=+0.068789628 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510)
Feb 20 08:43:02 np0005625204.localdomain podman[99202]: 2026-02-20 08:43:02.726413327 +0000 UTC m=+0.134505938 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:43:02 np0005625204.localdomain podman[99201]: 2026-02-20 08:43:02.745192606 +0000 UTC m=+0.157760666 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, container_name=iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64)
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:43:02 np0005625204.localdomain podman[99202]: 2026-02-20 08:43:02.79642334 +0000 UTC m=+0.204515941 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:43:02 np0005625204.localdomain podman[99202]: unhealthy
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:43:02 np0005625204.localdomain podman[99200]: 2026-02-20 08:43:02.697930707 +0000 UTC m=+0.111435115 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=ovn_controller, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, 
url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, version=17.1.13)
Feb 20 08:43:02 np0005625204.localdomain podman[99203]: 2026-02-20 08:43:02.885890765 +0000 UTC m=+0.289465057 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:43:02 np0005625204.localdomain podman[99200]: 2026-02-20 08:43:02.932292348 +0000 UTC m=+0.345796806 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true)
Feb 20 08:43:02 np0005625204.localdomain podman[99200]: unhealthy
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:43:02 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:43:02 np0005625204.localdomain sshd[99281]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:43:02 np0005625204.localdomain podman[99203]: 2026-02-20 08:43:02.989595959 +0000 UTC m=+0.393170231 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5)
Feb 20 08:43:03 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:43:03 np0005625204.localdomain sshd[99198]: Received disconnect from 27.112.79.3 port 60480:11: Bye Bye [preauth]
Feb 20 08:43:03 np0005625204.localdomain sshd[99198]: Disconnected from invalid user albi 27.112.79.3 port 60480 [preauth]
Feb 20 08:43:04 np0005625204.localdomain sshd[99281]: Invalid user solana from 45.148.10.240 port 52942
Feb 20 08:43:04 np0005625204.localdomain sshd[99281]: Connection closed by invalid user solana 45.148.10.240 port 52942 [preauth]
Feb 20 08:43:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:43:08 np0005625204.localdomain systemd[1]: tmp-crun.FAMDnz.mount: Deactivated successfully.
Feb 20 08:43:08 np0005625204.localdomain podman[99285]: 2026-02-20 08:43:08.159975123 +0000 UTC m=+0.095429960 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:43:08 np0005625204.localdomain podman[99285]: 2026-02-20 08:43:08.530518564 +0000 UTC m=+0.465973361 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, container_name=nova_migration_target, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=)
Feb 20 08:43:08 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:43:17 np0005625204.localdomain sudo[99310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:43:17 np0005625204.localdomain sudo[99310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:43:17 np0005625204.localdomain sudo[99310]: pam_unix(sudo:session): session closed for user root
Feb 20 08:43:17 np0005625204.localdomain sudo[99325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 08:43:17 np0005625204.localdomain sudo[99325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:43:17 np0005625204.localdomain podman[99414]: 2026-02-20 08:43:17.924155684 +0000 UTC m=+0.085338488 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=)
Feb 20 08:43:18 np0005625204.localdomain podman[99414]: 2026-02-20 08:43:18.031888893 +0000 UTC m=+0.193071757 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, architecture=x86_64, release=1770267347, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph)
Feb 20 08:43:18 np0005625204.localdomain sudo[99325]: pam_unix(sudo:session): session closed for user root
Feb 20 08:43:18 np0005625204.localdomain sudo[99481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:43:18 np0005625204.localdomain sudo[99481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:43:18 np0005625204.localdomain sudo[99481]: pam_unix(sudo:session): session closed for user root
Feb 20 08:43:18 np0005625204.localdomain sudo[99496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:43:18 np0005625204.localdomain sudo[99496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:43:19 np0005625204.localdomain sudo[99496]: pam_unix(sudo:session): session closed for user root
Feb 20 08:43:19 np0005625204.localdomain sudo[99542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:43:19 np0005625204.localdomain sudo[99542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:43:19 np0005625204.localdomain sudo[99542]: pam_unix(sudo:session): session closed for user root
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: tmp-crun.29YShZ.mount: Deactivated successfully.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: tmp-crun.1CSJfB.mount: Deactivated successfully.
Feb 20 08:43:26 np0005625204.localdomain podman[99557]: 2026-02-20 08:43:26.142143473 +0000 UTC m=+0.078327061 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, 
io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:43:26 np0005625204.localdomain podman[99560]: 2026-02-20 08:43:26.175645729 +0000 UTC m=+0.103658224 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, 
io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, distribution-scope=public, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Feb 20 08:43:26 np0005625204.localdomain podman[99558]: 2026-02-20 08:43:26.235775657 +0000 UTC m=+0.172739919 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=logrotate_crond, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5)
Feb 20 08:43:26 np0005625204.localdomain podman[99558]: 2026-02-20 08:43:26.246404276 +0000 UTC m=+0.183368568 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, container_name=logrotate_crond, description=Red 
Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4)
Feb 20 08:43:26 np0005625204.localdomain podman[99560]: 2026-02-20 08:43:26.260079738 +0000 UTC m=+0.188092263 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:43:26 np0005625204.localdomain podman[99557]: 2026-02-20 08:43:26.278286671 +0000 UTC m=+0.214470289 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, version=17.1.13, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1)
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:43:26 np0005625204.localdomain podman[99559]: 2026-02-20 08:43:26.195964517 +0000 UTC m=+0.131498314 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z)
Feb 20 08:43:26 np0005625204.localdomain podman[99559]: 2026-02-20 08:43:26.325773628 +0000 UTC m=+0.261307435 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc.)
Feb 20 08:43:26 np0005625204.localdomain podman[99561]: 2026-02-20 08:43:26.338534842 +0000 UTC m=+0.271137309 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:43:26 np0005625204.localdomain podman[99561]: 2026-02-20 08:43:26.532058092 +0000 UTC m=+0.464660589 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, tcib_managed=true, container_name=metrics_qdr, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5)
Feb 20 08:43:26 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:43:30 np0005625204.localdomain sshd[99677]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:43:30 np0005625204.localdomain sshd[99677]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:43:33 np0005625204.localdomain podman[99679]: 2026-02-20 08:43:33.15684785 +0000 UTC m=+0.095008978 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 20 08:43:33 np0005625204.localdomain podman[99679]: 2026-02-20 08:43:33.197003961 +0000 UTC m=+0.135165099 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:43:33 np0005625204.localdomain podman[99679]: unhealthy
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: tmp-crun.bNiCFW.mount: Deactivated successfully.
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:43:33 np0005625204.localdomain podman[99680]: 2026-02-20 08:43:33.225216823 +0000 UTC m=+0.158666905 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:43:33 np0005625204.localdomain podman[99680]: 2026-02-20 08:43:33.260115681 +0000 UTC m=+0.193565703 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, release=1766032510, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z)
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:43:33 np0005625204.localdomain podman[99681]: 2026-02-20 08:43:33.272604157 +0000 UTC m=+0.204550572 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, version=17.1.13, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:43:33 np0005625204.localdomain podman[99681]: 2026-02-20 08:43:33.310376674 +0000 UTC m=+0.242323129 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:43:33 np0005625204.localdomain podman[99681]: unhealthy
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:43:33 np0005625204.localdomain podman[99682]: 2026-02-20 08:43:33.360010478 +0000 UTC m=+0.288085943 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:43:33 np0005625204.localdomain podman[99682]: 2026-02-20 08:43:33.385050522 +0000 UTC m=+0.313126007 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, tcib_managed=true, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:43:33 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:43:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:43:39 np0005625204.localdomain systemd[1]: tmp-crun.NPBO40.mount: Deactivated successfully.
Feb 20 08:43:39 np0005625204.localdomain podman[99770]: 2026-02-20 08:43:39.159177992 +0000 UTC m=+0.096007317 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:43:39 np0005625204.localdomain podman[99770]: 2026-02-20 08:43:39.550459504 +0000 UTC m=+0.487288789 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Feb 20 08:43:39 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:43:49 np0005625204.localdomain sshd[99793]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:43:49 np0005625204.localdomain sshd[99793]: Received disconnect from 83.235.16.111 port 50118:11: Bye Bye [preauth]
Feb 20 08:43:49 np0005625204.localdomain sshd[99793]: Disconnected from authenticating user root 83.235.16.111 port 50118 [preauth]
Feb 20 08:43:49 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:43:50 np0005625204.localdomain recover_tripleo_nova_virtqemud[99796]: 63005
Feb 20 08:43:50 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:43:50 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:43:57 np0005625204.localdomain podman[99801]: 2026-02-20 08:43:57.137916609 +0000 UTC m=+0.072314275 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=metrics_qdr, config_id=tripleo_step1, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:43:57 np0005625204.localdomain podman[99797]: 2026-02-20 08:43:57.174524331 +0000 UTC m=+0.114068237 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z)
Feb 20 08:43:57 np0005625204.localdomain podman[99800]: 2026-02-20 08:43:57.186549202 +0000 UTC m=+0.121087303 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=)
Feb 20 08:43:57 np0005625204.localdomain podman[99800]: 2026-02-20 08:43:57.204821707 +0000 UTC m=+0.139359778 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, build-date=2026-01-12T23:07:47Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:43:57 np0005625204.localdomain podman[99798]: 2026-02-20 08:43:57.122445681 +0000 UTC m=+0.063486092 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z)
Feb 20 08:43:57 np0005625204.localdomain podman[99798]: 2026-02-20 08:43:57.255884935 +0000 UTC m=+0.196925366 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container)
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:43:57 np0005625204.localdomain podman[99797]: 2026-02-20 08:43:57.274595512 +0000 UTC m=+0.214139438 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:43:57 np0005625204.localdomain podman[99799]: 2026-02-20 08:43:57.34991082 +0000 UTC m=+0.290038363 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 20 08:43:57 np0005625204.localdomain podman[99801]: 2026-02-20 08:43:57.386973126 +0000 UTC m=+0.321370752 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:43:57 np0005625204.localdomain podman[99799]: 2026-02-20 08:43:57.38743718 +0000 UTC m=+0.327564723 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible)
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:43:57 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:43:58 np0005625204.localdomain systemd[1]: tmp-crun.8l1bOS.mount: Deactivated successfully.
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: tmp-crun.IzS6rz.mount: Deactivated successfully.
Feb 20 08:44:04 np0005625204.localdomain podman[99920]: 2026-02-20 08:44:04.224202608 +0000 UTC m=+0.143672670 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 20 08:44:04 np0005625204.localdomain podman[99915]: 2026-02-20 08:44:04.198737191 +0000 UTC m=+0.125461688 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:44:04 np0005625204.localdomain podman[99914]: 2026-02-20 08:44:04.264448953 +0000 UTC m=+0.191690806 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, version=17.1.13, release=1766032510, architecture=x86_64, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public)
Feb 20 08:44:04 np0005625204.localdomain podman[99915]: 2026-02-20 08:44:04.282004845 +0000 UTC m=+0.208729322 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1)
Feb 20 08:44:04 np0005625204.localdomain podman[99913]: 2026-02-20 08:44:04.17702094 +0000 UTC m=+0.106190601 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 
17.1_20260112.1, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64)
Feb 20 08:44:04 np0005625204.localdomain podman[99915]: unhealthy
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:44:04 np0005625204.localdomain podman[99914]: 2026-02-20 08:44:04.303887231 +0000 UTC m=+0.231129084 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:44:04 np0005625204.localdomain podman[99920]: 2026-02-20 08:44:04.354290219 +0000 UTC m=+0.273760301 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:44:04 np0005625204.localdomain podman[99913]: 2026-02-20 08:44:04.365705062 +0000 UTC m=+0.294874703 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:44:04 np0005625204.localdomain podman[99913]: unhealthy
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:44:04 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:44:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:44:10 np0005625204.localdomain systemd[1]: tmp-crun.iSw0Cf.mount: Deactivated successfully.
Feb 20 08:44:10 np0005625204.localdomain podman[99996]: 2026-02-20 08:44:10.142900608 +0000 UTC m=+0.082661215 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:44:10 np0005625204.localdomain podman[99996]: 2026-02-20 08:44:10.536977046 +0000 UTC m=+0.476737643 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:44:10 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:44:14 np0005625204.localdomain sshd[100020]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:44:14 np0005625204.localdomain sshd[100020]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:44:19 np0005625204.localdomain sudo[100022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:44:19 np0005625204.localdomain sudo[100022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:44:19 np0005625204.localdomain sudo[100022]: pam_unix(sudo:session): session closed for user root
Feb 20 08:44:19 np0005625204.localdomain sudo[100037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:44:19 np0005625204.localdomain sudo[100037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:44:20 np0005625204.localdomain sudo[100037]: pam_unix(sudo:session): session closed for user root
Feb 20 08:44:21 np0005625204.localdomain sudo[100083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:44:21 np0005625204.localdomain sudo[100083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:44:21 np0005625204.localdomain sudo[100083]: pam_unix(sudo:session): session closed for user root
Feb 20 08:44:26 np0005625204.localdomain sshd[100098]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:44:27 np0005625204.localdomain sshd[100098]: Invalid user bitrix from 77.232.138.190 port 56660
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: tmp-crun.Zbn6xR.mount: Deactivated successfully.
Feb 20 08:44:27 np0005625204.localdomain sshd[100098]: Received disconnect from 77.232.138.190 port 56660:11: Bye Bye [preauth]
Feb 20 08:44:27 np0005625204.localdomain sshd[100098]: Disconnected from invalid user bitrix 77.232.138.190 port 56660 [preauth]
Feb 20 08:44:27 np0005625204.localdomain podman[100103]: 2026-02-20 08:44:27.569803913 +0000 UTC m=+0.139653357 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:44:27 np0005625204.localdomain podman[100103]: 2026-02-20 08:44:27.578807221 +0000 UTC m=+0.148656675 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-collectd-container)
Feb 20 08:44:27 np0005625204.localdomain podman[100101]: 2026-02-20 08:44:27.536739251 +0000 UTC m=+0.109040701 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, config_id=tripleo_step4, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:44:27 np0005625204.localdomain podman[100101]: 2026-02-20 08:44:27.621073417 +0000 UTC m=+0.193374837 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., container_name=logrotate_crond, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron)
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:44:27 np0005625204.localdomain podman[100102]: 2026-02-20 08:44:27.531127997 +0000 UTC m=+0.103598552 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:44:27 np0005625204.localdomain podman[100102]: 2026-02-20 08:44:27.663404445 +0000 UTC m=+0.235874990 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:44:27 np0005625204.localdomain podman[100151]: 2026-02-20 08:44:27.711391408 +0000 UTC m=+0.178984622 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr)
Feb 20 08:44:27 np0005625204.localdomain podman[100100]: 2026-02-20 08:44:27.763028114 +0000 UTC m=+0.335084286 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, tcib_managed=true)
Feb 20 08:44:27 np0005625204.localdomain podman[100100]: 2026-02-20 08:44:27.791993149 +0000 UTC m=+0.364049281 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:44:27 np0005625204.localdomain podman[100151]: 2026-02-20 08:44:27.936009299 +0000 UTC m=+0.403602523 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, tcib_managed=true)
Feb 20 08:44:27 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:44:35 np0005625204.localdomain podman[100220]: 2026-02-20 08:44:35.149779398 +0000 UTC m=+0.080268132 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_controller)
Feb 20 08:44:35 np0005625204.localdomain podman[100220]: 2026-02-20 08:44:35.160925192 +0000 UTC m=+0.091413986 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 20 08:44:35 np0005625204.localdomain podman[100220]: unhealthy
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:44:35 np0005625204.localdomain podman[100223]: 2026-02-20 08:44:35.248491528 +0000 UTC m=+0.175015590 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, distribution-scope=public, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:44:35 np0005625204.localdomain podman[100222]: 2026-02-20 08:44:35.295587594 +0000 UTC m=+0.221078883 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13)
Feb 20 08:44:35 np0005625204.localdomain podman[100222]: 2026-02-20 08:44:35.312010371 +0000 UTC m=+0.237501670 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc.)
Feb 20 08:44:35 np0005625204.localdomain podman[100222]: unhealthy
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:44:35 np0005625204.localdomain podman[100223]: 2026-02-20 08:44:35.329662587 +0000 UTC m=+0.256186649 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_id=tripleo_step5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:44:35 np0005625204.localdomain podman[100221]: 2026-02-20 08:44:35.400607799 +0000 UTC m=+0.329100691 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid)
Feb 20 08:44:35 np0005625204.localdomain podman[100221]: 2026-02-20 08:44:35.438105218 +0000 UTC m=+0.366598100 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, 
build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=)
Feb 20 08:44:35 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:44:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:44:41 np0005625204.localdomain systemd[1]: tmp-crun.V0WMSn.mount: Deactivated successfully.
Feb 20 08:44:41 np0005625204.localdomain podman[100304]: 2026-02-20 08:44:41.141908036 +0000 UTC m=+0.081730527 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:44:41 np0005625204.localdomain podman[100304]: 2026-02-20 08:44:41.472106159 +0000 UTC m=+0.411928640 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=nova_migration_target)
Feb 20 08:44:41 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: tmp-crun.MSrB0o.mount: Deactivated successfully.
Feb 20 08:44:58 np0005625204.localdomain podman[100328]: 2026-02-20 08:44:58.203694275 +0000 UTC m=+0.137106427 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, config_id=tripleo_step4)
Feb 20 08:44:58 np0005625204.localdomain podman[100328]: 2026-02-20 08:44:58.210856617 +0000 UTC m=+0.144268759 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-cron-container, distribution-scope=public, version=17.1.13, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, release=1766032510)
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:44:58 np0005625204.localdomain podman[100327]: 2026-02-20 08:44:58.293747388 +0000 UTC m=+0.230885505 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public)
Feb 20 08:44:58 np0005625204.localdomain podman[100335]: 2026-02-20 08:44:58.250628316 +0000 UTC m=+0.175842885 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 20 08:44:58 np0005625204.localdomain podman[100327]: 2026-02-20 08:44:58.322211207 +0000 UTC m=+0.259349374 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:44:58 np0005625204.localdomain podman[100335]: 2026-02-20 08:44:58.336035655 +0000 UTC m=+0.261250264 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, vcs-type=git, 
com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13)
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:44:58 np0005625204.localdomain podman[100329]: 2026-02-20 08:44:58.177192797 +0000 UTC m=+0.106366458 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, release=1766032510, distribution-scope=public)
Feb 20 08:44:58 np0005625204.localdomain podman[100341]: 2026-02-20 08:44:58.278384904 +0000 UTC m=+0.199444824 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 20 08:44:58 np0005625204.localdomain podman[100341]: 2026-02-20 08:44:58.445039363 +0000 UTC m=+0.366099213 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, container_name=metrics_qdr, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:44:58 np0005625204.localdomain podman[100329]: 2026-02-20 08:44:58.461991017 +0000 UTC m=+0.391164718 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.5)
Feb 20 08:44:58 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:44:59 np0005625204.localdomain sshd[100450]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:44:59 np0005625204.localdomain sshd[100450]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: tmp-crun.o80s7g.mount: Deactivated successfully.
Feb 20 08:45:06 np0005625204.localdomain podman[100452]: 2026-02-20 08:45:06.160808725 +0000 UTC m=+0.098858566 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 20 08:45:06 np0005625204.localdomain podman[100452]: 2026-02-20 08:45:06.178219753 +0000 UTC m=+0.116269604 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, container_name=ovn_controller, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc.)
Feb 20 08:45:06 np0005625204.localdomain podman[100454]: 2026-02-20 08:45:06.178220063 +0000 UTC m=+0.108107782 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:45:06 np0005625204.localdomain podman[100452]: unhealthy
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:45:06 np0005625204.localdomain podman[100453]: 2026-02-20 08:45:06.24187383 +0000 UTC m=+0.174441312 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, container_name=iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:45:06 np0005625204.localdomain podman[100453]: 2026-02-20 08:45:06.248562687 +0000 UTC m=+0.181130159 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:45:06 np0005625204.localdomain podman[100454]: 2026-02-20 08:45:06.26128737 +0000 UTC m=+0.191175139 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:45:06 np0005625204.localdomain podman[100454]: unhealthy
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:45:06 np0005625204.localdomain podman[100460]: 2026-02-20 08:45:06.314713641 +0000 UTC m=+0.238448810 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public)
Feb 20 08:45:06 np0005625204.localdomain podman[100460]: 2026-02-20 08:45:06.366709128 +0000 UTC m=+0.290444307 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, vcs-type=git, container_name=nova_compute, url=https://www.redhat.com)
Feb 20 08:45:06 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:45:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:45:12 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:45:12 np0005625204.localdomain recover_tripleo_nova_virtqemud[100539]: 63005
Feb 20 08:45:12 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:45:12 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:45:12 np0005625204.localdomain podman[100537]: 2026-02-20 08:45:12.152780288 +0000 UTC m=+0.087201525 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 20 08:45:12 np0005625204.localdomain podman[100537]: 2026-02-20 08:45:12.551069827 +0000 UTC m=+0.485491014 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 20 08:45:12 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:45:21 np0005625204.localdomain sudo[100563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:45:21 np0005625204.localdomain sudo[100563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:45:21 np0005625204.localdomain sudo[100563]: pam_unix(sudo:session): session closed for user root
Feb 20 08:45:21 np0005625204.localdomain sudo[100578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:45:21 np0005625204.localdomain sudo[100578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:45:22 np0005625204.localdomain sudo[100578]: pam_unix(sudo:session): session closed for user root
Feb 20 08:45:23 np0005625204.localdomain sudo[100624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:45:23 np0005625204.localdomain sudo[100624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:45:23 np0005625204.localdomain sudo[100624]: pam_unix(sudo:session): session closed for user root
Feb 20 08:45:23 np0005625204.localdomain sshd[100639]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:45:24 np0005625204.localdomain sshd[100639]: Invalid user sol from 45.148.10.240 port 45290
Feb 20 08:45:24 np0005625204.localdomain sshd[100639]: Connection closed by invalid user sol 45.148.10.240 port 45290 [preauth]
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: tmp-crun.WUwYP1.mount: Deactivated successfully.
Feb 20 08:45:29 np0005625204.localdomain podman[100644]: 2026-02-20 08:45:29.20447822 +0000 UTC m=+0.123175007 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:45:29 np0005625204.localdomain podman[100641]: 2026-02-20 08:45:29.16825135 +0000 UTC m=+0.092733536 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:45:29 np0005625204.localdomain podman[100643]: 2026-02-20 08:45:29.226010706 +0000 UTC m=+0.147852720 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:45:29 np0005625204.localdomain podman[100645]: 2026-02-20 08:45:29.309520276 +0000 UTC m=+0.226372916 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:45:29 np0005625204.localdomain podman[100641]: 2026-02-20 08:45:29.322444085 +0000 UTC m=+0.246926301 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com)
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:45:29 np0005625204.localdomain podman[100642]: 2026-02-20 08:45:29.283863414 +0000 UTC m=+0.207807444 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:45:29 np0005625204.localdomain podman[100642]: 2026-02-20 08:45:29.363902917 +0000 UTC m=+0.287846937 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:45:29 np0005625204.localdomain podman[100643]: 2026-02-20 08:45:29.373930887 +0000 UTC m=+0.295772971 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, release=1766032510, build-date=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:45:29 np0005625204.localdomain podman[100644]: 2026-02-20 08:45:29.428887144 +0000 UTC m=+0.347583911 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:45:29 np0005625204.localdomain podman[100645]: 2026-02-20 08:45:29.539101101 +0000 UTC m=+0.455953731 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Feb 20 08:45:29 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: tmp-crun.Pj2UiB.mount: Deactivated successfully.
Feb 20 08:45:37 np0005625204.localdomain podman[100763]: 2026-02-20 08:45:37.156453571 +0000 UTC m=+0.088600809 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:45:37 np0005625204.localdomain podman[100763]: 2026-02-20 08:45:37.167981147 +0000 UTC m=+0.100128345 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, batch=17.1_20260112.1, container_name=iscsid, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, distribution-scope=public)
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:45:37 np0005625204.localdomain podman[100765]: 2026-02-20 08:45:37.208070836 +0000 UTC m=+0.130716060 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:45:37 np0005625204.localdomain podman[100765]: 2026-02-20 08:45:37.241033465 +0000 UTC m=+0.163678749 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step5, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:45:37 np0005625204.localdomain podman[100762]: 2026-02-20 08:45:37.248197066 +0000 UTC m=+0.181584982 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red 
Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 20 08:45:37 np0005625204.localdomain podman[100764]: 2026-02-20 08:45:37.31628396 +0000 UTC m=+0.240838423 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step4, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Feb 20 08:45:37 np0005625204.localdomain podman[100762]: 2026-02-20 08:45:37.327495687 +0000 UTC m=+0.260883633 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510)
Feb 20 08:45:37 np0005625204.localdomain podman[100762]: unhealthy
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:45:37 np0005625204.localdomain podman[100764]: 2026-02-20 08:45:37.363025235 +0000 UTC m=+0.287579678 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:45:37 np0005625204.localdomain podman[100764]: unhealthy
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:45:37 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:45:43 np0005625204.localdomain sshd[100844]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:45:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:45:44 np0005625204.localdomain podman[100846]: 2026-02-20 08:45:44.034585989 +0000 UTC m=+0.086734211 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:45:44 np0005625204.localdomain sshd[100868]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:45:44 np0005625204.localdomain podman[100846]: 2026-02-20 08:45:44.431302219 +0000 UTC m=+0.483450421 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, distribution-scope=public, config_id=tripleo_step4)
Feb 20 08:45:44 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:45:44 np0005625204.localdomain sshd[100868]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:45:44 np0005625204.localdomain sshd[100844]: Invalid user common from 152.32.189.21 port 41930
Feb 20 08:45:44 np0005625204.localdomain sshd[100844]: Received disconnect from 152.32.189.21 port 41930:11: Bye Bye [preauth]
Feb 20 08:45:44 np0005625204.localdomain sshd[100844]: Disconnected from invalid user common 152.32.189.21 port 41930 [preauth]
Feb 20 08:45:48 np0005625204.localdomain sshd[100870]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:45:50 np0005625204.localdomain sshd[100870]: Received disconnect from 103.157.25.4 port 51128:11: Bye Bye [preauth]
Feb 20 08:45:50 np0005625204.localdomain sshd[100870]: Disconnected from authenticating user root 103.157.25.4 port 51128 [preauth]
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: tmp-crun.cT3iAC.mount: Deactivated successfully.
Feb 20 08:46:00 np0005625204.localdomain podman[100876]: 2026-02-20 08:46:00.153189314 +0000 UTC m=+0.080400385 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:46:00 np0005625204.localdomain podman[100872]: 2026-02-20 08:46:00.215047216 +0000 UTC m=+0.147930743 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:46:00 np0005625204.localdomain podman[100875]: 2026-02-20 08:46:00.180896891 +0000 UTC m=+0.102675944 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, 
description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, container_name=ceilometer_agent_compute)
Feb 20 08:46:00 np0005625204.localdomain podman[100875]: 2026-02-20 08:46:00.264030159 +0000 UTC m=+0.185809272 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:46:00 np0005625204.localdomain podman[100873]: 2026-02-20 08:46:00.24466191 +0000 UTC m=+0.174714019 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., architecture=x86_64, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:46:00 np0005625204.localdomain podman[100873]: 2026-02-20 08:46:00.327968135 +0000 UTC m=+0.258020224 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git)
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:46:00 np0005625204.localdomain podman[100876]: 2026-02-20 08:46:00.352773192 +0000 UTC m=+0.279984243 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-type=git, version=17.1.13)
Feb 20 08:46:00 np0005625204.localdomain podman[100872]: 2026-02-20 08:46:00.351145381 +0000 UTC m=+0.284028878 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:46:00 np0005625204.localdomain podman[100874]: 2026-02-20 08:46:00.268775536 +0000 UTC m=+0.196569665 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:46:00 np0005625204.localdomain podman[100874]: 2026-02-20 08:46:00.399034882 +0000 UTC m=+0.326828961 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 collectd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd, container_name=collectd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, architecture=x86_64)
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:46:00 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:46:08 np0005625204.localdomain podman[100991]: 2026-02-20 08:46:08.153229981 +0000 UTC m=+0.090323763 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:46:08 np0005625204.localdomain podman[100991]: 2026-02-20 08:46:08.171059341 +0000 UTC m=+0.108153113 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:46:08 np0005625204.localdomain podman[100991]: unhealthy
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: tmp-crun.kzTDFg.mount: Deactivated successfully.
Feb 20 08:46:08 np0005625204.localdomain podman[100993]: 2026-02-20 08:46:08.270269757 +0000 UTC m=+0.198559877 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:46:08 np0005625204.localdomain podman[100992]: 2026-02-20 08:46:08.300747038 +0000 UTC m=+0.233848686 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step3)
Feb 20 08:46:08 np0005625204.localdomain podman[100992]: 2026-02-20 08:46:08.310090388 +0000 UTC m=+0.243192046 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=iscsid, config_id=tripleo_step3, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64)
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:46:08 np0005625204.localdomain podman[100999]: 2026-02-20 08:46:08.225915297 +0000 UTC m=+0.148865212 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 20 08:46:08 np0005625204.localdomain podman[100999]: 2026-02-20 08:46:08.356669437 +0000 UTC m=+0.279619372 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, version=17.1.13, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:46:08 np0005625204.localdomain podman[100993]: 2026-02-20 08:46:08.411407418 +0000 UTC m=+0.339697488 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:46:08 np0005625204.localdomain podman[100993]: unhealthy
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:46:08 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:46:10 np0005625204.localdomain sshd[101076]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:46:10 np0005625204.localdomain sshd[101076]: Received disconnect from 43.245.222.27 port 46692:11: Bye Bye [preauth]
Feb 20 08:46:10 np0005625204.localdomain sshd[101076]: Disconnected from 43.245.222.27 port 46692 [preauth]
Feb 20 08:46:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:46:15 np0005625204.localdomain podman[101078]: 2026-02-20 08:46:15.135543948 +0000 UTC m=+0.069265342 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 20 08:46:15 np0005625204.localdomain podman[101078]: 2026-02-20 08:46:15.516522911 +0000 UTC m=+0.450244385 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:46:15 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:46:23 np0005625204.localdomain sudo[101101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:46:23 np0005625204.localdomain sudo[101101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:46:23 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:46:23 np0005625204.localdomain sudo[101101]: pam_unix(sudo:session): session closed for user root
Feb 20 08:46:23 np0005625204.localdomain recover_tripleo_nova_virtqemud[101117]: 63005
Feb 20 08:46:23 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:46:23 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:46:23 np0005625204.localdomain sudo[101118]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:46:23 np0005625204.localdomain sudo[101118]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:46:23 np0005625204.localdomain sudo[101118]: pam_unix(sudo:session): session closed for user root
Feb 20 08:46:24 np0005625204.localdomain sudo[101164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:46:24 np0005625204.localdomain sudo[101164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:46:24 np0005625204.localdomain sudo[101164]: pam_unix(sudo:session): session closed for user root
Feb 20 08:46:28 np0005625204.localdomain sshd[101179]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:46:29 np0005625204.localdomain sshd[101179]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:46:31 np0005625204.localdomain podman[101183]: 2026-02-20 08:46:31.157451566 +0000 UTC m=+0.084610437 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, container_name=collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, architecture=x86_64)
Feb 20 08:46:31 np0005625204.localdomain podman[101183]: 2026-02-20 08:46:31.168007122 +0000 UTC m=+0.095166023 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, config_id=tripleo_step3)
Feb 20 08:46:31 np0005625204.localdomain podman[101195]: 2026-02-20 08:46:31.180081475 +0000 UTC m=+0.100043443 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64)
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:46:31 np0005625204.localdomain podman[101181]: 2026-02-20 08:46:31.260978677 +0000 UTC m=+0.193131173 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 20 08:46:31 np0005625204.localdomain podman[101181]: 2026-02-20 08:46:31.310024444 +0000 UTC m=+0.242176960 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13)
Feb 20 08:46:31 np0005625204.localdomain podman[101182]: 2026-02-20 08:46:31.315371079 +0000 UTC m=+0.247597397 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, distribution-scope=public)
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:46:31 np0005625204.localdomain podman[101182]: 2026-02-20 08:46:31.322184389 +0000 UTC m=+0.254410697 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, release=1766032510)
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:46:31 np0005625204.localdomain podman[101184]: 2026-02-20 08:46:31.364298681 +0000 UTC m=+0.290071879 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:46:31 np0005625204.localdomain podman[101195]: 2026-02-20 08:46:31.407094784 +0000 UTC m=+0.327056802 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
container_name=metrics_qdr, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, release=1766032510, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:46:31 np0005625204.localdomain podman[101184]: 2026-02-20 08:46:31.417344241 +0000 UTC m=+0.343117469 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:46:31 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:46:32 np0005625204.localdomain systemd[1]: tmp-crun.xGKa2T.mount: Deactivated successfully.
Feb 20 08:46:37 np0005625204.localdomain sshd[101298]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:46:38 np0005625204.localdomain sshd[101298]: Invalid user albi from 96.78.175.36 port 58438
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: tmp-crun.KVAkKb.mount: Deactivated successfully.
Feb 20 08:46:38 np0005625204.localdomain sshd[101298]: Received disconnect from 96.78.175.36 port 58438:11: Bye Bye [preauth]
Feb 20 08:46:38 np0005625204.localdomain sshd[101298]: Disconnected from invalid user albi 96.78.175.36 port 58438 [preauth]
Feb 20 08:46:38 np0005625204.localdomain podman[101301]: 2026-02-20 08:46:38.69433985 +0000 UTC m=+0.085998770 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true)
Feb 20 08:46:38 np0005625204.localdomain podman[101301]: 2026-02-20 08:46:38.710747538 +0000 UTC m=+0.102406438 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:46:38 np0005625204.localdomain podman[101303]: 2026-02-20 08:46:38.790841114 +0000 UTC m=+0.174236498 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:46:38 np0005625204.localdomain podman[101302]: 2026-02-20 08:46:38.713126491 +0000 UTC m=+0.096744672 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:46:38 np0005625204.localdomain podman[101300]: 2026-02-20 08:46:38.844558694 +0000 UTC m=+0.236344168 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:46:38 np0005625204.localdomain podman[101303]: 2026-02-20 08:46:38.851982854 +0000 UTC m=+0.235378148 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z)
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:46:38 np0005625204.localdomain podman[101300]: 2026-02-20 08:46:38.884978424 +0000 UTC m=+0.276763808 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 08:46:38 np0005625204.localdomain podman[101300]: unhealthy
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:46:38 np0005625204.localdomain podman[101302]: 2026-02-20 08:46:38.898038107 +0000 UTC m=+0.281656288 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:46:38 np0005625204.localdomain podman[101302]: unhealthy
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:46:38 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:46:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:46:46 np0005625204.localdomain podman[101382]: 2026-02-20 08:46:46.145268314 +0000 UTC m=+0.082118030 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, distribution-scope=public, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:46:46 np0005625204.localdomain podman[101382]: 2026-02-20 08:46:46.510143716 +0000 UTC m=+0.446993432 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
distribution-scope=public, release=1766032510, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:46:46 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:46:50 np0005625204.localdomain sshd[101405]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:46:51 np0005625204.localdomain sshd[101405]: Invalid user shreyas from 83.235.16.111 port 56108
Feb 20 08:46:52 np0005625204.localdomain sshd[101405]: Received disconnect from 83.235.16.111 port 56108:11: Bye Bye [preauth]
Feb 20 08:46:52 np0005625204.localdomain sshd[101405]: Disconnected from invalid user shreyas 83.235.16.111 port 56108 [preauth]
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:47:02 np0005625204.localdomain podman[101411]: 2026-02-20 08:47:02.139482298 +0000 UTC m=+0.064768583 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, release=1766032510, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc.)
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: tmp-crun.tSGOav.mount: Deactivated successfully.
Feb 20 08:47:02 np0005625204.localdomain podman[101420]: 2026-02-20 08:47:02.181856998 +0000 UTC m=+0.098140965 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true)
Feb 20 08:47:02 np0005625204.localdomain podman[101410]: 2026-02-20 08:47:02.199958347 +0000 UTC m=+0.128616768 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:47:02 np0005625204.localdomain podman[101408]: 2026-02-20 08:47:02.201703862 +0000 UTC m=+0.130920890 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z)
Feb 20 08:47:02 np0005625204.localdomain podman[101409]: 2026-02-20 08:47:02.259112386 +0000 UTC m=+0.186816436 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, container_name=logrotate_crond, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 20 08:47:02 np0005625204.localdomain podman[101411]: 2026-02-20 08:47:02.279298031 +0000 UTC m=+0.204584386 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:47:02 np0005625204.localdomain podman[101408]: 2026-02-20 08:47:02.281362214 +0000 UTC m=+0.210579232 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:47:02 np0005625204.localdomain podman[101409]: 2026-02-20 08:47:02.292949773 +0000 UTC m=+0.220653833 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:47:02 np0005625204.localdomain podman[101420]: 2026-02-20 08:47:02.413770428 +0000 UTC m=+0.330054415 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, 
config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:47:02 np0005625204.localdomain podman[101410]: 2026-02-20 08:47:02.436814511 +0000 UTC m=+0.365472942 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, name=rhosp-rhel9/openstack-collectd)
Feb 20 08:47:02 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:47:05 np0005625204.localdomain sshd[36740]: Received disconnect from 192.168.122.100 port 46784:11: disconnected by user
Feb 20 08:47:05 np0005625204.localdomain sshd[36740]: Disconnected from user tripleo-admin 192.168.122.100 port 46784
Feb 20 08:47:05 np0005625204.localdomain sshd[36720]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 20 08:47:05 np0005625204.localdomain systemd[1]: session-29.scope: Deactivated successfully.
Feb 20 08:47:05 np0005625204.localdomain systemd[1]: session-29.scope: Consumed 7min 7.963s CPU time.
Feb 20 08:47:05 np0005625204.localdomain systemd-logind[759]: Session 29 logged out. Waiting for processes to exit.
Feb 20 08:47:05 np0005625204.localdomain systemd-logind[759]: Removed session 29.
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: tmp-crun.bhG1ie.mount: Deactivated successfully.
Feb 20 08:47:09 np0005625204.localdomain podman[101525]: 2026-02-20 08:47:09.173799194 +0000 UTC m=+0.105033239 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z)
Feb 20 08:47:09 np0005625204.localdomain podman[101525]: 2026-02-20 08:47:09.192936135 +0000 UTC m=+0.124170180 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, container_name=ovn_controller, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, 
build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 20 08:47:09 np0005625204.localdomain podman[101527]: 2026-02-20 08:47:09.224693947 +0000 UTC m=+0.149650358 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:47:09 np0005625204.localdomain podman[101527]: 2026-02-20 08:47:09.240150064 +0000 UTC m=+0.165106495 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public)
Feb 20 08:47:09 np0005625204.localdomain podman[101527]: unhealthy
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:47:09 np0005625204.localdomain podman[101526]: 2026-02-20 08:47:09.322908173 +0000 UTC m=+0.247035318 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:47:09 np0005625204.localdomain podman[101526]: 2026-02-20 08:47:09.332011945 +0000 UTC m=+0.256139130 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:47:09 np0005625204.localdomain podman[101528]: 2026-02-20 08:47:09.369440853 +0000 UTC m=+0.291460733 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible)
Feb 20 08:47:09 np0005625204.localdomain podman[101525]: unhealthy
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:47:09 np0005625204.localdomain podman[101528]: 2026-02-20 08:47:09.448345342 +0000 UTC m=+0.370365272 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5)
Feb 20 08:47:09 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:47:10 np0005625204.localdomain systemd[1]: tmp-crun.7Dg22G.mount: Deactivated successfully.
Feb 20 08:47:13 np0005625204.localdomain sshd[101606]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:47:13 np0005625204.localdomain sshd[101606]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Activating special unit Exit the Session...
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Removed slice User Background Tasks Slice.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped target Main User Target.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped target Basic System.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped target Paths.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped target Sockets.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped target Timers.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Closed D-Bus User Message Bus Socket.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Stopped Create User's Volatile Files and Directories.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Removed slice User Application Slice.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Reached target Shutdown.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Finished Exit the Session.
Feb 20 08:47:15 np0005625204.localdomain systemd[36724]: Reached target Exit the Session.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: user@1003.service: Consumed 4.783s CPU time, read 0B from disk, written 7.0K to disk.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 20 08:47:15 np0005625204.localdomain systemd[1]: user-1003.slice: Consumed 7min 12.779s CPU time.
Feb 20 08:47:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:47:17 np0005625204.localdomain systemd[1]: tmp-crun.c6Yt0C.mount: Deactivated successfully.
Feb 20 08:47:17 np0005625204.localdomain podman[101610]: 2026-02-20 08:47:17.148392681 +0000 UTC m=+0.086598998 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
container_name=nova_migration_target, distribution-scope=public, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 20 08:47:17 np0005625204.localdomain podman[101610]: 2026-02-20 08:47:17.538110771 +0000 UTC m=+0.476317118 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com)
Feb 20 08:47:17 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:47:24 np0005625204.localdomain sudo[101634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:47:24 np0005625204.localdomain sudo[101634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:47:24 np0005625204.localdomain sudo[101634]: pam_unix(sudo:session): session closed for user root
Feb 20 08:47:24 np0005625204.localdomain sudo[101649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:47:24 np0005625204.localdomain sudo[101649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:47:25 np0005625204.localdomain sudo[101649]: pam_unix(sudo:session): session closed for user root
Feb 20 08:47:26 np0005625204.localdomain sudo[101697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:47:26 np0005625204.localdomain sudo[101697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:47:26 np0005625204.localdomain sudo[101697]: pam_unix(sudo:session): session closed for user root
Feb 20 08:47:27 np0005625204.localdomain sshd[101712]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:47:27 np0005625204.localdomain sshd[101712]: Invalid user ubuntu from 45.148.10.240 port 36432
Feb 20 08:47:27 np0005625204.localdomain sshd[101712]: Connection closed by invalid user ubuntu 45.148.10.240 port 36432 [preauth]
Feb 20 08:47:30 np0005625204.localdomain sshd[101714]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:47:31 np0005625204.localdomain sshd[101714]: Received disconnect from 77.232.138.190 port 47990:11: Bye Bye [preauth]
Feb 20 08:47:31 np0005625204.localdomain sshd[101714]: Disconnected from authenticating user root 77.232.138.190 port 47990 [preauth]
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:47:33 np0005625204.localdomain podman[101716]: 2026-02-20 08:47:33.217253192 +0000 UTC m=+0.153180036 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, distribution-scope=public)
Feb 20 08:47:33 np0005625204.localdomain podman[101717]: 2026-02-20 08:47:33.183017874 +0000 UTC m=+0.116106850 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z)
Feb 20 08:47:33 np0005625204.localdomain podman[101718]: 2026-02-20 08:47:33.152732207 +0000 UTC m=+0.087064723 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Feb 20 08:47:33 np0005625204.localdomain podman[101719]: 2026-02-20 08:47:33.270281412 +0000 UTC m=+0.199865530 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510)
Feb 20 08:47:33 np0005625204.localdomain podman[101718]: 2026-02-20 08:47:33.283650595 +0000 UTC m=+0.217983131 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z)
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:47:33 np0005625204.localdomain podman[101719]: 2026-02-20 08:47:33.297930057 +0000 UTC m=+0.227514135 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5)
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:47:33 np0005625204.localdomain podman[101720]: 2026-02-20 08:47:33.236193268 +0000 UTC m=+0.158341066 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=metrics_qdr, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-qdrouterd-container)
Feb 20 08:47:33 np0005625204.localdomain podman[101717]: 2026-02-20 08:47:33.369343825 +0000 UTC m=+0.302432721 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:47:33 np0005625204.localdomain podman[101716]: 2026-02-20 08:47:33.38731314 +0000 UTC m=+0.323239904 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:47:33 np0005625204.localdomain podman[101720]: 2026-02-20 08:47:33.434978484 +0000 UTC m=+0.357126322 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=metrics_qdr, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible)
Feb 20 08:47:33 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:47:34 np0005625204.localdomain systemd[1]: tmp-crun.ZKZ5wt.mount: Deactivated successfully.
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: tmp-crun.KGwjfq.mount: Deactivated successfully.
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: tmp-crun.bPsMEX.mount: Deactivated successfully.
Feb 20 08:47:40 np0005625204.localdomain podman[101837]: 2026-02-20 08:47:40.219051932 +0000 UTC m=+0.150297798 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, tcib_managed=true)
Feb 20 08:47:40 np0005625204.localdomain podman[101837]: 2026-02-20 08:47:40.233275142 +0000 UTC m=+0.164520978 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, batch=17.1_20260112.1, container_name=iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3)
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:47:40 np0005625204.localdomain podman[101844]: 2026-02-20 08:47:40.322073258 +0000 UTC m=+0.246444471 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:47:40 np0005625204.localdomain podman[101838]: 2026-02-20 08:47:40.184788273 +0000 UTC m=+0.109744595 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, vcs-type=git)
Feb 20 08:47:40 np0005625204.localdomain podman[101844]: 2026-02-20 08:47:40.351964461 +0000 UTC m=+0.276335634 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:47:40 np0005625204.localdomain podman[101838]: 2026-02-20 08:47:40.37036393 +0000 UTC m=+0.295320192 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:47:40 np0005625204.localdomain podman[101838]: unhealthy
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:47:40 np0005625204.localdomain podman[101836]: 2026-02-20 08:47:40.370969579 +0000 UTC m=+0.307490968 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, release=1766032510, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:47:40 np0005625204.localdomain podman[101836]: 2026-02-20 08:47:40.457060731 +0000 UTC m=+0.393582120 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1)
Feb 20 08:47:40 np0005625204.localdomain podman[101836]: unhealthy
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:47:40 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:47:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:47:48 np0005625204.localdomain podman[101922]: 2026-02-20 08:47:48.144344153 +0000 UTC m=+0.081006646 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 20 08:47:48 np0005625204.localdomain podman[101922]: 2026-02-20 08:47:48.517014395 +0000 UTC m=+0.453676838 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, release=1766032510)
Feb 20 08:47:48 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:47:58 np0005625204.localdomain sshd[101945]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:47:58 np0005625204.localdomain sshd[101945]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: tmp-crun.kQ6miA.mount: Deactivated successfully.
Feb 20 08:48:04 np0005625204.localdomain podman[101949]: 2026-02-20 08:48:04.224813135 +0000 UTC m=+0.156436718 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, container_name=collectd, version=17.1.13, batch=17.1_20260112.1, description=Red 
Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:48:04 np0005625204.localdomain podman[101951]: 2026-02-20 08:48:04.180048031 +0000 UTC m=+0.105993128 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13)
Feb 20 08:48:04 np0005625204.localdomain podman[101948]: 2026-02-20 08:48:04.314805667 +0000 UTC m=+0.245176651 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat 
OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 20 08:48:04 np0005625204.localdomain podman[101949]: 2026-02-20 08:48:04.334228638 +0000 UTC m=+0.265852221 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 
collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 20 08:48:04 np0005625204.localdomain podman[101951]: 2026-02-20 08:48:04.361678807 +0000 UTC m=+0.287623934 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:48:04 np0005625204.localdomain podman[101947]: 2026-02-20 08:48:04.373832082 +0000 UTC m=+0.306262440 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, tcib_managed=true)
Feb 20 08:48:04 np0005625204.localdomain podman[101950]: 2026-02-20 08:48:04.418837224 +0000 UTC m=+0.341756298 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, vcs-type=git)
Feb 20 08:48:04 np0005625204.localdomain podman[101950]: 2026-02-20 08:48:04.442094074 +0000 UTC m=+0.365013178 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1)
Feb 20 08:48:04 np0005625204.localdomain podman[101948]: 2026-02-20 08:48:04.447244322 +0000 UTC m=+0.377615276 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64)
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:48:04 np0005625204.localdomain podman[101947]: 2026-02-20 08:48:04.493206074 +0000 UTC m=+0.425636472 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 
17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z)
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:48:04 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: tmp-crun.TvzD8b.mount: Deactivated successfully.
Feb 20 08:48:11 np0005625204.localdomain podman[102070]: 2026-02-20 08:48:11.171745661 +0000 UTC m=+0.100339354 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5)
Feb 20 08:48:11 np0005625204.localdomain podman[102070]: 2026-02-20 08:48:11.217170455 +0000 UTC m=+0.145764188 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_id=tripleo_step4, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:48:11 np0005625204.localdomain podman[102070]: unhealthy
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:48:11 np0005625204.localdomain podman[102069]: 2026-02-20 08:48:11.266624304 +0000 UTC m=+0.199490898 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 20 08:48:11 np0005625204.localdomain podman[102068]: 2026-02-20 08:48:11.221270502 +0000 UTC m=+0.158191572 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, container_name=ovn_controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:48:11 np0005625204.localdomain podman[102069]: 2026-02-20 08:48:11.304463804 +0000 UTC m=+0.237330438 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.k8s.description=Red 
Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:48:11 np0005625204.localdomain podman[102076]: 2026-02-20 08:48:11.322189242 +0000 UTC m=+0.244122179 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, config_id=tripleo_step5, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 08:48:11 np0005625204.localdomain podman[102068]: 2026-02-20 08:48:11.356352458 +0000 UTC m=+0.293273508 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:48:11 np0005625204.localdomain podman[102068]: unhealthy
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:48:11 np0005625204.localdomain podman[102076]: 2026-02-20 08:48:11.380230636 +0000 UTC m=+0.302163573 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=nova_compute, batch=17.1_20260112.1, config_id=tripleo_step5, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510)
Feb 20 08:48:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:48:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:48:19 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:48:19 np0005625204.localdomain recover_tripleo_nova_virtqemud[102154]: 63005
Feb 20 08:48:19 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:48:19 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:48:19 np0005625204.localdomain systemd[1]: tmp-crun.LTGRKy.mount: Deactivated successfully.
Feb 20 08:48:19 np0005625204.localdomain podman[102150]: 2026-02-20 08:48:19.159221556 +0000 UTC m=+0.097073223 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, 
url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1)
Feb 20 08:48:19 np0005625204.localdomain podman[102150]: 2026-02-20 08:48:19.566107275 +0000 UTC m=+0.503958852 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, vcs-type=git, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:48:19 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:48:26 np0005625204.localdomain sudo[102175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:48:26 np0005625204.localdomain sudo[102175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:48:26 np0005625204.localdomain sudo[102175]: pam_unix(sudo:session): session closed for user root
Feb 20 08:48:26 np0005625204.localdomain sudo[102190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:48:26 np0005625204.localdomain sudo[102190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:48:27 np0005625204.localdomain sudo[102190]: pam_unix(sudo:session): session closed for user root
Feb 20 08:48:28 np0005625204.localdomain sudo[102236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:48:28 np0005625204.localdomain sudo[102236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:48:28 np0005625204.localdomain sudo[102236]: pam_unix(sudo:session): session closed for user root
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:48:35 np0005625204.localdomain podman[102253]: 2026-02-20 08:48:35.159357706 +0000 UTC m=+0.086486276 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:48:35 np0005625204.localdomain podman[102253]: 2026-02-20 08:48:35.196071201 +0000 UTC m=+0.123199741 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, release=1766032510)
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:48:35 np0005625204.localdomain podman[102255]: 2026-02-20 08:48:35.2119186 +0000 UTC m=+0.134138588 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, container_name=metrics_qdr, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510)
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: tmp-crun.sBl2Jr.mount: Deactivated successfully.
Feb 20 08:48:35 np0005625204.localdomain podman[102254]: 2026-02-20 08:48:35.270823771 +0000 UTC m=+0.193631327 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:48:35 np0005625204.localdomain podman[102254]: 2026-02-20 08:48:35.311364755 +0000 UTC m=+0.234172391 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, url=https://www.redhat.com)
Feb 20 08:48:35 np0005625204.localdomain podman[102251]: 2026-02-20 08:48:35.319411024 +0000 UTC m=+0.247912767 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:48:35 np0005625204.localdomain podman[102252]: 2026-02-20 08:48:35.376385495 +0000 UTC m=+0.304146025 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-cron-container, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:48:35 np0005625204.localdomain podman[102251]: 2026-02-20 08:48:35.387566411 +0000 UTC m=+0.316068164 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:48:35 np0005625204.localdomain podman[102255]: 2026-02-20 08:48:35.402921356 +0000 UTC m=+0.325141404 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team)
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:48:35 np0005625204.localdomain podman[102252]: 2026-02-20 08:48:35.439770165 +0000 UTC m=+0.367530655 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, architecture=x86_64, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:48:35 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:48:42 np0005625204.localdomain sshd[102367]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:48:42 np0005625204.localdomain podman[102369]: 2026-02-20 08:48:42.151053482 +0000 UTC m=+0.086615969 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, tcib_managed=true)
Feb 20 08:48:42 np0005625204.localdomain podman[102369]: 2026-02-20 08:48:42.168004206 +0000 UTC m=+0.103566693 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, version=17.1.13, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:48:42 np0005625204.localdomain podman[102369]: unhealthy
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:48:42 np0005625204.localdomain sshd[102367]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:48:42 np0005625204.localdomain podman[102376]: 2026-02-20 08:48:42.214948547 +0000 UTC m=+0.138626386 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 08:48:42 np0005625204.localdomain podman[102371]: 2026-02-20 08:48:42.264978664 +0000 UTC m=+0.192835513 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:48:42 np0005625204.localdomain podman[102370]: 2026-02-20 08:48:42.299720278 +0000 UTC m=+0.231974052 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:48:42 np0005625204.localdomain podman[102371]: 2026-02-20 08:48:42.306017324 +0000 UTC m=+0.233874133 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1)
Feb 20 08:48:42 np0005625204.localdomain podman[102371]: unhealthy
Feb 20 08:48:42 np0005625204.localdomain podman[102376]: 2026-02-20 08:48:42.320548923 +0000 UTC m=+0.244226752 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, tcib_managed=true, batch=17.1_20260112.1, container_name=nova_compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:48:42 np0005625204.localdomain podman[102370]: 2026-02-20 08:48:42.359960271 +0000 UTC m=+0.292214035 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:48:42 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:48:43 np0005625204.localdomain systemd[1]: tmp-crun.H0xRD2.mount: Deactivated successfully.
Feb 20 08:48:49 np0005625204.localdomain sshd[102453]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:48:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:48:50 np0005625204.localdomain podman[102455]: 2026-02-20 08:48:50.142282883 +0000 UTC m=+0.081028716 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:48:50 np0005625204.localdomain podman[102455]: 2026-02-20 08:48:50.513106718 +0000 UTC m=+0.451852521 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack 
Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:48:50 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:48:50 np0005625204.localdomain sshd[102453]: Invalid user shreyas from 152.32.189.21 port 57940
Feb 20 08:48:50 np0005625204.localdomain sshd[102453]: Received disconnect from 152.32.189.21 port 57940:11: Bye Bye [preauth]
Feb 20 08:48:50 np0005625204.localdomain sshd[102453]: Disconnected from invalid user shreyas 152.32.189.21 port 57940 [preauth]
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:49:06 np0005625204.localdomain podman[102488]: 2026-02-20 08:49:06.150743216 +0000 UTC m=+0.080517700 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-type=git, container_name=metrics_qdr)
Feb 20 08:49:06 np0005625204.localdomain podman[102482]: 2026-02-20 08:49:06.208542373 +0000 UTC m=+0.140082482 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com)
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: tmp-crun.vqoOob.mount: Deactivated successfully.
Feb 20 08:49:06 np0005625204.localdomain podman[102480]: 2026-02-20 08:49:06.263781192 +0000 UTC m=+0.200889703 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, url=https://www.redhat.com, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=logrotate_crond, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 20 08:49:06 np0005625204.localdomain podman[102482]: 2026-02-20 08:49:06.316350477 +0000 UTC m=+0.247890576 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, version=17.1.13, 
com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:49:06 np0005625204.localdomain podman[102481]: 2026-02-20 08:49:06.323329922 +0000 UTC m=+0.258363328 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, container_name=collectd, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z)
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:49:06 np0005625204.localdomain podman[102481]: 2026-02-20 08:49:06.360048158 +0000 UTC m=+0.295081584 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510)
Feb 20 08:49:06 np0005625204.localdomain podman[102488]: 2026-02-20 08:49:06.371322726 +0000 UTC m=+0.301097190 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat 
OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13)
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:49:06 np0005625204.localdomain podman[102480]: 2026-02-20 08:49:06.395970258 +0000 UTC m=+0.333078769 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, release=1766032510, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:49:06 np0005625204.localdomain podman[102479]: 2026-02-20 08:49:06.364471865 +0000 UTC m=+0.303950329 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:49:06 np0005625204.localdomain podman[102479]: 2026-02-20 08:49:06.452060713 +0000 UTC m=+0.391539137 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 20 08:49:06 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: tmp-crun.1CsCQd.mount: Deactivated successfully.
Feb 20 08:49:13 np0005625204.localdomain podman[102600]: 2026-02-20 08:49:13.158183771 +0000 UTC m=+0.082615006 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:49:13 np0005625204.localdomain podman[102600]: 2026-02-20 08:49:13.210047124 +0000 UTC m=+0.134478369 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible)
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:49:13 np0005625204.localdomain podman[102599]: 2026-02-20 08:49:13.215548364 +0000 UTC m=+0.142642921 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:49:13 np0005625204.localdomain podman[102599]: 2026-02-20 08:49:13.299957344 +0000 UTC m=+0.227051971 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:49:13 np0005625204.localdomain podman[102599]: unhealthy
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:49:13 np0005625204.localdomain podman[102597]: 2026-02-20 08:49:13.315875726 +0000 UTC m=+0.246611425 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 20 08:49:13 np0005625204.localdomain podman[102597]: 2026-02-20 08:49:13.35640171 +0000 UTC m=+0.287137449 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 
17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible)
Feb 20 08:49:13 np0005625204.localdomain podman[102597]: unhealthy
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:49:13 np0005625204.localdomain podman[102598]: 2026-02-20 08:49:13.266869851 +0000 UTC m=+0.197983732 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public)
Feb 20 08:49:13 np0005625204.localdomain podman[102598]: 2026-02-20 08:49:13.402402401 +0000 UTC m=+0.333516242 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 
17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, release=1766032510, container_name=iscsid, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:49:13 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:49:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:49:21 np0005625204.localdomain podman[102682]: 2026-02-20 08:49:21.126013388 +0000 UTC m=+0.069045656 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:49:21 np0005625204.localdomain podman[102682]: 2026-02-20 08:49:21.489090354 +0000 UTC m=+0.432122572 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:49:21 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:49:25 np0005625204.localdomain sshd[102705]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:49:25 np0005625204.localdomain sshd[102705]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:49:28 np0005625204.localdomain sudo[102707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:49:28 np0005625204.localdomain sudo[102707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:49:28 np0005625204.localdomain sudo[102707]: pam_unix(sudo:session): session closed for user root
Feb 20 08:49:28 np0005625204.localdomain sudo[102722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:49:28 np0005625204.localdomain sudo[102722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:49:29 np0005625204.localdomain sudo[102722]: pam_unix(sudo:session): session closed for user root
Feb 20 08:49:30 np0005625204.localdomain sudo[102770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:49:30 np0005625204.localdomain sudo[102770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:49:30 np0005625204.localdomain sudo[102770]: pam_unix(sudo:session): session closed for user root
Feb 20 08:49:36 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:49:36 np0005625204.localdomain recover_tripleo_nova_virtqemud[102786]: 63005
Feb 20 08:49:36 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:49:36 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: tmp-crun.UBv4t6.mount: Deactivated successfully.
Feb 20 08:49:37 np0005625204.localdomain podman[102788]: 2026-02-20 08:49:37.173780039 +0000 UTC m=+0.096892587 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp-rhel9/openstack-cron)
Feb 20 08:49:37 np0005625204.localdomain podman[102788]: 2026-02-20 08:49:37.184969555 +0000 UTC m=+0.108082103 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public)
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: tmp-crun.Wcf7yq.mount: Deactivated successfully.
Feb 20 08:49:37 np0005625204.localdomain podman[102787]: 2026-02-20 08:49:37.27794772 +0000 UTC m=+0.202787522 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:07:30Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 20 08:49:37 np0005625204.localdomain podman[102789]: 2026-02-20 08:49:37.32648372 +0000 UTC m=+0.249950319 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, release=1766032510, container_name=collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:49:37 np0005625204.localdomain podman[102791]: 2026-02-20 08:49:37.245453055 +0000 UTC m=+0.162443174 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step1, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 20 08:49:37 np0005625204.localdomain podman[102789]: 2026-02-20 08:49:37.365005141 +0000 UTC m=+0.288471740 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:49:37 np0005625204.localdomain podman[102790]: 2026-02-20 08:49:37.379613023 +0000 UTC m=+0.300424349 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, 
name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:49:37 np0005625204.localdomain podman[102787]: 2026-02-20 08:49:37.402238792 +0000 UTC m=+0.327078624 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.)
Feb 20 08:49:37 np0005625204.localdomain podman[102790]: 2026-02-20 08:49:37.409988352 +0000 UTC m=+0.330799638 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:49:37 np0005625204.localdomain podman[102791]: 2026-02-20 08:49:37.453989003 +0000 UTC m=+0.370979032 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:49:37 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:49:41 np0005625204.localdomain sshd[102904]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:49:42 np0005625204.localdomain sshd[102904]: Invalid user ubuntu from 45.148.10.240 port 35976
Feb 20 08:49:42 np0005625204.localdomain sshd[102904]: Connection closed by invalid user ubuntu 45.148.10.240 port 35976 [preauth]
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: tmp-crun.xCE988.mount: Deactivated successfully.
Feb 20 08:49:44 np0005625204.localdomain podman[102907]: 2026-02-20 08:49:44.157005945 +0000 UTC m=+0.095779613 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:34:43Z, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 20 08:49:44 np0005625204.localdomain podman[102907]: 2026-02-20 08:49:44.194088472 +0000 UTC m=+0.132862060 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Feb 20 08:49:44 np0005625204.localdomain podman[102909]: 2026-02-20 08:49:44.205488184 +0000 UTC m=+0.138214585 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:49:44 np0005625204.localdomain podman[102906]: 2026-02-20 08:49:44.256751179 +0000 UTC m=+0.196326971 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, distribution-scope=public, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:49:44 np0005625204.localdomain podman[102906]: 2026-02-20 08:49:44.274026213 +0000 UTC m=+0.213602005 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, 
config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:49:44 np0005625204.localdomain podman[102906]: unhealthy
Feb 20 08:49:44 np0005625204.localdomain podman[102909]: 2026-02-20 08:49:44.28655399 +0000 UTC m=+0.219280441 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:49:44 np0005625204.localdomain podman[102908]: 2026-02-20 08:49:44.369812594 +0000 UTC m=+0.303560776 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Feb 20 08:49:44 np0005625204.localdomain podman[102908]: 2026-02-20 08:49:44.38812059 +0000 UTC m=+0.321868792 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:49:44 np0005625204.localdomain podman[102908]: unhealthy
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:49:44 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:49:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:49:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:49:49 np0005625204.localdomain sshd[102990]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:49:50 np0005625204.localdomain sshd[102990]: Invalid user common from 83.235.16.111 port 33858
Feb 20 08:49:50 np0005625204.localdomain sshd[102990]: Received disconnect from 83.235.16.111 port 33858:11: Bye Bye [preauth]
Feb 20 08:49:50 np0005625204.localdomain sshd[102990]: Disconnected from invalid user common 83.235.16.111 port 33858 [preauth]
Feb 20 08:49:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:49:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:49:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:49:52 np0005625204.localdomain podman[102992]: 2026-02-20 08:49:52.1407247 +0000 UTC m=+0.078660223 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:49:52 np0005625204.localdomain podman[102992]: 2026-02-20 08:49:52.514999702 +0000 UTC m=+0.452935215 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, version=17.1.13, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:49:52 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:50:08 np0005625204.localdomain podman[103017]: 2026-02-20 08:50:08.168010846 +0000 UTC m=+0.095091521 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13)
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: tmp-crun.KDiTRe.mount: Deactivated successfully.
Feb 20 08:50:08 np0005625204.localdomain podman[103015]: 2026-02-20 08:50:08.223232614 +0000 UTC m=+0.157031506 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 20 08:50:08 np0005625204.localdomain podman[103021]: 2026-02-20 08:50:08.280686661 +0000 UTC m=+0.200578473 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 20 08:50:08 np0005625204.localdomain podman[103021]: 2026-02-20 08:50:08.314153345 +0000 UTC m=+0.234045147 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64)
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:50:08 np0005625204.localdomain podman[103016]: 2026-02-20 08:50:08.339039274 +0000 UTC m=+0.269185584 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 20 08:50:08 np0005625204.localdomain podman[103017]: 2026-02-20 08:50:08.350489609 +0000 UTC m=+0.277570254 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:50:08 np0005625204.localdomain podman[103016]: 2026-02-20 08:50:08.378103952 +0000 UTC m=+0.308250222 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, vcs-type=git, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:50:08 np0005625204.localdomain podman[103025]: 2026-02-20 08:50:08.438910462 +0000 UTC m=+0.355422930 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack 
Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=)
Feb 20 08:50:08 np0005625204.localdomain podman[103015]: 2026-02-20 08:50:08.456262598 +0000 UTC m=+0.390061440 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, vcs-type=git, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:50:08 np0005625204.localdomain podman[103025]: 2026-02-20 08:50:08.641412123 +0000 UTC m=+0.557924671 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 20 08:50:08 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:50:10 np0005625204.localdomain sshd[103133]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:50:10 np0005625204.localdomain sshd[103133]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:50:15 np0005625204.localdomain podman[103140]: 2026-02-20 08:50:15.187265595 +0000 UTC m=+0.111589781 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, version=17.1.13, container_name=nova_compute, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5)
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: tmp-crun.6BynHI.mount: Deactivated successfully.
Feb 20 08:50:15 np0005625204.localdomain podman[103135]: 2026-02-20 08:50:15.241161562 +0000 UTC m=+0.177173969 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-type=git, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4)
Feb 20 08:50:15 np0005625204.localdomain podman[103135]: 2026-02-20 08:50:15.257980932 +0000 UTC m=+0.193993329 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, container_name=ovn_controller)
Feb 20 08:50:15 np0005625204.localdomain podman[103140]: 2026-02-20 08:50:15.266878767 +0000 UTC m=+0.191202923 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, container_name=nova_compute, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:50:15 np0005625204.localdomain podman[103135]: unhealthy
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:50:15 np0005625204.localdomain podman[103137]: 2026-02-20 08:50:15.354771365 +0000 UTC m=+0.283829597 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:50:15 np0005625204.localdomain podman[103136]: 2026-02-20 08:50:15.397276239 +0000 UTC m=+0.329516980 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 20 08:50:15 np0005625204.localdomain podman[103136]: 2026-02-20 08:50:15.412087416 +0000 UTC m=+0.344328187 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 
iscsid, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13)
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:50:15 np0005625204.localdomain podman[103137]: 2026-02-20 08:50:15.450429422 +0000 UTC m=+0.379487664 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 20 08:50:15 np0005625204.localdomain podman[103137]: unhealthy
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:50:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:50:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:50:23 np0005625204.localdomain podman[103220]: 2026-02-20 08:50:23.145560507 +0000 UTC m=+0.082544014 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, vendor=Red Hat, Inc., version=17.1.13)
Feb 20 08:50:23 np0005625204.localdomain podman[103220]: 2026-02-20 08:50:23.513000057 +0000 UTC m=+0.449983584 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:50:23 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:50:30 np0005625204.localdomain sudo[103244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:50:30 np0005625204.localdomain sudo[103244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:50:30 np0005625204.localdomain sudo[103244]: pam_unix(sudo:session): session closed for user root
Feb 20 08:50:30 np0005625204.localdomain sudo[103259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:50:30 np0005625204.localdomain sudo[103259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:50:30 np0005625204.localdomain sudo[103259]: pam_unix(sudo:session): session closed for user root
Feb 20 08:50:33 np0005625204.localdomain sudo[103306]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:50:33 np0005625204.localdomain sudo[103306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:50:34 np0005625204.localdomain sudo[103306]: pam_unix(sudo:session): session closed for user root
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: tmp-crun.4QZLik.mount: Deactivated successfully.
Feb 20 08:50:39 np0005625204.localdomain podman[103324]: 2026-02-20 08:50:39.162193574 +0000 UTC m=+0.084933836 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: tmp-crun.4ihxHi.mount: Deactivated successfully.
Feb 20 08:50:39 np0005625204.localdomain podman[103321]: 2026-02-20 08:50:39.233420927 +0000 UTC m=+0.160148162 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5)
Feb 20 08:50:39 np0005625204.localdomain podman[103324]: 2026-02-20 08:50:39.238010378 +0000 UTC m=+0.160750690 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:50:39 np0005625204.localdomain podman[103321]: 2026-02-20 08:50:39.29302958 +0000 UTC m=+0.219756775 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi)
Feb 20 08:50:39 np0005625204.localdomain podman[103329]: 2026-02-20 08:50:39.292164093 +0000 UTC m=+0.205391592 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, config_id=tripleo_step1)
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:50:39 np0005625204.localdomain podman[103323]: 2026-02-20 08:50:39.259263966 +0000 UTC m=+0.181836123 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:50:39 np0005625204.localdomain podman[103323]: 2026-02-20 08:50:39.339159316 +0000 UTC m=+0.261731433 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:50:39 np0005625204.localdomain podman[103322]: 2026-02-20 08:50:39.431934745 +0000 UTC m=+0.357152084 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat 
OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com)
Feb 20 08:50:39 np0005625204.localdomain podman[103322]: 2026-02-20 08:50:39.468417843 +0000 UTC m=+0.393635252 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:50:39 np0005625204.localdomain podman[103329]: 2026-02-20 08:50:39.511154084 +0000 UTC m=+0.424381583 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step1, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:50:39 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:50:46 np0005625204.localdomain podman[103448]: 2026-02-20 08:50:46.160457064 +0000 UTC m=+0.089188257 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: tmp-crun.satLMg.mount: Deactivated successfully.
Feb 20 08:50:46 np0005625204.localdomain podman[103448]: 2026-02-20 08:50:46.197459659 +0000 UTC m=+0.126190872 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:50:46 np0005625204.localdomain podman[103441]: 2026-02-20 08:50:46.216454575 +0000 UTC m=+0.149780661 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, vcs-type=git, build-date=2026-01-12T22:34:43Z, container_name=iscsid, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:50:46 np0005625204.localdomain podman[103440]: 2026-02-20 08:50:46.200473452 +0000 UTC m=+0.139002609 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.openshift.expose-services=, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 20 08:50:46 np0005625204.localdomain podman[103442]: 2026-02-20 08:50:46.253844742 +0000 UTC m=+0.183611168 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, 
distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 20 08:50:46 np0005625204.localdomain podman[103442]: 2026-02-20 08:50:46.270953721 +0000 UTC m=+0.200720107 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:50:46 np0005625204.localdomain podman[103442]: unhealthy
Feb 20 08:50:46 np0005625204.localdomain podman[103441]: 2026-02-20 08:50:46.278300568 +0000 UTC m=+0.211626664 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, container_name=iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public)
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:50:46 np0005625204.localdomain podman[103440]: 2026-02-20 08:50:46.333975189 +0000 UTC m=+0.272504306 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:50:46 np0005625204.localdomain podman[103440]: unhealthy
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:50:46 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:50:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:50:54 np0005625204.localdomain podman[103524]: 2026-02-20 08:50:54.14197936 +0000 UTC m=+0.081271484 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, architecture=x86_64)
Feb 20 08:50:54 np0005625204.localdomain podman[103524]: 2026-02-20 08:50:54.512280759 +0000 UTC m=+0.451572913 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_migration_target, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, 
com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:50:54 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:50:56 np0005625204.localdomain sshd[103548]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:50:56 np0005625204.localdomain sshd[103548]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:50:59 np0005625204.localdomain sshd[103550]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:51:00 np0005625204.localdomain sshd[103550]: Invalid user titu from 103.157.25.4 port 57894
Feb 20 08:51:01 np0005625204.localdomain sshd[103550]: Received disconnect from 103.157.25.4 port 57894:11: Bye Bye [preauth]
Feb 20 08:51:01 np0005625204.localdomain sshd[103550]: Disconnected from invalid user titu 103.157.25.4 port 57894 [preauth]
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:51:10 np0005625204.localdomain podman[103552]: 2026-02-20 08:51:10.172740641 +0000 UTC m=+0.104876634 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:51:10 np0005625204.localdomain podman[103552]: 2026-02-20 08:51:10.204837823 +0000 UTC m=+0.136973836 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4)
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:51:10 np0005625204.localdomain podman[103555]: 2026-02-20 08:51:10.22283851 +0000 UTC m=+0.149019008 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 20 08:51:10 np0005625204.localdomain podman[103555]: 2026-02-20 08:51:10.259863375 +0000 UTC m=+0.186043913 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc.)
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:51:10 np0005625204.localdomain podman[103556]: 2026-02-20 08:51:10.274284901 +0000 UTC m=+0.198027794 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:51:10 np0005625204.localdomain podman[103554]: 2026-02-20 08:51:10.315493025 +0000 UTC m=+0.244982286 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, 
managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:51:10 np0005625204.localdomain podman[103554]: 2026-02-20 08:51:10.326999251 +0000 UTC m=+0.256488512 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:51:10 np0005625204.localdomain podman[103553]: 2026-02-20 08:51:10.453851052 +0000 UTC m=+0.384535659 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, container_name=logrotate_crond, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:51:10 np0005625204.localdomain podman[103553]: 2026-02-20 08:51:10.489091482 +0000 UTC m=+0.419776059 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, container_name=logrotate_crond, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:51:10 np0005625204.localdomain podman[103556]: 2026-02-20 08:51:10.548412726 +0000 UTC m=+0.472155649 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1766032510, tcib_managed=true, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:51:10 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:51:17 np0005625204.localdomain podman[103671]: 2026-02-20 08:51:17.141723496 +0000 UTC m=+0.074567717 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z)
Feb 20 08:51:17 np0005625204.localdomain podman[103670]: 2026-02-20 08:51:17.15574956 +0000 UTC m=+0.088127266 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, version=17.1.13, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:36:40Z, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: tmp-crun.tMCoeB.mount: Deactivated successfully.
Feb 20 08:51:17 np0005625204.localdomain podman[103672]: 2026-02-20 08:51:17.21752475 +0000 UTC m=+0.147974626 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:51:17 np0005625204.localdomain podman[103670]: 2026-02-20 08:51:17.22561023 +0000 UTC m=+0.157987966 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, release=1766032510, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:51:17 np0005625204.localdomain podman[103670]: unhealthy
Feb 20 08:51:17 np0005625204.localdomain podman[103672]: 2026-02-20 08:51:17.237095895 +0000 UTC m=+0.167545771 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.13)
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:51:17 np0005625204.localdomain podman[103672]: unhealthy
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:51:17 np0005625204.localdomain podman[103673]: 2026-02-20 08:51:17.31459298 +0000 UTC m=+0.236186383 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, config_id=tripleo_step5, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:51:17 np0005625204.localdomain podman[103671]: 2026-02-20 08:51:17.332430562 +0000 UTC m=+0.265274793 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vcs-type=git, 
io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:51:17 np0005625204.localdomain podman[103673]: 2026-02-20 08:51:17.380282982 +0000 UTC m=+0.301876415 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:51:17 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:51:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:51:25 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:51:25 np0005625204.localdomain recover_tripleo_nova_virtqemud[103758]: 63005
Feb 20 08:51:25 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:51:25 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:51:25 np0005625204.localdomain podman[103756]: 2026-02-20 08:51:25.13724181 +0000 UTC m=+0.077047614 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, release=1766032510)
Feb 20 08:51:25 np0005625204.localdomain podman[103756]: 2026-02-20 08:51:25.506884148 +0000 UTC m=+0.446689942 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 20 08:51:25 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:51:34 np0005625204.localdomain sudo[103781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:51:34 np0005625204.localdomain sudo[103781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:51:34 np0005625204.localdomain sudo[103781]: pam_unix(sudo:session): session closed for user root
Feb 20 08:51:34 np0005625204.localdomain sudo[103796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 08:51:34 np0005625204.localdomain sudo[103796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:51:34 np0005625204.localdomain sudo[103796]: pam_unix(sudo:session): session closed for user root
Feb 20 08:51:34 np0005625204.localdomain sudo[103832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:51:34 np0005625204.localdomain sudo[103832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:51:34 np0005625204.localdomain sudo[103832]: pam_unix(sudo:session): session closed for user root
Feb 20 08:51:34 np0005625204.localdomain sudo[103847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:51:34 np0005625204.localdomain sudo[103847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:51:35 np0005625204.localdomain sudo[103847]: pam_unix(sudo:session): session closed for user root
Feb 20 08:51:36 np0005625204.localdomain sudo[103894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:51:36 np0005625204.localdomain sudo[103894]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:51:36 np0005625204.localdomain sudo[103894]: pam_unix(sudo:session): session closed for user root
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: tmp-crun.8M6QAu.mount: Deactivated successfully.
Feb 20 08:51:41 np0005625204.localdomain podman[103910]: 2026-02-20 08:51:41.162573384 +0000 UTC m=+0.101835949 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:51:41 np0005625204.localdomain podman[103910]: 2026-02-20 08:51:41.172873373 +0000 UTC m=+0.112135978 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, batch=17.1_20260112.1)
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:51:41 np0005625204.localdomain podman[103909]: 2026-02-20 08:51:41.216585184 +0000 UTC m=+0.154663963 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true)
Feb 20 08:51:41 np0005625204.localdomain podman[103911]: 2026-02-20 08:51:41.252817084 +0000 UTC m=+0.187121266 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:51:41 np0005625204.localdomain podman[103911]: 2026-02-20 08:51:41.268615603 +0000 UTC m=+0.202919815 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat 
OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, container_name=collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:51:41 np0005625204.localdomain podman[103912]: 2026-02-20 08:51:41.180140177 +0000 UTC m=+0.113001685 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, container_name=ceilometer_agent_compute, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible)
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:51:41 np0005625204.localdomain podman[103912]: 2026-02-20 08:51:41.315208974 +0000 UTC m=+0.248070532 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:51:41 np0005625204.localdomain podman[103914]: 2026-02-20 08:51:41.362096923 +0000 UTC m=+0.285182708 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com)
Feb 20 08:51:41 np0005625204.localdomain podman[103909]: 2026-02-20 08:51:41.401028386 +0000 UTC m=+0.339107115 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z)
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:51:41 np0005625204.localdomain podman[103914]: 2026-02-20 08:51:41.548116465 +0000 UTC m=+0.471202260 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, container_name=metrics_qdr, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1)
Feb 20 08:51:41 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:51:43 np0005625204.localdomain sshd[104027]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:51:43 np0005625204.localdomain sshd[104027]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:51:48 np0005625204.localdomain podman[104029]: 2026-02-20 08:51:48.153791409 +0000 UTC m=+0.090065375 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, tcib_managed=true, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:51:48 np0005625204.localdomain podman[104030]: 2026-02-20 08:51:48.211130821 +0000 UTC m=+0.143153107 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, 
name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 20 08:51:48 np0005625204.localdomain podman[104030]: 2026-02-20 08:51:48.220871833 +0000 UTC m=+0.152894079 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:51:48 np0005625204.localdomain podman[104031]: 2026-02-20 08:51:48.272985054 +0000 UTC m=+0.202989417 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, vendor=Red Hat, Inc.)
Feb 20 08:51:48 np0005625204.localdomain podman[104031]: 2026-02-20 08:51:48.294048435 +0000 UTC m=+0.224052848 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=ovn_metadata_agent)
Feb 20 08:51:48 np0005625204.localdomain podman[104031]: unhealthy
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:51:48 np0005625204.localdomain podman[104029]: 2026-02-20 08:51:48.328527171 +0000 UTC m=+0.264801137 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1)
Feb 20 08:51:48 np0005625204.localdomain podman[104029]: unhealthy
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:51:48 np0005625204.localdomain podman[104037]: 2026-02-20 08:51:48.367044862 +0000 UTC m=+0.292676640 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 20 08:51:48 np0005625204.localdomain podman[104037]: 2026-02-20 08:51:48.395942955 +0000 UTC m=+0.321574753 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:51:48 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:51:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:51:56 np0005625204.localdomain podman[104111]: 2026-02-20 08:51:56.138468888 +0000 UTC m=+0.075872447 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:51:56 np0005625204.localdomain podman[104111]: 2026-02-20 08:51:56.513959687 +0000 UTC m=+0.451363266 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=nova_migration_target, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:51:56 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:52:12 np0005625204.localdomain podman[104136]: 2026-02-20 08:52:12.157920494 +0000 UTC m=+0.093078049 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 20 08:52:12 np0005625204.localdomain podman[104136]: 2026-02-20 08:52:12.166945764 +0000 UTC m=+0.102103329 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, vcs-type=git, config_id=tripleo_step3, release=1766032510, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:52:12 np0005625204.localdomain podman[104134]: 2026-02-20 08:52:12.21083111 +0000 UTC m=+0.151121723 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:52:12 np0005625204.localdomain podman[104137]: 2026-02-20 08:52:12.264626284 +0000 UTC m=+0.192530784 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:52:12 np0005625204.localdomain podman[104137]: 2026-02-20 08:52:12.289935746 +0000 UTC m=+0.217840236 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com)
Feb 20 08:52:12 np0005625204.localdomain podman[104134]: 2026-02-20 08:52:12.289004547 +0000 UTC m=+0.229295120 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, release=1766032510, version=17.1.13, architecture=x86_64)
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:52:12 np0005625204.localdomain podman[104135]: 2026-02-20 08:52:12.373158309 +0000 UTC m=+0.309149269 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.openshift.expose-services=, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 20 08:52:12 np0005625204.localdomain podman[104135]: 2026-02-20 08:52:12.38193922 +0000 UTC m=+0.317930190 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1)
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:52:12 np0005625204.localdomain podman[104144]: 2026-02-20 08:52:12.482457148 +0000 UTC m=+0.407948813 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step1, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:52:12 np0005625204.localdomain podman[104144]: 2026-02-20 08:52:12.690015895 +0000 UTC m=+0.615507590 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5)
Feb 20 08:52:12 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: tmp-crun.mrsJU8.mount: Deactivated successfully.
Feb 20 08:52:19 np0005625204.localdomain podman[104261]: 2026-02-20 08:52:19.153912824 +0000 UTC m=+0.080918663 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, release=1766032510, vcs-type=git, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, config_id=tripleo_step5, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible)
Feb 20 08:52:19 np0005625204.localdomain podman[104253]: 2026-02-20 08:52:19.200307228 +0000 UTC m=+0.137373548 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 20 08:52:19 np0005625204.localdomain podman[104261]: 2026-02-20 08:52:19.230951805 +0000 UTC m=+0.157957644 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Feb 20 08:52:19 np0005625204.localdomain podman[104253]: 2026-02-20 08:52:19.240311305 +0000 UTC m=+0.177377625 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 
17.1 ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5)
Feb 20 08:52:19 np0005625204.localdomain podman[104253]: unhealthy
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:52:19 np0005625204.localdomain podman[104255]: 2026-02-20 08:52:19.264470702 +0000 UTC m=+0.194650229 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:52:19 np0005625204.localdomain podman[104255]: 2026-02-20 08:52:19.310146085 +0000 UTC m=+0.240325632 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 20 08:52:19 np0005625204.localdomain podman[104255]: unhealthy
Feb 20 08:52:19 np0005625204.localdomain podman[104254]: 2026-02-20 08:52:19.317170201 +0000 UTC m=+0.248492434 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:52:19 np0005625204.localdomain podman[104254]: 2026-02-20 08:52:19.35626135 +0000 UTC m=+0.287583563 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Feb 20 08:52:19 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:52:19 np0005625204.localdomain sshd[104338]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:52:20 np0005625204.localdomain systemd[1]: tmp-crun.hJgdgX.mount: Deactivated successfully.
Feb 20 08:52:21 np0005625204.localdomain sshd[104338]: Invalid user sol from 45.148.10.240 port 38672
Feb 20 08:52:21 np0005625204.localdomain sshd[104338]: Connection closed by invalid user sol 45.148.10.240 port 38672 [preauth]
Feb 20 08:52:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:52:27 np0005625204.localdomain systemd[1]: tmp-crun.cEhAdq.mount: Deactivated successfully.
Feb 20 08:52:27 np0005625204.localdomain podman[104341]: 2026-02-20 08:52:27.145811996 +0000 UTC m=+0.085234087 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, vcs-type=git)
Feb 20 08:52:27 np0005625204.localdomain podman[104341]: 2026-02-20 08:52:27.535654569 +0000 UTC m=+0.475076590 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 08:52:27 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:52:30 np0005625204.localdomain sshd[104364]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:52:30 np0005625204.localdomain sshd[104364]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:52:36 np0005625204.localdomain sudo[104366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:52:36 np0005625204.localdomain sudo[104366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:52:36 np0005625204.localdomain sudo[104366]: pam_unix(sudo:session): session closed for user root
Feb 20 08:52:36 np0005625204.localdomain sudo[104381]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:52:36 np0005625204.localdomain sudo[104381]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:52:36 np0005625204.localdomain sudo[104381]: pam_unix(sudo:session): session closed for user root
Feb 20 08:52:37 np0005625204.localdomain sudo[104429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:52:37 np0005625204.localdomain sudo[104429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:52:37 np0005625204.localdomain sudo[104429]: pam_unix(sudo:session): session closed for user root
Feb 20 08:52:39 np0005625204.localdomain sshd[104444]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:52:41 np0005625204.localdomain sshd[104444]: Received disconnect from 27.112.79.3 port 52396:11: Bye Bye [preauth]
Feb 20 08:52:41 np0005625204.localdomain sshd[104444]: Disconnected from authenticating user root 27.112.79.3 port 52396 [preauth]
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:52:43 np0005625204.localdomain podman[104446]: 2026-02-20 08:52:43.171733769 +0000 UTC m=+0.101513659 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 20 08:52:43 np0005625204.localdomain podman[104446]: 2026-02-20 08:52:43.203032647 +0000 UTC m=+0.132812487 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: tmp-crun.XAhXxQ.mount: Deactivated successfully.
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:52:43 np0005625204.localdomain podman[104449]: 2026-02-20 08:52:43.223664535 +0000 UTC m=+0.148895334 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, distribution-scope=public, version=17.1.13, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Feb 20 08:52:43 np0005625204.localdomain podman[104449]: 2026-02-20 08:52:43.259113372 +0000 UTC m=+0.184344211 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 20 08:52:43 np0005625204.localdomain podman[104448]: 2026-02-20 08:52:43.271117532 +0000 UTC m=+0.197519538 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 
17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.13, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:52:43 np0005625204.localdomain podman[104450]: 2026-02-20 08:52:43.325009388 +0000 UTC m=+0.248395941 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, container_name=metrics_qdr)
Feb 20 08:52:43 np0005625204.localdomain podman[104447]: 2026-02-20 08:52:43.376841921 +0000 UTC m=+0.303609328 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., tcib_managed=true, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:52:43 np0005625204.localdomain podman[104447]: 2026-02-20 08:52:43.394213328 +0000 UTC m=+0.320980745 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:52:43 np0005625204.localdomain podman[104448]: 2026-02-20 08:52:43.448015201 +0000 UTC m=+0.374417127 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:52:43 np0005625204.localdomain podman[104450]: 2026-02-20 08:52:43.527701835 +0000 UTC m=+0.451088368 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, container_name=metrics_qdr, tcib_managed=true, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 20 08:52:43 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:52:46 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:52:46 np0005625204.localdomain recover_tripleo_nova_virtqemud[104566]: 63005
Feb 20 08:52:46 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:52:46 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: tmp-crun.Qm3dry.mount: Deactivated successfully.
Feb 20 08:52:50 np0005625204.localdomain podman[104569]: 2026-02-20 08:52:50.146706559 +0000 UTC m=+0.078163058 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:52:50 np0005625204.localdomain podman[104567]: 2026-02-20 08:52:50.210743599 +0000 UTC m=+0.145768078 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Feb 20 08:52:50 np0005625204.localdomain podman[104569]: 2026-02-20 08:52:50.230941713 +0000 UTC m=+0.162398252 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 20 08:52:50 np0005625204.localdomain podman[104569]: unhealthy
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:52:50 np0005625204.localdomain podman[104570]: 2026-02-20 08:52:50.191163483 +0000 UTC m=+0.116822983 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, architecture=x86_64, url=https://www.redhat.com)
Feb 20 08:52:50 np0005625204.localdomain podman[104568]: 2026-02-20 08:52:50.250994113 +0000 UTC m=+0.186218618 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:52:50 np0005625204.localdomain podman[104568]: 2026-02-20 08:52:50.268129812 +0000 UTC m=+0.203354247 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, 
batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Feb 20 08:52:50 np0005625204.localdomain podman[104570]: 2026-02-20 08:52:50.280147105 +0000 UTC m=+0.205806585 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:52:50 np0005625204.localdomain podman[104567]: 2026-02-20 08:52:50.304241949 +0000 UTC m=+0.239266448 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:52:50 np0005625204.localdomain podman[104567]: unhealthy
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:52:50 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:52:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:52:58 np0005625204.localdomain systemd[1]: tmp-crun.U1pN5e.mount: Deactivated successfully.
Feb 20 08:52:58 np0005625204.localdomain podman[104646]: 2026-02-20 08:52:58.151809999 +0000 UTC m=+0.091562153 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 20 08:52:58 np0005625204.localdomain podman[104646]: 2026-02-20 08:52:58.527983069 +0000 UTC m=+0.467735253 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git)
Feb 20 08:52:58 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:53:01 np0005625204.localdomain sshd[104669]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:53:02 np0005625204.localdomain sshd[104669]: Invalid user httpd from 96.78.175.36 port 54176
Feb 20 08:53:02 np0005625204.localdomain sshd[104669]: Received disconnect from 96.78.175.36 port 54176:11: Bye Bye [preauth]
Feb 20 08:53:02 np0005625204.localdomain sshd[104669]: Disconnected from invalid user httpd 96.78.175.36 port 54176 [preauth]
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: tmp-crun.fdum35.mount: Deactivated successfully.
Feb 20 08:53:14 np0005625204.localdomain podman[104671]: 2026-02-20 08:53:14.14907752 +0000 UTC m=+0.084508264 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 20 08:53:14 np0005625204.localdomain podman[104673]: 2026-02-20 08:53:14.166151007 +0000 UTC m=+0.094715549 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, architecture=x86_64, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
container_name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible)
Feb 20 08:53:14 np0005625204.localdomain podman[104671]: 2026-02-20 08:53:14.177991033 +0000 UTC m=+0.113421787 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1)
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:53:14 np0005625204.localdomain podman[104673]: 2026-02-20 08:53:14.206001539 +0000 UTC m=+0.134566041 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 20 08:53:14 np0005625204.localdomain podman[104672]: 2026-02-20 08:53:14.215868695 +0000 UTC m=+0.147049718 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:53:14 np0005625204.localdomain podman[104672]: 2026-02-20 08:53:14.225899565 +0000 UTC m=+0.157080518 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:53:14 np0005625204.localdomain podman[104679]: 2026-02-20 08:53:14.286613762 +0000 UTC m=+0.211180560 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:53:14 np0005625204.localdomain podman[104679]: 2026-02-20 08:53:14.317974891 +0000 UTC m=+0.242541769 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:53:14 np0005625204.localdomain podman[104683]: 2026-02-20 08:53:14.331141049 +0000 UTC m=+0.252409715 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:53:14 np0005625204.localdomain podman[104683]: 2026-02-20 08:53:14.53107211 +0000 UTC m=+0.452340776 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-type=git, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:53:14 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:53:15 np0005625204.localdomain systemd[1]: tmp-crun.ULcXmc.mount: Deactivated successfully.
Feb 20 08:53:15 np0005625204.localdomain sshd[104788]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:53:15 np0005625204.localdomain sshd[104788]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: tmp-crun.aJmwPh.mount: Deactivated successfully.
Feb 20 08:53:21 np0005625204.localdomain podman[104790]: 2026-02-20 08:53:21.172908852 +0000 UTC m=+0.106234556 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:53:21 np0005625204.localdomain podman[104790]: 2026-02-20 08:53:21.189293488 +0000 UTC m=+0.122619242 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5)
Feb 20 08:53:21 np0005625204.localdomain podman[104790]: unhealthy
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: tmp-crun.CaygEn.mount: Deactivated successfully.
Feb 20 08:53:21 np0005625204.localdomain podman[104791]: 2026-02-20 08:53:21.269772027 +0000 UTC m=+0.201228163 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_id=tripleo_step3, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:53:21 np0005625204.localdomain podman[104791]: 2026-02-20 08:53:21.280108137 +0000 UTC m=+0.211564253 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, container_name=iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team)
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:53:21 np0005625204.localdomain podman[104792]: 2026-02-20 08:53:21.329034329 +0000 UTC m=+0.254870121 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:53:21 np0005625204.localdomain podman[104792]: 2026-02-20 08:53:21.346998665 +0000 UTC m=+0.272834447 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:53:21 np0005625204.localdomain podman[104792]: unhealthy
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:53:21 np0005625204.localdomain podman[104796]: 2026-02-20 08:53:21.39990848 +0000 UTC m=+0.322176292 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=)
Feb 20 08:53:21 np0005625204.localdomain podman[104796]: 2026-02-20 08:53:21.430876897 +0000 UTC m=+0.353144709 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, release=1766032510)
Feb 20 08:53:21 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:53:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:53:29 np0005625204.localdomain systemd[1]: tmp-crun.6F3v2d.mount: Deactivated successfully.
Feb 20 08:53:29 np0005625204.localdomain podman[104875]: 2026-02-20 08:53:29.147910851 +0000 UTC m=+0.086856356 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 20 08:53:29 np0005625204.localdomain podman[104875]: 2026-02-20 08:53:29.509512381 +0000 UTC m=+0.448457876 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 20 08:53:29 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:53:37 np0005625204.localdomain sudo[104899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:53:37 np0005625204.localdomain sudo[104899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:53:37 np0005625204.localdomain sudo[104899]: pam_unix(sudo:session): session closed for user root
Feb 20 08:53:37 np0005625204.localdomain sudo[104914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 08:53:37 np0005625204.localdomain sudo[104914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:53:38 np0005625204.localdomain podman[104998]: 2026-02-20 08:53:38.696951526 +0000 UTC m=+0.093682728 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, name=rhceph, version=7, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 08:53:38 np0005625204.localdomain podman[104998]: 2026-02-20 08:53:38.798026691 +0000 UTC m=+0.194757893 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 08:53:39 np0005625204.localdomain sudo[104914]: pam_unix(sudo:session): session closed for user root
Feb 20 08:53:39 np0005625204.localdomain sudo[105064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:53:39 np0005625204.localdomain sudo[105064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:53:39 np0005625204.localdomain sudo[105064]: pam_unix(sudo:session): session closed for user root
Feb 20 08:53:39 np0005625204.localdomain sudo[105079]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:53:39 np0005625204.localdomain sudo[105079]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:53:39 np0005625204.localdomain sudo[105079]: pam_unix(sudo:session): session closed for user root
Feb 20 08:53:40 np0005625204.localdomain sudo[105127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:53:40 np0005625204.localdomain sudo[105127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:53:40 np0005625204.localdomain sudo[105127]: pam_unix(sudo:session): session closed for user root
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:53:45 np0005625204.localdomain podman[105145]: 2026-02-20 08:53:45.155383824 +0000 UTC m=+0.085526925 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:53:45 np0005625204.localdomain podman[105143]: 2026-02-20 08:53:45.203508452 +0000 UTC m=+0.138640287 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, release=1766032510, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:53:45 np0005625204.localdomain podman[105145]: 2026-02-20 08:53:45.210140607 +0000 UTC m=+0.140283688 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ceilometer_agent_compute, release=1766032510, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team)
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:53:45 np0005625204.localdomain podman[105143]: 2026-02-20 08:53:45.244084437 +0000 UTC m=+0.179216252 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, tcib_managed=true, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1)
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:53:45 np0005625204.localdomain podman[105146]: 2026-02-20 08:53:45.265463708 +0000 UTC m=+0.192522044 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container)
Feb 20 08:53:45 np0005625204.localdomain podman[105142]: 2026-02-20 08:53:45.310102978 +0000 UTC m=+0.245065118 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64)
Feb 20 08:53:45 np0005625204.localdomain podman[105142]: 2026-02-20 08:53:45.343117338 +0000 UTC m=+0.278079468 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z)
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:53:45 np0005625204.localdomain podman[105144]: 2026-02-20 08:53:45.359508845 +0000 UTC m=+0.292905386 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:53:45 np0005625204.localdomain podman[105144]: 2026-02-20 08:53:45.370623909 +0000 UTC m=+0.304020430 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-collectd-container, container_name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:53:45 np0005625204.localdomain podman[105146]: 2026-02-20 08:53:45.496923564 +0000 UTC m=+0.423981880 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, tcib_managed=true, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 20 08:53:45 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: tmp-crun.7bV8AD.mount: Deactivated successfully.
Feb 20 08:53:52 np0005625204.localdomain podman[105265]: 2026-02-20 08:53:52.135566234 +0000 UTC m=+0.066452006 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 20 08:53:52 np0005625204.localdomain podman[105265]: 2026-02-20 08:53:52.16875176 +0000 UTC m=+0.099637562 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, release=1766032510, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: tmp-crun.vQCrBl.mount: Deactivated successfully.
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:53:52 np0005625204.localdomain podman[105266]: 2026-02-20 08:53:52.188505131 +0000 UTC m=+0.111723616 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:53:52 np0005625204.localdomain podman[105264]: 2026-02-20 08:53:52.216376483 +0000 UTC m=+0.146967576 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, version=17.1.13, release=1766032510, 
batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:53:52 np0005625204.localdomain podman[105267]: 2026-02-20 08:53:52.233545063 +0000 UTC m=+0.153832397 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, config_id=tripleo_step5, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc.)
Feb 20 08:53:52 np0005625204.localdomain podman[105264]: 2026-02-20 08:53:52.254562413 +0000 UTC m=+0.185153506 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=)
Feb 20 08:53:52 np0005625204.localdomain podman[105264]: unhealthy
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:53:52 np0005625204.localdomain podman[105267]: 2026-02-20 08:53:52.287931794 +0000 UTC m=+0.208219118 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:53:52 np0005625204.localdomain podman[105266]: 2026-02-20 08:53:52.308263784 +0000 UTC m=+0.231482309 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, container_name=ovn_metadata_agent, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:53:52 np0005625204.localdomain podman[105266]: unhealthy
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:53:52 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:54:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:54:00 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:54:00 np0005625204.localdomain recover_tripleo_nova_virtqemud[105348]: 63005
Feb 20 08:54:00 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:54:00 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:54:00 np0005625204.localdomain systemd[1]: tmp-crun.kGfWlc.mount: Deactivated successfully.
Feb 20 08:54:00 np0005625204.localdomain podman[105346]: 2026-02-20 08:54:00.150193658 +0000 UTC m=+0.085255718 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container)
Feb 20 08:54:00 np0005625204.localdomain sshd[105368]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:54:00 np0005625204.localdomain sshd[105368]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:54:00 np0005625204.localdomain podman[105346]: 2026-02-20 08:54:00.487929539 +0000 UTC m=+0.422991609 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:54:00 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:54:16 np0005625204.localdomain podman[105372]: 2026-02-20 08:54:16.157001602 +0000 UTC m=+0.093727979 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com)
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: tmp-crun.niMvDW.mount: Deactivated successfully.
Feb 20 08:54:16 np0005625204.localdomain podman[105385]: 2026-02-20 08:54:16.192144738 +0000 UTC m=+0.110702293 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:54:16 np0005625204.localdomain podman[105372]: 2026-02-20 08:54:16.198399672 +0000 UTC m=+0.135126119 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:54:16 np0005625204.localdomain podman[105382]: 2026-02-20 08:54:16.222016822 +0000 UTC m=+0.148145401 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:54:16 np0005625204.localdomain podman[105382]: 2026-02-20 08:54:16.250236415 +0000 UTC m=+0.176364944 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, version=17.1.13, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=)
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:54:16 np0005625204.localdomain podman[105373]: 2026-02-20 08:54:16.286800145 +0000 UTC m=+0.217052401 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 
collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, container_name=collectd, config_id=tripleo_step3, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:54:16 np0005625204.localdomain podman[105371]: 2026-02-20 08:54:16.317794213 +0000 UTC m=+0.252822127 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:54:16 np0005625204.localdomain podman[105371]: 2026-02-20 08:54:16.354068685 +0000 UTC m=+0.289096649 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:54:16 np0005625204.localdomain podman[105373]: 2026-02-20 08:54:16.397070454 +0000 UTC m=+0.327322660 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:54:16 np0005625204.localdomain podman[105385]: 2026-02-20 08:54:16.399885861 +0000 UTC m=+0.318443446 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd)
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:54:16 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:54:23 np0005625204.localdomain podman[105492]: 2026-02-20 08:54:23.151233567 +0000 UTC m=+0.082326886 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git)
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: tmp-crun.Q5rnbw.mount: Deactivated successfully.
Feb 20 08:54:23 np0005625204.localdomain podman[105494]: 2026-02-20 08:54:23.211584723 +0000 UTC m=+0.138148222 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1)
Feb 20 08:54:23 np0005625204.localdomain podman[105492]: 2026-02-20 08:54:23.218456875 +0000 UTC m=+0.149550244 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 20 08:54:23 np0005625204.localdomain podman[105492]: unhealthy
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:54:23 np0005625204.localdomain podman[105494]: 2026-02-20 08:54:23.230383405 +0000 UTC m=+0.156946904 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, 
release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 20 08:54:23 np0005625204.localdomain podman[105494]: unhealthy
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:54:23 np0005625204.localdomain podman[105493]: 2026-02-20 08:54:23.315462035 +0000 UTC m=+0.245186801 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc.)
Feb 20 08:54:23 np0005625204.localdomain podman[105495]: 2026-02-20 08:54:23.365339267 +0000 UTC m=+0.287205021 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_id=tripleo_step5, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 
'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:54:23 np0005625204.localdomain podman[105493]: 2026-02-20 08:54:23.378260197 +0000 UTC m=+0.307984953 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, architecture=x86_64, release=1766032510, distribution-scope=public, vcs-type=git, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com)
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:54:23 np0005625204.localdomain podman[105495]: 2026-02-20 08:54:23.398053489 +0000 UTC m=+0.319919223 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, config_id=tripleo_step5, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:54:23 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:54:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:54:31 np0005625204.localdomain podman[105579]: 2026-02-20 08:54:31.154375436 +0000 UTC m=+0.089111006 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vcs-type=git, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:54:31 np0005625204.localdomain podman[105579]: 2026-02-20 08:54:31.524610472 +0000 UTC m=+0.459346082 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, config_id=tripleo_step4, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:54:31 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:54:40 np0005625204.localdomain sudo[105602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:54:40 np0005625204.localdomain sudo[105602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:54:40 np0005625204.localdomain sudo[105602]: pam_unix(sudo:session): session closed for user root
Feb 20 08:54:40 np0005625204.localdomain sudo[105617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:54:40 np0005625204.localdomain sudo[105617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:54:41 np0005625204.localdomain sudo[105617]: pam_unix(sudo:session): session closed for user root
Feb 20 08:54:42 np0005625204.localdomain sudo[105665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:54:42 np0005625204.localdomain sudo[105665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:54:42 np0005625204.localdomain sudo[105665]: pam_unix(sudo:session): session closed for user root
Feb 20 08:54:43 np0005625204.localdomain sshd[105680]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:54:43 np0005625204.localdomain sshd[105680]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:54:47 np0005625204.localdomain podman[105683]: 2026-02-20 08:54:47.170660993 +0000 UTC m=+0.106957269 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:54:47 np0005625204.localdomain podman[105683]: 2026-02-20 08:54:47.213872858 +0000 UTC m=+0.150169174 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, vcs-type=git, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, 
url=https://www.redhat.com, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:54:47 np0005625204.localdomain podman[105686]: 2026-02-20 08:54:47.216352075 +0000 UTC m=+0.143244330 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:54:47 np0005625204.localdomain podman[105682]: 2026-02-20 08:54:47.267108524 +0000 UTC m=+0.202880323 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1766032510, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com)
Feb 20 08:54:47 np0005625204.localdomain podman[105684]: 2026-02-20 08:54:47.31451054 +0000 UTC m=+0.250148605 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:54:47 np0005625204.localdomain podman[105684]: 2026-02-20 08:54:47.325854361 +0000 UTC m=+0.261492436 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, version=17.1.13, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:54:47 np0005625204.localdomain podman[105685]: 2026-02-20 08:54:47.373200145 +0000 UTC m=+0.303057601 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:54:47 np0005625204.localdomain podman[105682]: 2026-02-20 08:54:47.401282833 +0000 UTC m=+0.337054572 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, version=17.1.13, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:54:47 np0005625204.localdomain podman[105686]: 2026-02-20 08:54:47.434013525 +0000 UTC m=+0.360905700 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, release=1766032510, batch=17.1_20260112.1, container_name=metrics_qdr, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:54:47 np0005625204.localdomain podman[105685]: 2026-02-20 08:54:47.453841708 +0000 UTC m=+0.383699204 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 20 08:54:47 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: tmp-crun.XGy0W9.mount: Deactivated successfully.
Feb 20 08:54:54 np0005625204.localdomain podman[105801]: 2026-02-20 08:54:54.168014352 +0000 UTC m=+0.097561806 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, 
config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc.)
Feb 20 08:54:54 np0005625204.localdomain podman[105802]: 2026-02-20 08:54:54.199749194 +0000 UTC m=+0.127374189 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64)
Feb 20 08:54:54 np0005625204.localdomain podman[105802]: 2026-02-20 08:54:54.211985642 +0000 UTC m=+0.139610617 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true)
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:54:54 np0005625204.localdomain podman[105801]: 2026-02-20 08:54:54.249375628 +0000 UTC m=+0.178923042 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:54:54 np0005625204.localdomain podman[105801]: unhealthy
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:54:54 np0005625204.localdomain podman[105804]: 2026-02-20 08:54:54.301790079 +0000 UTC m=+0.226376891 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:54:54 np0005625204.localdomain podman[105803]: 2026-02-20 08:54:54.255678874 +0000 UTC m=+0.180329187 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:54:54 np0005625204.localdomain podman[105804]: 2026-02-20 08:54:54.329080673 +0000 UTC m=+0.253667495 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step5, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Deactivated successfully.
Feb 20 08:54:54 np0005625204.localdomain podman[105803]: 2026-02-20 08:54:54.386333712 +0000 UTC m=+0.310984085 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:54:54 np0005625204.localdomain podman[105803]: unhealthy
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:54:54 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:54:56 np0005625204.localdomain sshd[105884]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:54:57 np0005625204.localdomain sshd[105884]: Invalid user solana from 45.148.10.240 port 58746
Feb 20 08:54:57 np0005625204.localdomain sshd[105884]: Connection closed by invalid user solana 45.148.10.240 port 58746 [preauth]
Feb 20 08:55:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:55:02 np0005625204.localdomain systemd[1]: tmp-crun.j9gksB.mount: Deactivated successfully.
Feb 20 08:55:02 np0005625204.localdomain podman[105886]: 2026-02-20 08:55:02.137140209 +0000 UTC m=+0.076238548 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-type=git, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:55:02 np0005625204.localdomain podman[105886]: 2026-02-20 08:55:02.482786996 +0000 UTC m=+0.421885345 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, release=1766032510, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5)
Feb 20 08:55:02 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:55:18 np0005625204.localdomain podman[105911]: 2026-02-20 08:55:18.146726759 +0000 UTC m=+0.084855975 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510)
Feb 20 08:55:18 np0005625204.localdomain podman[105910]: 2026-02-20 08:55:18.199972995 +0000 UTC m=+0.136264444 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-cron, container_name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:55:18 np0005625204.localdomain podman[105910]: 2026-02-20 08:55:18.206842127 +0000 UTC m=+0.143133606 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:55:18 np0005625204.localdomain podman[105909]: 2026-02-20 08:55:18.262958382 +0000 UTC m=+0.200528120 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, 
version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, config_id=tripleo_step4, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:55:18 np0005625204.localdomain podman[105912]: 2026-02-20 08:55:18.319248113 +0000 UTC m=+0.249564628 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 20 08:55:18 np0005625204.localdomain podman[105913]: 2026-02-20 08:55:18.371041274 +0000 UTC m=+0.296196539 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, 
tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 20 08:55:18 np0005625204.localdomain podman[105912]: 2026-02-20 08:55:18.380214347 +0000 UTC m=+0.310530822 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 20 08:55:18 np0005625204.localdomain podman[105909]: 2026-02-20 08:55:18.393578761 +0000 UTC m=+0.331148489 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64)
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:55:18 np0005625204.localdomain podman[105911]: 2026-02-20 08:55:18.433196245 +0000 UTC m=+0.371325521 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:55:18 np0005625204.localdomain podman[105913]: 2026-02-20 08:55:18.569739747 +0000 UTC m=+0.494894982 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, 
build-date=2026-01-12T22:10:14Z, vcs-type=git, config_id=tripleo_step1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:55:18 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:55:25 np0005625204.localdomain podman[106032]: 2026-02-20 08:55:25.16029535 +0000 UTC m=+0.085226216 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:55:25 np0005625204.localdomain podman[106032]: 2026-02-20 08:55:25.174400777 +0000 UTC m=+0.099331593 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:55:25 np0005625204.localdomain podman[106032]: unhealthy
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:55:25 np0005625204.localdomain sshd[106080]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:55:25 np0005625204.localdomain podman[106033]: 2026-02-20 08:55:25.259076334 +0000 UTC m=+0.183150484 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 20 08:55:25 np0005625204.localdomain podman[106031]: 2026-02-20 08:55:25.228230501 +0000 UTC m=+0.157471680 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, version=17.1.13, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:55:25 np0005625204.localdomain podman[106033]: 2026-02-20 08:55:25.307258104 +0000 UTC m=+0.231332264 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, config_id=tripleo_step5, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:55:25 np0005625204.localdomain podman[106033]: unhealthy
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:55:25 np0005625204.localdomain podman[106030]: 2026-02-20 08:55:25.283024914 +0000 UTC m=+0.215818983 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5)
Feb 20 08:55:25 np0005625204.localdomain podman[106030]: 2026-02-20 08:55:25.362245494 +0000 UTC m=+0.295039563 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, config_id=tripleo_step4, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, 
build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 20 08:55:25 np0005625204.localdomain podman[106030]: unhealthy
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:55:25 np0005625204.localdomain sshd[106080]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:55:25 np0005625204.localdomain podman[106031]: 2026-02-20 08:55:25.415399058 +0000 UTC m=+0.344640227 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, tcib_managed=true)
Feb 20 08:55:25 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:55:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:55:33 np0005625204.localdomain podman[106113]: 2026-02-20 08:55:33.146729751 +0000 UTC m=+0.082830382 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:55:33 np0005625204.localdomain podman[106113]: 2026-02-20 08:55:33.512068847 +0000 UTC m=+0.448169488 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, container_name=nova_migration_target, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 20 08:55:33 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:55:42 np0005625204.localdomain sudo[106137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:55:42 np0005625204.localdomain sudo[106137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:55:42 np0005625204.localdomain sudo[106137]: pam_unix(sudo:session): session closed for user root
Feb 20 08:55:42 np0005625204.localdomain sudo[106152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:55:42 np0005625204.localdomain sudo[106152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:55:43 np0005625204.localdomain sudo[106152]: pam_unix(sudo:session): session closed for user root
Feb 20 08:55:43 np0005625204.localdomain sudo[106200]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:55:43 np0005625204.localdomain sudo[106200]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:55:43 np0005625204.localdomain sudo[106200]: pam_unix(sudo:session): session closed for user root
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:55:49 np0005625204.localdomain recover_tripleo_nova_virtqemud[106250]: 63005
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:55:49 np0005625204.localdomain podman[106215]: 2026-02-20 08:55:49.154945298 +0000 UTC m=+0.089395875 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible)
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: tmp-crun.VcPrvl.mount: Deactivated successfully.
Feb 20 08:55:49 np0005625204.localdomain podman[106217]: 2026-02-20 08:55:49.174072999 +0000 UTC m=+0.101708326 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step3)
Feb 20 08:55:49 np0005625204.localdomain podman[106215]: 2026-02-20 08:55:49.2090573 +0000 UTC m=+0.143507877 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, version=17.1.13, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team)
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:55:49 np0005625204.localdomain podman[106229]: 2026-02-20 08:55:49.228726379 +0000 UTC m=+0.149608217 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=metrics_qdr, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, 
release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64)
Feb 20 08:55:49 np0005625204.localdomain podman[106217]: 2026-02-20 08:55:49.236332934 +0000 UTC m=+0.163968291 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, container_name=collectd, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:55:49 np0005625204.localdomain podman[106223]: 2026-02-20 08:55:49.273210814 +0000 UTC m=+0.194130013 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 
'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Feb 20 08:55:49 np0005625204.localdomain podman[106216]: 2026-02-20 08:55:49.316155521 +0000 UTC m=+0.246974796 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container)
Feb 20 08:55:49 np0005625204.localdomain podman[106216]: 2026-02-20 08:55:49.323315123 +0000 UTC m=+0.254134408 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, 
tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:55:49 np0005625204.localdomain podman[106223]: 2026-02-20 08:55:49.349980617 +0000 UTC m=+0.270899826 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:55:49 np0005625204.localdomain podman[106229]: 2026-02-20 08:55:49.444198281 +0000 UTC m=+0.365080119 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, config_id=tripleo_step1, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container)
Feb 20 08:55:49 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:55:56 np0005625204.localdomain podman[106341]: 2026-02-20 08:55:56.169744508 +0000 UTC m=+0.088697184 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, version=17.1.13, container_name=nova_compute, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:55:56 np0005625204.localdomain podman[106341]: 2026-02-20 08:55:56.198120795 +0000 UTC m=+0.117073241 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc.)
Feb 20 08:55:56 np0005625204.localdomain podman[106341]: unhealthy
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:55:56 np0005625204.localdomain podman[106339]: 2026-02-20 08:55:56.214965266 +0000 UTC m=+0.140141964 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:55:56 np0005625204.localdomain podman[106338]: 2026-02-20 08:55:56.254253461 +0000 UTC m=+0.182410271 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, container_name=ovn_controller, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com)
Feb 20 08:55:56 np0005625204.localdomain podman[106340]: 2026-02-20 08:55:56.273285509 +0000 UTC m=+0.195556547 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z)
Feb 20 08:55:56 np0005625204.localdomain podman[106339]: 2026-02-20 08:55:56.27945215 +0000 UTC m=+0.204628838 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, container_name=iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:55:56 np0005625204.localdomain podman[106340]: 2026-02-20 08:55:56.289114468 +0000 UTC m=+0.211385506 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:55:56 np0005625204.localdomain podman[106340]: unhealthy
Feb 20 08:55:56 np0005625204.localdomain podman[106338]: 2026-02-20 08:55:56.300201381 +0000 UTC m=+0.228358191 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-type=git, container_name=ovn_controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:55:56 np0005625204.localdomain podman[106338]: unhealthy
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:55:56 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:55:56 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45345 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D019F0000000001030307) 
Feb 20 08:55:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45346 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D05A90000000001030307) 
Feb 20 08:55:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45347 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D0DA80000000001030307) 
Feb 20 08:56:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20220 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D11840000000001030307) 
Feb 20 08:56:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13616 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D11E00000000001030307) 
Feb 20 08:56:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20221 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D15A80000000001030307) 
Feb 20 08:56:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13617 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D15E80000000001030307) 
Feb 20 08:56:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45348 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D1D680000000001030307) 
Feb 20 08:56:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20222 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D1DA80000000001030307) 
Feb 20 08:56:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13618 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D1DE90000000001030307) 
Feb 20 08:56:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:56:04 np0005625204.localdomain systemd[1]: tmp-crun.khjE9P.mount: Deactivated successfully.
Feb 20 08:56:04 np0005625204.localdomain podman[106417]: 2026-02-20 08:56:04.161365981 +0000 UTC m=+0.098077473 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, version=17.1.13)
Feb 20 08:56:04 np0005625204.localdomain podman[106417]: 2026-02-20 08:56:04.561346256 +0000 UTC m=+0.498057708 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=nova_migration_target, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:56:04 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:56:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12485 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D27A40000000001030307) 
Feb 20 08:56:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12486 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D2BA80000000001030307) 
Feb 20 08:56:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20223 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D2D680000000001030307) 
Feb 20 08:56:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13619 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D2DA80000000001030307) 
Feb 20 08:56:07 np0005625204.localdomain sshd[106440]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:08 np0005625204.localdomain sshd[106440]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:56:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12487 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D33A80000000001030307) 
Feb 20 08:56:09 np0005625204.localdomain sshd[106442]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:10 np0005625204.localdomain sshd[106442]: Invalid user ubuntu from 96.78.175.36 port 33570
Feb 20 08:56:10 np0005625204.localdomain sshd[106442]: Received disconnect from 96.78.175.36 port 33570:11: Bye Bye [preauth]
Feb 20 08:56:10 np0005625204.localdomain sshd[106442]: Disconnected from invalid user ubuntu 96.78.175.36 port 33570 [preauth]
Feb 20 08:56:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45349 DF PROTO=TCP SPT=44390 DPT=9105 SEQ=917478032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D3D680000000001030307) 
Feb 20 08:56:13 np0005625204.localdomain sshd[106444]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:13 np0005625204.localdomain sshd[106444]: Accepted publickey for zuul from 192.168.122.31 port 34282 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 08:56:13 np0005625204.localdomain systemd-logind[759]: New session 37 of user zuul.
Feb 20 08:56:13 np0005625204.localdomain systemd[1]: Started Session 37 of User zuul.
Feb 20 08:56:13 np0005625204.localdomain sshd[106444]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 08:56:13 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12488 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D43680000000001030307) 
Feb 20 08:56:13 np0005625204.localdomain sshd[106480]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:13 np0005625204.localdomain sudo[106539]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmzgfnywekcbtxyikvszzavoxmfrwqsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577773.3369675-23-224487341103662/AnsiballZ_stat.py
Feb 20 08:56:13 np0005625204.localdomain sudo[106539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:14 np0005625204.localdomain python3.9[106541]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 08:56:14 np0005625204.localdomain sudo[106539]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:14 np0005625204.localdomain sudo[106633]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agtgxaoifpyxjejgvlmjmkaradhvbylf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577774.2265763-60-153708359995854/AnsiballZ_command.py
Feb 20 08:56:14 np0005625204.localdomain sudo[106633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:14 np0005625204.localdomain python3.9[106635]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 08:56:14 np0005625204.localdomain sudo[106633]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:15 np0005625204.localdomain sudo[106726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yswycguzmqsmyycpyaxvzmlnkocamrtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577775.061238-83-255390044120328/AnsiballZ_stat.py
Feb 20 08:56:15 np0005625204.localdomain sudo[106726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:15 np0005625204.localdomain python3.9[106728]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 08:56:15 np0005625204.localdomain sudo[106726]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:15 np0005625204.localdomain sudo[106820]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjevxbgsvtivpknwrjtnvvscqznopdih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577775.6929667-108-5180951376272/AnsiballZ_command.py
Feb 20 08:56:15 np0005625204.localdomain sudo[106820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20224 DF PROTO=TCP SPT=55548 DPT=9102 SEQ=1718216173 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D4D680000000001030307) 
Feb 20 08:56:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13620 DF PROTO=TCP SPT=54380 DPT=9100 SEQ=2445342985 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D4D690000000001030307) 
Feb 20 08:56:16 np0005625204.localdomain python3.9[106822]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 08:56:16 np0005625204.localdomain sshd[106480]: Invalid user titu from 103.157.25.4 port 36428
Feb 20 08:56:16 np0005625204.localdomain sudo[106820]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:16 np0005625204.localdomain sudo[106913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzxehdurttklylwcwrmewgkocjvydtue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577776.4302316-134-189272042982529/AnsiballZ_command.py
Feb 20 08:56:16 np0005625204.localdomain sudo[106913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:16 np0005625204.localdomain python3.9[106915]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 08:56:16 np0005625204.localdomain sudo[106913]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:17 np0005625204.localdomain sshd[106480]: Received disconnect from 103.157.25.4 port 36428:11: Bye Bye [preauth]
Feb 20 08:56:17 np0005625204.localdomain sshd[106480]: Disconnected from invalid user titu 103.157.25.4 port 36428 [preauth]
Feb 20 08:56:17 np0005625204.localdomain python3.9[107006]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 20 08:56:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58880 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D542A0000000001030307) 
Feb 20 08:56:18 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58881 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D58280000000001030307) 
Feb 20 08:56:19 np0005625204.localdomain python3.9[107096]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 08:56:19 np0005625204.localdomain python3.9[107188]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: tmp-crun.b04IBC.mount: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain podman[107205]: 2026-02-20 08:56:20.17130425 +0000 UTC m=+0.094560325 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, release=1766032510, io.buildah.version=1.41.5, 
com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: tmp-crun.I0wQVC.mount: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain podman[107205]: 2026-02-20 08:56:20.208805349 +0000 UTC m=+0.132061444 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., container_name=collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain podman[107204]: 2026-02-20 08:56:20.259795106 +0000 UTC m=+0.188392836 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible)
Feb 20 08:56:20 np0005625204.localdomain podman[107203]: 2026-02-20 08:56:20.2207896 +0000 UTC m=+0.148284156 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, 
io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 20 08:56:20 np0005625204.localdomain podman[107203]: 2026-02-20 08:56:20.30523792 +0000 UTC m=+0.232732516 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, vcs-type=git, config_id=tripleo_step4)
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain podman[107213]: 2026-02-20 08:56:20.325605441 +0000 UTC m=+0.243397107 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_id=tripleo_step1, vendor=Red Hat, Inc.)
Feb 20 08:56:20 np0005625204.localdomain podman[107206]: 2026-02-20 08:56:20.377149234 +0000 UTC m=+0.295619861 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 20 08:56:20 np0005625204.localdomain podman[107204]: 2026-02-20 08:56:20.397345678 +0000 UTC m=+0.325943398 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain podman[107206]: 2026-02-20 08:56:20.439994967 +0000 UTC m=+0.358465594 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, release=1766032510, url=https://www.redhat.com, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain podman[107213]: 2026-02-20 08:56:20.515961106 +0000 UTC m=+0.433752772 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc.)
Feb 20 08:56:20 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:56:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58882 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D60290000000001030307) 
Feb 20 08:56:20 np0005625204.localdomain python3.9[107396]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 08:56:21 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12489 DF PROTO=TCP SPT=59792 DPT=9101 SEQ=3974714452 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D63680000000001030307) 
Feb 20 08:56:21 np0005625204.localdomain python3.9[107444]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 08:56:22 np0005625204.localdomain sshd[106444]: pam_unix(sshd:session): session closed for user zuul
Feb 20 08:56:22 np0005625204.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Feb 20 08:56:22 np0005625204.localdomain systemd[1]: session-37.scope: Consumed 4.904s CPU time.
Feb 20 08:56:22 np0005625204.localdomain systemd-logind[759]: Session 37 logged out. Waiting for processes to exit.
Feb 20 08:56:22 np0005625204.localdomain systemd-logind[759]: Removed session 37.
Feb 20 08:56:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58883 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D6FE90000000001030307) 
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:56:27 np0005625204.localdomain podman[107462]: 2026-02-20 08:56:27.172894594 +0000 UTC m=+0.090177639 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:56:27 np0005625204.localdomain podman[107461]: 2026-02-20 08:56:27.223069435 +0000 UTC m=+0.142691792 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: tmp-crun.XR3Mn6.mount: Deactivated successfully.
Feb 20 08:56:27 np0005625204.localdomain podman[107461]: 2026-02-20 08:56:27.264045662 +0000 UTC m=+0.183668049 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:56:27 np0005625204.localdomain podman[107460]: 2026-02-20 08:56:27.278938133 +0000 UTC m=+0.203105271 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:36:40Z, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:56:27 np0005625204.localdomain podman[107462]: 2026-02-20 08:56:27.295541876 +0000 UTC m=+0.212824931 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 08:56:27 np0005625204.localdomain podman[107462]: unhealthy
Feb 20 08:56:27 np0005625204.localdomain podman[107468]: 2026-02-20 08:56:27.254507857 +0000 UTC m=+0.166118136 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=nova_compute, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:56:27 np0005625204.localdomain podman[107468]: 2026-02-20 08:56:27.337981518 +0000 UTC m=+0.249591807 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:56:27 np0005625204.localdomain podman[107468]: unhealthy
Feb 20 08:56:27 np0005625204.localdomain podman[107460]: 2026-02-20 08:56:27.347998567 +0000 UTC m=+0.272165715 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, 
architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:56:27 np0005625204.localdomain podman[107460]: unhealthy
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:56:27 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:56:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55666 DF PROTO=TCP SPT=57422 DPT=9105 SEQ=2464786171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D7AE80000000001030307) 
Feb 20 08:56:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55667 DF PROTO=TCP SPT=57422 DPT=9105 SEQ=2464786171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D82E80000000001030307) 
Feb 20 08:56:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58884 DF PROTO=TCP SPT=33768 DPT=9882 SEQ=922608339 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D8F680000000001030307) 
Feb 20 08:56:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:56:35 np0005625204.localdomain systemd[1]: tmp-crun.DIRMOS.mount: Deactivated successfully.
Feb 20 08:56:35 np0005625204.localdomain podman[107540]: 2026-02-20 08:56:35.14777912 +0000 UTC m=+0.088581040 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, 
com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13)
Feb 20 08:56:35 np0005625204.localdomain podman[107540]: 2026-02-20 08:56:35.529554553 +0000 UTC m=+0.470356463 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 
nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510)
Feb 20 08:56:35 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:56:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18700 DF PROTO=TCP SPT=57424 DPT=9101 SEQ=775098770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597D9CD50000000001030307) 
Feb 20 08:56:39 np0005625204.localdomain sshd[107563]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18702 DF PROTO=TCP SPT=57424 DPT=9101 SEQ=775098770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DA8E80000000001030307) 
Feb 20 08:56:39 np0005625204.localdomain sshd[107563]: Accepted publickey for zuul from 192.168.122.30 port 34998 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 08:56:39 np0005625204.localdomain systemd-logind[759]: New session 38 of user zuul.
Feb 20 08:56:39 np0005625204.localdomain systemd[1]: Started Session 38 of User zuul.
Feb 20 08:56:39 np0005625204.localdomain sshd[107563]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 08:56:40 np0005625204.localdomain sudo[107656]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvwoibrtsyreddwlprflzujiojpqdkys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577799.635005-20-212393374723027/AnsiballZ_systemd_service.py
Feb 20 08:56:40 np0005625204.localdomain sudo[107656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:40 np0005625204.localdomain python3.9[107658]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 08:56:40 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:56:40 np0005625204.localdomain systemd-rc-local-generator[107680]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:56:40 np0005625204.localdomain systemd-sysv-generator[107685]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:56:40 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:56:40 np0005625204.localdomain systemd[1]: Starting dnf makecache...
Feb 20 08:56:40 np0005625204.localdomain sudo[107656]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:41 np0005625204.localdomain dnf[107694]: Updating Subscription Management repositories.
Feb 20 08:56:41 np0005625204.localdomain python3.9[107784]: ansible-ansible.builtin.service_facts Invoked
Feb 20 08:56:41 np0005625204.localdomain network[107801]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 08:56:41 np0005625204.localdomain network[107802]: 'network-scripts' will be removed from distribution in near future.
Feb 20 08:56:41 np0005625204.localdomain network[107803]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 08:56:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55669 DF PROTO=TCP SPT=57422 DPT=9105 SEQ=2464786171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DB3690000000001030307) 
Feb 20 08:56:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:56:43 np0005625204.localdomain dnf[107694]: Metadata cache refreshed recently.
Feb 20 08:56:43 np0005625204.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 20 08:56:43 np0005625204.localdomain systemd[1]: Finished dnf makecache.
Feb 20 08:56:43 np0005625204.localdomain systemd[1]: dnf-makecache.service: Consumed 2.180s CPU time.
Feb 20 08:56:43 np0005625204.localdomain sudo[107875]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:56:43 np0005625204.localdomain sudo[107875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:56:43 np0005625204.localdomain sudo[107875]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:43 np0005625204.localdomain sudo[107890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:56:43 np0005625204.localdomain sudo[107890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:56:44 np0005625204.localdomain sudo[107890]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:45 np0005625204.localdomain sudo[107974]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:56:45 np0005625204.localdomain sudo[107974]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:56:45 np0005625204.localdomain sudo[107974]: pam_unix(sudo:session): session closed for user root
Feb 20 08:56:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64317 DF PROTO=TCP SPT=38472 DPT=9100 SEQ=192635383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DC3690000000001030307) 
Feb 20 08:56:46 np0005625204.localdomain sshd[108002]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46349 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DC95A0000000001030307) 
Feb 20 08:56:48 np0005625204.localdomain python3.9[108079]: ansible-ansible.builtin.service_facts Invoked
Feb 20 08:56:48 np0005625204.localdomain network[108096]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 08:56:48 np0005625204.localdomain network[108097]: 'network-scripts' will be removed from distribution in near future.
Feb 20 08:56:48 np0005625204.localdomain network[108098]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 08:56:48 np0005625204.localdomain sshd[108002]: Received disconnect from 27.112.79.3 port 59420:11: Bye Bye [preauth]
Feb 20 08:56:48 np0005625204.localdomain sshd[108002]: Disconnected from authenticating user root 27.112.79.3 port 59420 [preauth]
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:56:50 np0005625204.localdomain podman[108142]: 2026-02-20 08:56:50.372105128 +0000 UTC m=+0.084984258 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, architecture=x86_64, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 20 08:56:50 np0005625204.localdomain podman[108142]: 2026-02-20 08:56:50.385030888 +0000 UTC m=+0.097910048 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public)
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:56:50 np0005625204.localdomain podman[108161]: 2026-02-20 08:56:50.456690623 +0000 UTC m=+0.093059558 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:56:50 np0005625204.localdomain podman[108161]: 2026-02-20 08:56:50.50380646 +0000 UTC m=+0.140175385 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, 
Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, version=17.1.13)
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:56:50 np0005625204.localdomain podman[108183]: 2026-02-20 08:56:50.519016491 +0000 UTC m=+0.079118088 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, tcib_managed=true, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: tmp-crun.lTkmTm.mount: Deactivated successfully.
Feb 20 08:56:50 np0005625204.localdomain podman[108195]: 2026-02-20 08:56:50.589969404 +0000 UTC m=+0.127080409 container health_status cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z)
Feb 20 08:56:50 np0005625204.localdomain podman[108183]: 2026-02-20 08:56:50.603140232 +0000 UTC m=+0.163241829 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, 
vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T22:10:15Z, tcib_managed=true, architecture=x86_64)
Feb 20 08:56:50 np0005625204.localdomain podman[108195]: 2026-02-20 08:56:50.618795015 +0000 UTC m=+0.155905960 container exec_died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Deactivated successfully.
Feb 20 08:56:50 np0005625204.localdomain podman[108233]: 2026-02-20 08:56:50.740806718 +0000 UTC m=+0.176882520 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 20 08:56:50 np0005625204.localdomain podman[108233]: 2026-02-20 08:56:50.921671189 +0000 UTC m=+0.357746951 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 20 08:56:50 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:56:51 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46351 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DD5680000000001030307) 
Feb 20 08:56:52 np0005625204.localdomain sudo[108417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqgdnwzlsdkslzxbtzwqooepuliucmcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577811.801328-110-87147589162679/AnsiballZ_systemd_service.py
Feb 20 08:56:52 np0005625204.localdomain sudo[108417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:56:52 np0005625204.localdomain python3.9[108419]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:56:52 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:56:52 np0005625204.localdomain systemd-sysv-generator[108446]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:56:52 np0005625204.localdomain systemd-rc-local-generator[108443]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:56:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:56:52 np0005625204.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Feb 20 08:56:53 np0005625204.localdomain sshd[108472]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:56:53 np0005625204.localdomain sshd[108472]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:56:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46352 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DE5290000000001030307) 
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:56:57 np0005625204.localdomain podman[108474]: 2026-02-20 08:56:57.389744897 +0000 UTC m=+0.082596804 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vcs-type=git)
Feb 20 08:56:57 np0005625204.localdomain podman[108474]: 2026-02-20 08:56:57.402987307 +0000 UTC m=+0.095839264 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
url=https://www.redhat.com, config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid)
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: tmp-crun.2jZBwG.mount: Deactivated successfully.
Feb 20 08:56:57 np0005625204.localdomain podman[108495]: 2026-02-20 08:56:57.505474616 +0000 UTC m=+0.082749200 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:56:57 np0005625204.localdomain podman[108493]: 2026-02-20 08:56:57.553333906 +0000 UTC m=+0.138535105 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ovn-controller-container)
Feb 20 08:56:57 np0005625204.localdomain podman[108495]: 2026-02-20 08:56:57.556594466 +0000 UTC m=+0.133869060 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:56:57 np0005625204.localdomain podman[108495]: unhealthy
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:56:57 np0005625204.localdomain podman[108493]: 2026-02-20 08:56:57.570131385 +0000 UTC m=+0.155332584 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, 
Inc., architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:56:57 np0005625204.localdomain podman[108493]: unhealthy
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:56:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2092 DF PROTO=TCP SPT=54016 DPT=9105 SEQ=1705066968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DF0280000000001030307) 
Feb 20 08:56:57 np0005625204.localdomain podman[108494]: 2026-02-20 08:56:57.652729188 +0000 UTC m=+0.233477249 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:56:57 np0005625204.localdomain podman[108494]: 2026-02-20 08:56:57.670917731 +0000 UTC m=+0.251665812 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z)
Feb 20 08:56:57 np0005625204.localdomain podman[108494]: unhealthy
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:56:57 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:56:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2093 DF PROTO=TCP SPT=54016 DPT=9105 SEQ=1705066968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597DF8280000000001030307) 
Feb 20 08:57:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46353 DF PROTO=TCP SPT=35458 DPT=9882 SEQ=1296352457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E05680000000001030307) 
Feb 20 08:57:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:57:06 np0005625204.localdomain podman[108554]: 2026-02-20 08:57:06.128481328 +0000 UTC m=+0.064940299 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible)
Feb 20 08:57:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62194 DF PROTO=TCP SPT=60452 DPT=9101 SEQ=2944394580 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E12050000000001030307) 
Feb 20 08:57:06 np0005625204.localdomain podman[108554]: 2026-02-20 08:57:06.536352718 +0000 UTC m=+0.472811769 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, 
container_name=nova_migration_target, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 20 08:57:06 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:57:09 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:57:09 np0005625204.localdomain recover_tripleo_nova_virtqemud[108579]: 63005
Feb 20 08:57:09 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:57:09 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:57:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62196 DF PROTO=TCP SPT=60452 DPT=9101 SEQ=2944394580 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E1E280000000001030307) 
Feb 20 08:57:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2095 DF PROTO=TCP SPT=54016 DPT=9105 SEQ=1705066968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E27680000000001030307) 
Feb 20 08:57:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62481 DF PROTO=TCP SPT=44394 DPT=9102 SEQ=242983024 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E37680000000001030307) 
Feb 20 08:57:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47325 DF PROTO=TCP SPT=57518 DPT=9882 SEQ=346104544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E3E8A0000000001030307) 
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:57:20 np0005625204.localdomain podman[108581]: 2026-02-20 08:57:20.628286738 +0000 UTC m=+0.065095815 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, release=1766032510, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:57:20 np0005625204.localdomain podman[108581]: 2026-02-20 08:57:20.638521074 +0000 UTC m=+0.075330151 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:57:20 np0005625204.localdomain podman[108610]: 2026-02-20 08:57:20.750731633 +0000 UTC m=+0.087426884 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:57:20 np0005625204.localdomain podman[108580]: 2026-02-20 08:57:20.71214027 +0000 UTC m=+0.146408028 container health_status 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64)
Feb 20 08:57:20 np0005625204.localdomain podman[108610]: 2026-02-20 08:57:20.78717582 +0000 UTC m=+0.123871031 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, version=17.1.13, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com)
Feb 20 08:57:20 np0005625204.localdomain podman[108580]: 2026-02-20 08:57:20.796084465 +0000 UTC m=+0.230352253 container exec_died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:57:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47327 DF PROTO=TCP SPT=57518 DPT=9882 SEQ=346104544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E4AA80000000001030307) 
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Deactivated successfully.
Feb 20 08:57:20 np0005625204.localdomain podman[108611]: Error: container cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a is not running
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Main process exited, code=exited, status=125/n/a
Feb 20 08:57:20 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed with result 'exit-code'.
Feb 20 08:57:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:57:21 np0005625204.localdomain podman[108656]: 2026-02-20 08:57:21.158826381 +0000 UTC m=+0.090163508 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510)
Feb 20 08:57:21 np0005625204.localdomain podman[108656]: 2026-02-20 08:57:21.358987869 +0000 UTC m=+0.290324986 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:57:21 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:57:21 np0005625204.localdomain systemd[1]: tmp-crun.ICmI5p.mount: Deactivated successfully.
Feb 20 08:57:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47328 DF PROTO=TCP SPT=57518 DPT=9882 SEQ=346104544 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E5A690000000001030307) 
Feb 20 08:57:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22162 DF PROTO=TCP SPT=57974 DPT=9105 SEQ=2104025892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E65280000000001030307) 
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: tmp-crun.qGHC2T.mount: Deactivated successfully.
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: tmp-crun.N6PzfT.mount: Deactivated successfully.
Feb 20 08:57:27 np0005625204.localdomain podman[108693]: 2026-02-20 08:57:27.913605305 +0000 UTC m=+0.085730431 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 20 08:57:27 np0005625204.localdomain podman[108684]: 2026-02-20 08:57:27.894244036 +0000 UTC m=+0.080822410 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, url=https://www.redhat.com)
Feb 20 08:57:27 np0005625204.localdomain podman[108686]: 2026-02-20 08:57:27.951053673 +0000 UTC m=+0.128338709 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:57:27 np0005625204.localdomain podman[108693]: 2026-02-20 08:57:27.980158233 +0000 UTC m=+0.152283389 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 20 08:57:27 np0005625204.localdomain podman[108693]: unhealthy
Feb 20 08:57:27 np0005625204.localdomain podman[108686]: 2026-02-20 08:57:27.991674779 +0000 UTC m=+0.168959795 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:57:27 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:57:28 np0005625204.localdomain podman[108686]: unhealthy
Feb 20 08:57:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:57:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:57:28 np0005625204.localdomain podman[108684]: 2026-02-20 08:57:28.024503914 +0000 UTC m=+0.211082228 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public)
Feb 20 08:57:28 np0005625204.localdomain podman[108684]: unhealthy
Feb 20 08:57:28 np0005625204.localdomain podman[108685]: 2026-02-20 08:57:27.996654213 +0000 UTC m=+0.176749616 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, 
Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, container_name=iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:57:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:57:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:57:28 np0005625204.localdomain podman[108685]: 2026-02-20 08:57:28.07550346 +0000 UTC m=+0.255598863 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z)
Feb 20 08:57:28 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:57:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22163 DF PROTO=TCP SPT=57974 DPT=9105 SEQ=2104025892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E6D280000000001030307) 
Feb 20 08:57:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47391 DF PROTO=TCP SPT=60244 DPT=9100 SEQ=760201816 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E79680000000001030307) 
Feb 20 08:57:34 np0005625204.localdomain podman[108459]: time="2026-02-20T08:57:34Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: libpod-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope: Deactivated successfully.
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: libpod-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope: Consumed 6.248s CPU time.
Feb 20 08:57:34 np0005625204.localdomain podman[108459]: 2026-02-20 08:57:34.859176644 +0000 UTC m=+42.097371613 container died cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container)
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: Deactivated successfully.
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: No such file or directory
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: tmp-crun.9R1hAS.mount: Deactivated successfully.
Feb 20 08:57:34 np0005625204.localdomain podman[108459]: 2026-02-20 08:57:34.924087541 +0000 UTC m=+42.162282480 container cleanup cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true)
Feb 20 08:57:34 np0005625204.localdomain podman[108459]: ceilometer_agent_compute
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: No such file or directory
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: No such file or directory
Feb 20 08:57:34 np0005625204.localdomain podman[108762]: 2026-02-20 08:57:34.952095157 +0000 UTC m=+0.080891982 container cleanup cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 08:57:34 np0005625204.localdomain systemd[1]: libpod-conmon-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.scope: Deactivated successfully.
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.timer: No such file or directory
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: Failed to open /run/systemd/transient/cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a.service: No such file or directory
Feb 20 08:57:35 np0005625204.localdomain podman[108778]: 2026-02-20 08:57:35.063657746 +0000 UTC m=+0.069386916 container cleanup cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, 
io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true)
Feb 20 08:57:35 np0005625204.localdomain podman[108778]: ceilometer_agent_compute
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.091s CPU time, no IO.
Feb 20 08:57:35 np0005625204.localdomain sudo[108417]: pam_unix(sudo:session): session closed for user root
Feb 20 08:57:35 np0005625204.localdomain sudo[108879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sukhpykwxlweklwaawsfnatetxdzvxaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577855.2313995-110-223939227937378/AnsiballZ_systemd_service.py
Feb 20 08:57:35 np0005625204.localdomain sudo[108879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:57:35 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:d7:b4:4a MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45034 SEQ=671084412 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 20 08:57:35 np0005625204.localdomain python3.9[108881]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e1c07b1bd08758bd14fb80cc901f6da6a3ccc5e5eba94f04ead08e95db5f3037-merged.mount: Deactivated successfully.
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a-userdata-shm.mount: Deactivated successfully.
Feb 20 08:57:35 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:57:35 np0005625204.localdomain systemd-rc-local-generator[108906]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:57:35 np0005625204.localdomain systemd-sysv-generator[108912]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:57:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:57:36 np0005625204.localdomain sshd[108920]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:57:36 np0005625204.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Feb 20 08:57:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:57:37 np0005625204.localdomain podman[108938]: 2026-02-20 08:57:37.144949504 +0000 UTC m=+0.083855573 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510)
Feb 20 08:57:37 np0005625204.localdomain sshd[108920]: Invalid user solana from 45.148.10.240 port 45246
Feb 20 08:57:37 np0005625204.localdomain podman[108938]: 2026-02-20 08:57:37.517398509 +0000 UTC m=+0.456304518 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public)
Feb 20 08:57:37 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:57:37 np0005625204.localdomain sshd[108920]: Connection closed by invalid user solana 45.148.10.240 port 45246 [preauth]
Feb 20 08:57:38 np0005625204.localdomain sshd[108961]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:57:38 np0005625204.localdomain sshd[108961]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:57:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59499 DF PROTO=TCP SPT=51882 DPT=9101 SEQ=2614739063 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E93290000000001030307) 
Feb 20 08:57:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22165 DF PROTO=TCP SPT=57974 DPT=9105 SEQ=2104025892 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597E9D680000000001030307) 
Feb 20 08:57:45 np0005625204.localdomain sudo[108963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:57:45 np0005625204.localdomain sudo[108963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:57:45 np0005625204.localdomain sudo[108963]: pam_unix(sudo:session): session closed for user root
Feb 20 08:57:45 np0005625204.localdomain sudo[108978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:57:45 np0005625204.localdomain sudo[108978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:57:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20144 DF PROTO=TCP SPT=55476 DPT=9100 SEQ=250879534 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EAD680000000001030307) 
Feb 20 08:57:46 np0005625204.localdomain sudo[108978]: pam_unix(sudo:session): session closed for user root
Feb 20 08:57:47 np0005625204.localdomain sudo[109024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:57:47 np0005625204.localdomain sudo[109024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:57:47 np0005625204.localdomain sudo[109024]: pam_unix(sudo:session): session closed for user root
Feb 20 08:57:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35038 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EB3BA0000000001030307) 
Feb 20 08:57:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35040 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EBFA80000000001030307) 
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:57:51 np0005625204.localdomain podman[109041]: 2026-02-20 08:57:51.152549408 +0000 UTC m=+0.080410897 container health_status 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 20 08:57:51 np0005625204.localdomain podman[109041]: 2026-02-20 08:57:51.165075445 +0000 UTC m=+0.092936924 container exec_died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, container_name=collectd, release=1766032510, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team)
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: tmp-crun.ANVzJn.mount: Deactivated successfully.
Feb 20 08:57:51 np0005625204.localdomain podman[109040]: 2026-02-20 08:57:51.207073113 +0000 UTC m=+0.139861695 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, container_name=logrotate_crond, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 20 08:57:51 np0005625204.localdomain podman[109039]: Error: container 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 is not running
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Main process exited, code=exited, status=125/n/a
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed with result 'exit-code'.
Feb 20 08:57:51 np0005625204.localdomain podman[109040]: 2026-02-20 08:57:51.241463236 +0000 UTC m=+0.174251848 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, release=1766032510, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron)
Feb 20 08:57:51 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:57:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:57:52 np0005625204.localdomain podman[109086]: 2026-02-20 08:57:52.12976986 +0000 UTC m=+0.075288088 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git)
Feb 20 08:57:52 np0005625204.localdomain podman[109086]: 2026-02-20 08:57:52.337828483 +0000 UTC m=+0.283346701 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public)
Feb 20 08:57:52 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:57:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35041 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597ECF680000000001030307) 
Feb 20 08:57:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11800 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=1323325226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EDA680000000001030307) 
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: tmp-crun.v1oroh.mount: Deactivated successfully.
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:57:58 np0005625204.localdomain podman[109116]: 2026-02-20 08:57:58.157240983 +0000 UTC m=+0.093787240 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, 
name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:57:58 np0005625204.localdomain podman[109121]: 2026-02-20 08:57:58.171011809 +0000 UTC m=+0.093772380 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4)
Feb 20 08:57:58 np0005625204.localdomain podman[109117]: 2026-02-20 08:57:58.207208328 +0000 UTC m=+0.138731040 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64)
Feb 20 08:57:58 np0005625204.localdomain podman[109116]: 2026-02-20 08:57:58.229993143 +0000 UTC m=+0.166539380 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 08:57:58 np0005625204.localdomain podman[109116]: unhealthy
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:57:58 np0005625204.localdomain podman[109121]: 2026-02-20 08:57:58.264761707 +0000 UTC m=+0.187522268 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 20 08:57:58 np0005625204.localdomain podman[109121]: unhealthy
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:57:58 np0005625204.localdomain podman[109159]: 2026-02-20 08:57:58.235710459 +0000 UTC m=+0.073000198 container health_status 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, container_name=iscsid, version=17.1.13, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 20 08:57:58 np0005625204.localdomain podman[109159]: 2026-02-20 08:57:58.317107936 +0000 UTC m=+0.154397665 container exec_died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, url=https://www.redhat.com, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Deactivated successfully.
Feb 20 08:57:58 np0005625204.localdomain podman[109117]: 2026-02-20 08:57:58.33826544 +0000 UTC m=+0.269788132 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, managed_by=tripleo_ansible)
Feb 20 08:57:58 np0005625204.localdomain podman[109117]: unhealthy
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:57:58 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:57:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11801 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=1323325226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EE2690000000001030307) 
Feb 20 08:58:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35042 DF PROTO=TCP SPT=44828 DPT=9882 SEQ=3412511464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EEF690000000001030307) 
Feb 20 08:58:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17671 DF PROTO=TCP SPT=60500 DPT=9101 SEQ=3302469427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597EFC650000000001030307) 
Feb 20 08:58:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:58:07 np0005625204.localdomain systemd[1]: tmp-crun.c47BBy.mount: Deactivated successfully.
Feb 20 08:58:07 np0005625204.localdomain podman[109200]: 2026-02-20 08:58:07.902818844 +0000 UTC m=+0.093916836 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc.)
Feb 20 08:58:08 np0005625204.localdomain podman[109200]: 2026-02-20 08:58:08.24329353 +0000 UTC m=+0.434391592 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Feb 20 08:58:08 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:58:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17673 DF PROTO=TCP SPT=60500 DPT=9101 SEQ=3302469427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F08680000000001030307) 
Feb 20 08:58:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11803 DF PROTO=TCP SPT=48190 DPT=9105 SEQ=1323325226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F13680000000001030307) 
Feb 20 08:58:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51596 DF PROTO=TCP SPT=58240 DPT=9102 SEQ=1064857520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F23680000000001030307) 
Feb 20 08:58:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56057 DF PROTO=TCP SPT=48320 DPT=9882 SEQ=4293003028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F28EA0000000001030307) 
Feb 20 08:58:18 np0005625204.localdomain podman[108923]: time="2026-02-20T08:58:18Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: libpod-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope: Deactivated successfully.
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: libpod-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope: Consumed 6.508s CPU time.
Feb 20 08:58:18 np0005625204.localdomain podman[108923]: 2026-02-20 08:58:18.312714485 +0000 UTC m=+42.098986532 container died 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, version=17.1.13, 
release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git)
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: Deactivated successfully.
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: No such file or directory
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5-userdata-shm.mount: Deactivated successfully.
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-271fbe47d50a90f03735a26a1ff5b20e2027c13cb6e9d5c8a6a9112793cd7c92-merged.mount: Deactivated successfully.
Feb 20 08:58:18 np0005625204.localdomain podman[108923]: 2026-02-20 08:58:18.394020829 +0000 UTC m=+42.180292826 container cleanup 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 20 08:58:18 np0005625204.localdomain podman[108923]: ceilometer_agent_ipmi
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: No such file or directory
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: No such file or directory
Feb 20 08:58:18 np0005625204.localdomain podman[109223]: 2026-02-20 08:58:18.439598758 +0000 UTC m=+0.120218318 container cleanup 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: libpod-conmon-1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.scope: Deactivated successfully.
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.timer: No such file or directory
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: Failed to open /run/systemd/transient/1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5.service: No such file or directory
Feb 20 08:58:18 np0005625204.localdomain podman[109239]: 2026-02-20 08:58:18.527439035 +0000 UTC m=+0.060248425 container cleanup 1218627acd13e5a707a21a770cf681a5a70d81088810bfb809619e430afd70a5 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1766032510, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 20 08:58:18 np0005625204.localdomain podman[109239]: ceilometer_agent_ipmi
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Feb 20 08:58:18 np0005625204.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Feb 20 08:58:18 np0005625204.localdomain sudo[108879]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:18 np0005625204.localdomain sudo[109340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtnuwdjomxwgkxaeluqqjnosbugzgjac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577898.6959815-110-88815466222306/AnsiballZ_systemd_service.py
Feb 20 08:58:18 np0005625204.localdomain sudo[109340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:19 np0005625204.localdomain python3.9[109342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:19 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:58:19 np0005625204.localdomain systemd-sysv-generator[109370]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:58:19 np0005625204.localdomain systemd-rc-local-generator[109366]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:58:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:58:19 np0005625204.localdomain systemd[1]: Stopping collectd container...
Feb 20 08:58:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56059 DF PROTO=TCP SPT=48320 DPT=9882 SEQ=4293003028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F34E80000000001030307) 
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: libpod-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: libpod-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope: Consumed 2.254s CPU time.
Feb 20 08:58:21 np0005625204.localdomain podman[109382]: 2026-02-20 08:58:21.159713868 +0000 UTC m=+1.475591993 container stop 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 20 08:58:21 np0005625204.localdomain podman[109382]: 2026-02-20 08:58:21.194924817 +0000 UTC m=+1.510802992 container died 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: tmp-crun.3p0MX5.mount: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: Stopping /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e...
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: tmp-crun.8xGWsa.mount: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:58:21 np0005625204.localdomain podman[109382]: 2026-02-20 08:58:21.281264917 +0000 UTC m=+1.597143042 container cleanup 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd)
Feb 20 08:58:21 np0005625204.localdomain podman[109382]: collectd
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: No such file or directory
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: No such file or directory
Feb 20 08:58:21 np0005625204.localdomain podman[109394]: 2026-02-20 08:58:21.310418778 +0000 UTC m=+0.130406413 container cleanup 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, container_name=collectd, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:10:15Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: libpod-conmon-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.scope: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain podman[109410]: 2026-02-20 08:58:21.362796228 +0000 UTC m=+0.078706705 container health_status 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron)
Feb 20 08:58:21 np0005625204.localdomain podman[109446]: error opening file `/run/crun/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e/status`: No such file or directory
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.timer: No such file or directory
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: Failed to open /run/systemd/transient/55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e.service: No such file or directory
Feb 20 08:58:21 np0005625204.localdomain podman[109429]: 2026-02-20 08:58:21.422234985 +0000 UTC m=+0.078516728 container cleanup 55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, container_name=collectd, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 20 08:58:21 np0005625204.localdomain podman[109429]: collectd
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: Stopped collectd container.
Feb 20 08:58:21 np0005625204.localdomain podman[109410]: 2026-02-20 08:58:21.452245093 +0000 UTC m=+0.168155600 container exec_died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, 
url=https://www.redhat.com, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-cron-container, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 20 08:58:21 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Deactivated successfully.
Feb 20 08:58:21 np0005625204.localdomain sudo[109340]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:21 np0005625204.localdomain sudo[109537]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxymqavxckhwmjhkgurtmuwpjgfbseeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577901.603295-110-37160570907099/AnsiballZ_systemd_service.py
Feb 20 08:58:21 np0005625204.localdomain sudo[109537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:22 np0005625204.localdomain python3.9[109539]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7-merged.mount: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-55d94aebbeebc2a22c750bcca771a584549b2b87deb2c9c4eef8de5cd8ce1f1e-userdata-shm.mount: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:58:22 np0005625204.localdomain systemd-sysv-generator[109570]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:58:22 np0005625204.localdomain systemd-rc-local-generator[109565]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Stopping iscsid container...
Feb 20 08:58:22 np0005625204.localdomain recover_tripleo_nova_virtqemud[109586]: 63005
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: libpod-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: libpod-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope: Consumed 1.091s CPU time.
Feb 20 08:58:22 np0005625204.localdomain podman[109582]: 2026-02-20 08:58:22.689995861 +0000 UTC m=+0.084536185 container died 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: No such file or directory
Feb 20 08:58:22 np0005625204.localdomain podman[109582]: 2026-02-20 08:58:22.784252835 +0000 UTC m=+0.178793109 container cleanup 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, version=17.1.13, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 20 08:58:22 np0005625204.localdomain podman[109582]: iscsid
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: No such file or directory
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: No such file or directory
Feb 20 08:58:22 np0005625204.localdomain podman[109605]: 2026-02-20 08:58:22.796347199 +0000 UTC m=+0.094469731 container cleanup 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', 
'/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:58:22 np0005625204.localdomain podman[109579]: 2026-02-20 08:58:22.757845409 +0000 UTC m=+0.156564271 container health_status f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-type=git, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: libpod-conmon-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.scope: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.timer: No such file or directory
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: Failed to open /run/systemd/transient/5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008.service: No such file or directory
Feb 20 08:58:22 np0005625204.localdomain podman[109641]: 2026-02-20 08:58:22.912055027 +0000 UTC m=+0.085634779 container cleanup 5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 20 08:58:22 np0005625204.localdomain podman[109641]: iscsid
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: Stopped iscsid container.
Feb 20 08:58:22 np0005625204.localdomain sudo[109537]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:22 np0005625204.localdomain podman[109579]: 2026-02-20 08:58:22.983010971 +0000 UTC m=+0.381729883 container exec_died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, release=1766032510, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 20 08:58:22 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Deactivated successfully.
Feb 20 08:58:23 np0005625204.localdomain systemd[1]: tmp-crun.tPntMm.mount: Deactivated successfully.
Feb 20 08:58:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d-merged.mount: Deactivated successfully.
Feb 20 08:58:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bed33d9d6b29091147825ed2c85ee5118e22b6060489dd8f3f07b3ab986b008-userdata-shm.mount: Deactivated successfully.
Feb 20 08:58:23 np0005625204.localdomain sudo[109743]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjbqvuujtpghzbaykxzkdmdljbmllrbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577903.0729294-110-83143415256439/AnsiballZ_systemd_service.py
Feb 20 08:58:23 np0005625204.localdomain sudo[109743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:23 np0005625204.localdomain python3.9[109745]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:58:23 np0005625204.localdomain systemd-rc-local-generator[109769]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:58:23 np0005625204.localdomain systemd-sysv-generator[109775]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:58:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:58:24 np0005625204.localdomain sshd[109784]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: Stopping logrotate_crond container...
Feb 20 08:58:24 np0005625204.localdomain crond[72022]: (CRON) INFO (Shutting down)
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: libpod-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.scope: Deactivated successfully.
Feb 20 08:58:24 np0005625204.localdomain podman[109788]: 2026-02-20 08:58:24.218154398 +0000 UTC m=+0.053595217 container died 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5)
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: tmp-crun.ETMQTz.mount: Deactivated successfully.
Feb 20 08:58:24 np0005625204.localdomain sshd[109784]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: Deactivated successfully.
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: No such file or directory
Feb 20 08:58:24 np0005625204.localdomain podman[109788]: 2026-02-20 08:58:24.273212651 +0000 UTC m=+0.108653500 container cleanup 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 20 08:58:24 np0005625204.localdomain podman[109788]: logrotate_crond
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: No such file or directory
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: No such file or directory
Feb 20 08:58:24 np0005625204.localdomain podman[109802]: 2026-02-20 08:58:24.310995559 +0000 UTC m=+0.083805072 container cleanup 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, release=1766032510, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: libpod-conmon-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.scope: Deactivated successfully.
Feb 20 08:58:24 np0005625204.localdomain podman[109831]: error opening file `/run/crun/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e/status`: No such file or directory
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.timer: No such file or directory
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: Failed to open /run/systemd/transient/1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e.service: No such file or directory
Feb 20 08:58:24 np0005625204.localdomain podman[109818]: 2026-02-20 08:58:24.403139128 +0000 UTC m=+0.068569742 container cleanup 1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 08:58:24 np0005625204.localdomain podman[109818]: logrotate_crond
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Feb 20 08:58:24 np0005625204.localdomain systemd[1]: Stopped logrotate_crond container.
Feb 20 08:58:24 np0005625204.localdomain sudo[109743]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56060 DF PROTO=TCP SPT=48320 DPT=9882 SEQ=4293003028 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F44A80000000001030307) 
Feb 20 08:58:24 np0005625204.localdomain sudo[109922]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iltfifpzxrfkneybkgwdpkolalztsvul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577904.5905364-110-224051249742785/AnsiballZ_systemd_service.py
Feb 20 08:58:24 np0005625204.localdomain sudo[109922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:25 np0005625204.localdomain python3.9[109924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0b03ed83be81af8ca31d355d34bc84741adbeedeb0b33580fe27349115e799d7-merged.mount: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1cb8b581e523ab9232042a730765662606b3c23934ea8d2aa43c4503a7c1f43e-userdata-shm.mount: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:58:25 np0005625204.localdomain systemd-rc-local-generator[109947]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:58:25 np0005625204.localdomain systemd-sysv-generator[109950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: Stopping metrics_qdr container...
Feb 20 08:58:25 np0005625204.localdomain kernel: qdrouterd[55300]: segfault at 0 ip 00007fdcedc9e7cb sp 00007fff412a6c00 error 4 in libc.so.6[7fdcedc3b000+175000]
Feb 20 08:58:25 np0005625204.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: Started Process Core Dump (PID 109977/UID 0).
Feb 20 08:58:25 np0005625204.localdomain systemd-coredump[109978]: Resource limits disable core dumping for process 55300 (qdrouterd).
Feb 20 08:58:25 np0005625204.localdomain systemd-coredump[109978]: Process 55300 (qdrouterd) of user 42465 dumped core.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: systemd-coredump@0-109977-0.service: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain podman[109965]: 2026-02-20 08:58:25.815581127 +0000 UTC m=+0.226337709 container died f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, config_id=tripleo_step1, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=)
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: libpod-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: libpod-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope: Consumed 28.009s CPU time.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: No such file or directory
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: tmp-crun.TO2Bv5.mount: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain podman[109965]: 2026-02-20 08:58:25.861271589 +0000 UTC m=+0.272028121 container cleanup f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 20 08:58:25 np0005625204.localdomain podman[109965]: metrics_qdr
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: No such file or directory
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: No such file or directory
Feb 20 08:58:25 np0005625204.localdomain podman[109982]: 2026-02-20 08:58:25.90267972 +0000 UTC m=+0.075307650 container cleanup f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: libpod-conmon-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.scope: Deactivated successfully.
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.timer: No such file or directory
Feb 20 08:58:25 np0005625204.localdomain systemd[1]: f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: Failed to open /run/systemd/transient/f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b.service: No such file or directory
Feb 20 08:58:26 np0005625204.localdomain podman[109998]: 2026-02-20 08:58:26.000217685 +0000 UTC m=+0.068476057 container cleanup f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, vcs-type=git, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'c3cf83e3d6b9a6a9323d670f77d9e810'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 20 08:58:26 np0005625204.localdomain podman[109998]: metrics_qdr
Feb 20 08:58:26 np0005625204.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Feb 20 08:58:26 np0005625204.localdomain systemd[1]: Stopped metrics_qdr container.
Feb 20 08:58:26 np0005625204.localdomain sudo[109922]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a-merged.mount: Deactivated successfully.
Feb 20 08:58:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f82025a1089ec0cfc4f170208cbbaacd45876755eb76fba50f80dba86182987b-userdata-shm.mount: Deactivated successfully.
Feb 20 08:58:26 np0005625204.localdomain sudo[110099]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syhwncwcqxhclcecvhvsczaleoqfmgtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577906.1810663-110-123762421669068/AnsiballZ_systemd_service.py
Feb 20 08:58:26 np0005625204.localdomain sudo[110099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:26 np0005625204.localdomain python3.9[110101]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:26 np0005625204.localdomain sudo[110099]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:27 np0005625204.localdomain sudo[110192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slfanctzkdgqtcmnudimrkvqpjeeghtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577906.8884203-110-102889884253928/AnsiballZ_systemd_service.py
Feb 20 08:58:27 np0005625204.localdomain sudo[110192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:27 np0005625204.localdomain python3.9[110194]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:27 np0005625204.localdomain sudo[110192]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2528 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=4075530326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F4FA80000000001030307) 
Feb 20 08:58:27 np0005625204.localdomain sudo[110285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxentmmkfwqlltdqgfgsglesdfurkyfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577907.6135662-110-124723407590309/AnsiballZ_systemd_service.py
Feb 20 08:58:27 np0005625204.localdomain sudo[110285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:28 np0005625204.localdomain python3.9[110287]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:28 np0005625204.localdomain sudo[110285]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:28 np0005625204.localdomain sudo[110378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxvfrtbxwqnnbgkgmxqxvnzaqtnvyoty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577908.2715409-110-190183494240552/AnsiballZ_systemd_service.py
Feb 20 08:58:28 np0005625204.localdomain sudo[110378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:58:28 np0005625204.localdomain podman[110382]: 2026-02-20 08:58:28.649015971 +0000 UTC m=+0.097735513 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., 
build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, version=17.1.13)
Feb 20 08:58:28 np0005625204.localdomain podman[110382]: 2026-02-20 08:58:28.663159498 +0000 UTC m=+0.111879010 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, release=1766032510, 
cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git)
Feb 20 08:58:28 np0005625204.localdomain podman[110383]: 2026-02-20 08:58:28.690833263 +0000 UTC m=+0.131810637 container health_status a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, distribution-scope=public, container_name=nova_compute, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 20 08:58:28 np0005625204.localdomain podman[110381]: 2026-02-20 08:58:28.735616617 +0000 UTC m=+0.182697879 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 20 08:58:28 np0005625204.localdomain podman[110383]: 2026-02-20 08:58:28.739147407 +0000 UTC m=+0.180124791 container exec_died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5)
Feb 20 08:58:28 np0005625204.localdomain podman[110383]: unhealthy
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:58:28 np0005625204.localdomain podman[110382]: unhealthy
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:58:28 np0005625204.localdomain podman[110381]: 2026-02-20 08:58:28.778466803 +0000 UTC m=+0.225548115 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:58:28 np0005625204.localdomain podman[110381]: unhealthy
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:58:28 np0005625204.localdomain python3.9[110380]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:58:28 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:58:29 np0005625204.localdomain systemd-rc-local-generator[110469]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:58:29 np0005625204.localdomain systemd-sysv-generator[110475]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:58:29 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:58:29 np0005625204.localdomain systemd[1]: Stopping nova_compute container...
Feb 20 08:58:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2529 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=4075530326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F57A80000000001030307) 
Feb 20 08:58:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51597 DF PROTO=TCP SPT=58240 DPT=9102 SEQ=1064857520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F63680000000001030307) 
Feb 20 08:58:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22181 DF PROTO=TCP SPT=42204 DPT=9101 SEQ=237021640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F71950000000001030307) 
Feb 20 08:58:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:58:38 np0005625204.localdomain podman[110495]: 2026-02-20 08:58:38.649000417 +0000 UTC m=+0.084408121 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 08:58:39 np0005625204.localdomain podman[110495]: 2026-02-20 08:58:39.02138018 +0000 UTC m=+0.456787884 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, release=1766032510, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 20 08:58:39 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:58:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22183 DF PROTO=TCP SPT=42204 DPT=9101 SEQ=237021640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F7DA80000000001030307) 
Feb 20 08:58:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2531 DF PROTO=TCP SPT=54810 DPT=9105 SEQ=4075530326 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F87680000000001030307) 
Feb 20 08:58:44 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:d7:b4:4a MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=45046 SEQ=955136020 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 20 08:58:47 np0005625204.localdomain sudo[110517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:58:47 np0005625204.localdomain sudo[110517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:58:47 np0005625204.localdomain sudo[110517]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:47 np0005625204.localdomain sudo[110532]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:58:47 np0005625204.localdomain sudo[110532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:58:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10319 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597F9E1A0000000001030307) 
Feb 20 08:58:48 np0005625204.localdomain sudo[110532]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:48 np0005625204.localdomain sudo[110578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:58:48 np0005625204.localdomain sudo[110578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:58:48 np0005625204.localdomain sudo[110578]: pam_unix(sudo:session): session closed for user root
Feb 20 08:58:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10321 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FAA280000000001030307) 
Feb 20 08:58:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10322 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FB9E90000000001030307) 
Feb 20 08:58:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51047 DF PROTO=TCP SPT=51900 DPT=9105 SEQ=2352136963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FC4E80000000001030307) 
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:58:58 np0005625204.localdomain podman[110593]: 2026-02-20 08:58:58.899780323 +0000 UTC m=+0.086375732 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=)
Feb 20 08:58:58 np0005625204.localdomain podman[110594]: Error: container a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 is not running
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Main process exited, code=exited, status=125/n/a
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed with result 'exit-code'.
Feb 20 08:58:58 np0005625204.localdomain podman[110593]: 2026-02-20 08:58:58.935684803 +0000 UTC m=+0.122280192 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 20 08:58:58 np0005625204.localdomain podman[110593]: unhealthy
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:58:58 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:58:59 np0005625204.localdomain podman[110595]: 2026-02-20 08:58:59.0028472 +0000 UTC m=+0.186642242 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.openshift.expose-services=, container_name=ovn_controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 08:58:59 np0005625204.localdomain podman[110595]: 2026-02-20 08:58:59.041229217 +0000 UTC m=+0.225024279 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 20 08:58:59 np0005625204.localdomain podman[110595]: unhealthy
Feb 20 08:58:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:58:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:58:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51048 DF PROTO=TCP SPT=51900 DPT=9105 SEQ=2352136963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FCCE80000000001030307) 
Feb 20 08:59:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10323 DF PROTO=TCP SPT=59640 DPT=9882 SEQ=2421885162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FD9680000000001030307) 
Feb 20 08:59:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9182 DF PROTO=TCP SPT=40754 DPT=9101 SEQ=255780391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FE6C50000000001030307) 
Feb 20 08:59:08 np0005625204.localdomain sshd[110645]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:59:08 np0005625204.localdomain sshd[110645]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:59:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:59:09 np0005625204.localdomain podman[110647]: 2026-02-20 08:59:09.139313486 +0000 UTC m=+0.079258071 container health_status b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 20 08:59:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9184 DF PROTO=TCP SPT=40754 DPT=9101 SEQ=255780391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FF2E80000000001030307) 
Feb 20 08:59:09 np0005625204.localdomain podman[110647]: 2026-02-20 08:59:09.496059016 +0000 UTC m=+0.436003591 container exec_died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:59:09 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain podman[110483]: time="2026-02-20T08:59:11Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: libpod-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: libpod-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope: Consumed 36.163s CPU time.
Feb 20 08:59:11 np0005625204.localdomain podman[110483]: 2026-02-20 08:59:11.34238018 +0000 UTC m=+42.100069068 container died a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: No such file or directory
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: tmp-crun.qPO3qi.mount: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f0cb971c193396cffe7309d9c21e724fe81066ff3a6d017ab3c7b1cd9fec4cc2-merged.mount: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain podman[110483]: 2026-02-20 08:59:11.400577878 +0000 UTC m=+42.158266736 container cleanup a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_id=tripleo_step5, io.buildah.version=1.41.5)
Feb 20 08:59:11 np0005625204.localdomain podman[110483]: nova_compute
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: No such file or directory
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: No such file or directory
Feb 20 08:59:11 np0005625204.localdomain podman[110670]: 2026-02-20 08:59:11.429215355 +0000 UTC m=+0.075825226 container cleanup a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, architecture=x86_64, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: libpod-conmon-a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.scope: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.timer: No such file or directory
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: Failed to open /run/systemd/transient/a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380.service: No such file or directory
Feb 20 08:59:11 np0005625204.localdomain podman[110686]: 2026-02-20 08:59:11.544992124 +0000 UTC m=+0.078239350 container cleanup a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_compute, release=1766032510, vcs-type=git)
Feb 20 08:59:11 np0005625204.localdomain podman[110686]: nova_compute
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: Stopped nova_compute container.
Feb 20 08:59:11 np0005625204.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.074s CPU time, no IO.
Feb 20 08:59:11 np0005625204.localdomain sudo[110378]: pam_unix(sudo:session): session closed for user root
Feb 20 08:59:11 np0005625204.localdomain sudo[110787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbvskyhmsspcnwkgexoxsphpvhnxoekh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577951.7058754-110-122122881965063/AnsiballZ_systemd_service.py
Feb 20 08:59:11 np0005625204.localdomain sudo[110787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:59:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51050 DF PROTO=TCP SPT=51900 DPT=9105 SEQ=2352136963 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A597FFD680000000001030307) 
Feb 20 08:59:12 np0005625204.localdomain python3.9[110789]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:59:12 np0005625204.localdomain systemd-sysv-generator[110817]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:59:12 np0005625204.localdomain systemd-rc-local-generator[110813]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: Stopping nova_migration_target container...
Feb 20 08:59:12 np0005625204.localdomain sshd[72368]: Received signal 15; terminating.
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: libpod-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope: Deactivated successfully.
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: libpod-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope: Consumed 34.496s CPU time.
Feb 20 08:59:12 np0005625204.localdomain podman[110830]: 2026-02-20 08:59:12.82246478 +0000 UTC m=+0.094400200 container died b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute)
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: Deactivated successfully.
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: No such file or directory
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601-userdata-shm.mount: Deactivated successfully.
Feb 20 08:59:12 np0005625204.localdomain podman[110830]: 2026-02-20 08:59:12.867503213 +0000 UTC m=+0.139438603 container cleanup b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, release=1766032510, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Feb 20 08:59:12 np0005625204.localdomain podman[110830]: nova_migration_target
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: No such file or directory
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: No such file or directory
Feb 20 08:59:12 np0005625204.localdomain podman[110842]: 2026-02-20 08:59:12.902292588 +0000 UTC m=+0.065221607 container cleanup b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-type=git, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 20 08:59:12 np0005625204.localdomain systemd[1]: libpod-conmon-b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.scope: Deactivated successfully.
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.timer: No such file or directory
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: Failed to open /run/systemd/transient/b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601.service: No such file or directory
Feb 20 08:59:13 np0005625204.localdomain podman[110857]: 2026-02-20 08:59:13.015703085 +0000 UTC m=+0.070254333 container cleanup b4fa26ef86f047d7db38c98e55417768dc5498b45700e1aa936413e4da4d4601 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:59:13 np0005625204.localdomain podman[110857]: nova_migration_target
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: Stopped nova_migration_target container.
Feb 20 08:59:13 np0005625204.localdomain sudo[110787]: pam_unix(sudo:session): session closed for user root
Feb 20 08:59:13 np0005625204.localdomain sudo[110957]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayohpmkblswloksylxzurbvrxeozftcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771577953.1678178-110-3664347513233/AnsiballZ_systemd_service.py
Feb 20 08:59:13 np0005625204.localdomain sudo[110957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 08:59:13 np0005625204.localdomain python3.9[110959]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61-merged.mount: Deactivated successfully.
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 08:59:13 np0005625204.localdomain systemd-rc-local-generator[110989]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 08:59:13 np0005625204.localdomain systemd-sysv-generator[110993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 08:59:13 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 08:59:14 np0005625204.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Feb 20 08:59:14 np0005625204.localdomain systemd[1]: libpod-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19.scope: Deactivated successfully.
Feb 20 08:59:14 np0005625204.localdomain podman[111000]: 2026-02-20 08:59:14.233739464 +0000 UTC m=+0.063843325 container died e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtlogd_wrapper, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:31:49Z)
Feb 20 08:59:14 np0005625204.localdomain podman[111000]: 2026-02-20 08:59:14.277715853 +0000 UTC m=+0.107819604 container cleanup e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., container_name=nova_virtlogd_wrapper, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 08:59:14 np0005625204.localdomain podman[111000]: nova_virtlogd_wrapper
Feb 20 08:59:14 np0005625204.localdomain podman[111013]: 2026-02-20 08:59:14.299574989 +0000 UTC m=+0.063078951 container cleanup e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtlogd_wrapper, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible)
Feb 20 08:59:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24-merged.mount: Deactivated successfully.
Feb 20 08:59:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19-userdata-shm.mount: Deactivated successfully.
Feb 20 08:59:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52752 DF PROTO=TCP SPT=55974 DPT=9100 SEQ=3057389671 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59800D680000000001030307) 
Feb 20 08:59:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30174 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1359209721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980134A0000000001030307) 
Feb 20 08:59:18 np0005625204.localdomain sshd[111030]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:59:18 np0005625204.localdomain sshd[111030]: Received disconnect from 96.78.175.36 port 48292:11: Bye Bye [preauth]
Feb 20 08:59:19 np0005625204.localdomain sshd[111030]: Disconnected from authenticating user root 96.78.175.36 port 48292 [preauth]
Feb 20 08:59:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30176 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1359209721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59801F680000000001030307) 
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Activating special unit Exit the Session...
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Removed slice User Background Tasks Slice.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped target Main User Target.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped target Basic System.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped target Paths.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped target Sockets.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped target Timers.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Closed D-Bus User Message Bus Socket.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Stopped Create User's Volatile Files and Directories.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Removed slice User Application Slice.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Reached target Shutdown.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Finished Exit the Session.
Feb 20 08:59:21 np0005625204.localdomain systemd[85653]: Reached target Exit the Session.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: user@0.service: Consumed 3.926s CPU time, no IO.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 20 08:59:21 np0005625204.localdomain systemd[1]: user-0.slice: Consumed 4.864s CPU time.
Feb 20 08:59:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30177 DF PROTO=TCP SPT=57916 DPT=9882 SEQ=1359209721 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59802F280000000001030307) 
Feb 20 08:59:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31478 DF PROTO=TCP SPT=55518 DPT=9105 SEQ=818947730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598039E80000000001030307) 
Feb 20 08:59:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:59:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:59:29 np0005625204.localdomain podman[111034]: 2026-02-20 08:59:29.160873093 +0000 UTC m=+0.088646541 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=ovn_controller, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 20 08:59:29 np0005625204.localdomain podman[111033]: 2026-02-20 08:59:29.201977524 +0000 UTC m=+0.132673414 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, architecture=x86_64, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.expose-services=)
Feb 20 08:59:29 np0005625204.localdomain podman[111034]: 2026-02-20 08:59:29.209018571 +0000 UTC m=+0.136791989 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 20 08:59:29 np0005625204.localdomain podman[111034]: unhealthy
Feb 20 08:59:29 np0005625204.localdomain podman[111033]: 2026-02-20 08:59:29.222041614 +0000 UTC m=+0.152737554 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 20 08:59:29 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:59:29 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:59:29 np0005625204.localdomain podman[111033]: unhealthy
Feb 20 08:59:29 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:59:29 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:59:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31479 DF PROTO=TCP SPT=55518 DPT=9105 SEQ=818947730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598041E90000000001030307) 
Feb 20 08:59:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9196 DF PROTO=TCP SPT=56218 DPT=9102 SEQ=4037064552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59804D690000000001030307) 
Feb 20 08:59:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39615 DF PROTO=TCP SPT=44112 DPT=9101 SEQ=893233341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59805BF60000000001030307) 
Feb 20 08:59:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39617 DF PROTO=TCP SPT=44112 DPT=9101 SEQ=893233341 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598067E80000000001030307) 
Feb 20 08:59:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31481 DF PROTO=TCP SPT=55518 DPT=9105 SEQ=818947730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598071680000000001030307) 
Feb 20 08:59:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52076 DF PROTO=TCP SPT=56640 DPT=9102 SEQ=380876730 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598081690000000001030307) 
Feb 20 08:59:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:59:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:59:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10529 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980887A0000000001030307) 
Feb 20 08:59:48 np0005625204.localdomain sudo[111074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 08:59:48 np0005625204.localdomain sudo[111074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:59:48 np0005625204.localdomain sudo[111074]: pam_unix(sudo:session): session closed for user root
Feb 20 08:59:48 np0005625204.localdomain sudo[111089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 08:59:48 np0005625204.localdomain sudo[111089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:59:49 np0005625204.localdomain sudo[111089]: pam_unix(sudo:session): session closed for user root
Feb 20 08:59:50 np0005625204.localdomain sudo[111137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 08:59:50 np0005625204.localdomain sudo[111137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 08:59:50 np0005625204.localdomain sudo[111137]: pam_unix(sudo:session): session closed for user root
Feb 20 08:59:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10531 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598094680000000001030307) 
Feb 20 08:59:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 08:59:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 08:59:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10532 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980A4280000000001030307) 
Feb 20 08:59:55 np0005625204.localdomain sshd[111152]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 08:59:55 np0005625204.localdomain sshd[111152]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 08:59:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5076 DF PROTO=TCP SPT=47872 DPT=9105 SEQ=1572668428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980AF280000000001030307) 
Feb 20 08:59:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 08:59:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 08:59:59 np0005625204.localdomain podman[111154]: 2026-02-20 08:59:59.400326134 +0000 UTC m=+0.088321812 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 08:59:59 np0005625204.localdomain podman[111154]: 2026-02-20 08:59:59.418776314 +0000 UTC m=+0.106772022 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc.)
Feb 20 08:59:59 np0005625204.localdomain podman[111154]: unhealthy
Feb 20 08:59:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:59:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 08:59:59 np0005625204.localdomain podman[111155]: 2026-02-20 08:59:59.511040896 +0000 UTC m=+0.195942239 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent)
Feb 20 08:59:59 np0005625204.localdomain podman[111155]: 2026-02-20 08:59:59.524017128 +0000 UTC m=+0.208918521 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 08:59:59 np0005625204.localdomain podman[111155]: unhealthy
Feb 20 08:59:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 08:59:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 08:59:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5077 DF PROTO=TCP SPT=47872 DPT=9105 SEQ=1572668428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980B7280000000001030307) 
Feb 20 09:00:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10533 DF PROTO=TCP SPT=43552 DPT=9882 SEQ=227066227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980C5690000000001030307) 
Feb 20 09:00:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36737 DF PROTO=TCP SPT=57714 DPT=9101 SEQ=3142129976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980D1240000000001030307) 
Feb 20 09:00:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36739 DF PROTO=TCP SPT=57714 DPT=9101 SEQ=3142129976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980DD280000000001030307) 
Feb 20 09:00:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5079 DF PROTO=TCP SPT=47872 DPT=9105 SEQ=1572668428 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980E7690000000001030307) 
Feb 20 09:00:15 np0005625204.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 20 09:00:15 np0005625204.localdomain recover_tripleo_nova_virtqemud[111196]: 63005
Feb 20 09:00:15 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 20 09:00:15 np0005625204.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 20 09:00:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=278 DF PROTO=TCP SPT=33934 DPT=9100 SEQ=2071844979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980F7690000000001030307) 
Feb 20 09:00:16 np0005625204.localdomain sshd[111197]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:00:17 np0005625204.localdomain sshd[111197]: Invalid user sol from 45.148.10.240 port 44672
Feb 20 09:00:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42199 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5980FDAA0000000001030307) 
Feb 20 09:00:17 np0005625204.localdomain sshd[111197]: Connection closed by invalid user sol 45.148.10.240 port 44672 [preauth]
Feb 20 09:00:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42201 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598109A80000000001030307) 
Feb 20 09:00:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42202 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598119680000000001030307) 
Feb 20 09:00:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19165 DF PROTO=TCP SPT=42918 DPT=9105 SEQ=2861299012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598124680000000001030307) 
Feb 20 09:00:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 09:00:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 09:00:29 np0005625204.localdomain podman[111200]: 2026-02-20 09:00:29.640821957 +0000 UTC m=+0.073204824 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent)
Feb 20 09:00:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19166 DF PROTO=TCP SPT=42918 DPT=9105 SEQ=2861299012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59812C680000000001030307) 
Feb 20 09:00:29 np0005625204.localdomain podman[111199]: 2026-02-20 09:00:29.704824046 +0000 UTC m=+0.137757830 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, version=17.1.13, distribution-scope=public, release=1766032510, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, 
architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 09:00:29 np0005625204.localdomain podman[111199]: 2026-02-20 09:00:29.717915641 +0000 UTC m=+0.150849435 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.openshift.expose-services=, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 20 09:00:29 np0005625204.localdomain podman[111199]: unhealthy
Feb 20 09:00:29 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:00:29 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 09:00:29 np0005625204.localdomain podman[111200]: 2026-02-20 09:00:29.730972085 +0000 UTC m=+0.163354912 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public)
Feb 20 09:00:29 np0005625204.localdomain podman[111200]: unhealthy
Feb 20 09:00:29 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:00:29 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 09:00:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42203 DF PROTO=TCP SPT=44354 DPT=9882 SEQ=3409662979 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598139680000000001030307) 
Feb 20 09:00:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54777 DF PROTO=TCP SPT=49322 DPT=9101 SEQ=2158845858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598146540000000001030307) 
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 62225 (conmon) with signal SIGKILL.
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: libpod-conmon-e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19.scope: Deactivated successfully.
Feb 20 09:00:38 np0005625204.localdomain podman[111251]: error opening file `/run/crun/e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19/status`: No such file or directory
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: tmp-crun.xCxWUS.mount: Deactivated successfully.
Feb 20 09:00:38 np0005625204.localdomain podman[111239]: 2026-02-20 09:00:38.398126425 +0000 UTC m=+0.082351676 container cleanup e975d0a86c5e9fe26deea4425eaa0740c3112c236f6abb739ab5b827518b7d19 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:00:38 np0005625204.localdomain podman[111239]: nova_virtlogd_wrapper
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Feb 20 09:00:38 np0005625204.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Feb 20 09:00:38 np0005625204.localdomain sudo[110957]: pam_unix(sudo:session): session closed for user root
Feb 20 09:00:38 np0005625204.localdomain sudo[111343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yewfabotfvjyjlnyhwsmtiinmocxzzhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578038.5839057-110-143667106662388/AnsiballZ_systemd_service.py
Feb 20 09:00:38 np0005625204.localdomain sudo[111343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:00:39 np0005625204.localdomain python3.9[111345]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:00:39 np0005625204.localdomain systemd-sysv-generator[111378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:00:39 np0005625204.localdomain systemd-rc-local-generator[111372]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:00:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54779 DF PROTO=TCP SPT=49322 DPT=9101 SEQ=2158845858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598152680000000001030307) 
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: libpod-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope: Deactivated successfully.
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: libpod-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope: Consumed 1.496s CPU time.
Feb 20 09:00:39 np0005625204.localdomain podman[111386]: 2026-02-20 09:00:39.620714085 +0000 UTC m=+0.056520868 container died b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vcs-type=git, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, container_name=nova_virtnodedevd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 20 09:00:39 np0005625204.localdomain podman[111386]: 2026-02-20 09:00:39.659817254 +0000 UTC m=+0.095624057 container cleanup b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, container_name=nova_virtnodedevd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 20 09:00:39 np0005625204.localdomain podman[111386]: nova_virtnodedevd
Feb 20 09:00:39 np0005625204.localdomain podman[111399]: 2026-02-20 09:00:39.712424011 +0000 UTC m=+0.076056593 container cleanup b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, container_name=nova_virtnodedevd, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, url=https://www.redhat.com, managed_by=tripleo_ansible, config_id=tripleo_step3, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: libpod-conmon-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1.scope: Deactivated successfully.
Feb 20 09:00:39 np0005625204.localdomain podman[111428]: error opening file `/run/crun/b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1/status`: No such file or directory
Feb 20 09:00:39 np0005625204.localdomain podman[111416]: 2026-02-20 09:00:39.80198342 +0000 UTC m=+0.064814696 container cleanup b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtnodedevd, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, batch=17.1_20260112.1)
Feb 20 09:00:39 np0005625204.localdomain podman[111416]: nova_virtnodedevd
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Feb 20 09:00:39 np0005625204.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Feb 20 09:00:39 np0005625204.localdomain sudo[111343]: pam_unix(sudo:session): session closed for user root
Feb 20 09:00:40 np0005625204.localdomain sudo[111519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snjvmnxepxlnbsqrkxmuisxomyfecotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578039.993122-110-81043006559863/AnsiballZ_systemd_service.py
Feb 20 09:00:40 np0005625204.localdomain sudo[111519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:00:40 np0005625204.localdomain python3.9[111521]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:00:40 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:00:40 np0005625204.localdomain systemd-sysv-generator[111555]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:00:40 np0005625204.localdomain systemd-rc-local-generator[111551]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:00:40 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:00:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d-merged.mount: Deactivated successfully.
Feb 20 09:00:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b55023cf95a6c54b5f7b39ef0c0145f45f7f1fb4d09c6f5bf4c14d9fafa348d1-userdata-shm.mount: Deactivated successfully.
Feb 20 09:00:40 np0005625204.localdomain systemd[1]: Stopping nova_virtproxyd container...
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: libpod-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34.scope: Deactivated successfully.
Feb 20 09:00:41 np0005625204.localdomain podman[111562]: 2026-02-20 09:00:41.036926731 +0000 UTC m=+0.083284695 container died e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.expose-services=, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, container_name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc.)
Feb 20 09:00:41 np0005625204.localdomain podman[111562]: 2026-02-20 09:00:41.081444317 +0000 UTC m=+0.127802281 container cleanup e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, container_name=nova_virtproxyd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 09:00:41 np0005625204.localdomain podman[111562]: nova_virtproxyd
Feb 20 09:00:41 np0005625204.localdomain podman[111577]: 2026-02-20 09:00:41.11385802 +0000 UTC m=+0.068715776 container cleanup e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, distribution-scope=public, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtproxyd, managed_by=tripleo_ansible)
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: libpod-conmon-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34.scope: Deactivated successfully.
Feb 20 09:00:41 np0005625204.localdomain podman[111606]: error opening file `/run/crun/e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34/status`: No such file or directory
Feb 20 09:00:41 np0005625204.localdomain podman[111595]: 2026-02-20 09:00:41.208698492 +0000 UTC m=+0.057736256 container cleanup e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtproxyd, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com)
Feb 20 09:00:41 np0005625204.localdomain podman[111595]: nova_virtproxyd
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: Stopped nova_virtproxyd container.
Feb 20 09:00:41 np0005625204.localdomain sshd[111608]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:00:41 np0005625204.localdomain sudo[111519]: pam_unix(sudo:session): session closed for user root
Feb 20 09:00:41 np0005625204.localdomain sshd[111608]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:00:41 np0005625204.localdomain sudo[111699]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnnkbqdgbfupfxsoiyqekeluxhypwdge ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578041.3729606-110-257157316657081/AnsiballZ_systemd_service.py
Feb 20 09:00:41 np0005625204.localdomain sudo[111699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: tmp-crun.mmFv9A.mount: Deactivated successfully.
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe-merged.mount: Deactivated successfully.
Feb 20 09:00:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9d005a490407a34bd1203c5a34245150ff3d78c6922994131011eba383c8c34-userdata-shm.mount: Deactivated successfully.
Feb 20 09:00:41 np0005625204.localdomain python3.9[111701]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:00:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19168 DF PROTO=TCP SPT=42918 DPT=9105 SEQ=2861299012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59815D680000000001030307) 
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:00:43 np0005625204.localdomain systemd-rc-local-generator[111729]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:00:43 np0005625204.localdomain systemd-sysv-generator[111733]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: Stopping nova_virtqemud container...
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: libpod-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope: Deactivated successfully.
Feb 20 09:00:43 np0005625204.localdomain systemd[1]: libpod-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope: Consumed 2.746s CPU time.
Feb 20 09:00:43 np0005625204.localdomain podman[111742]: 2026-02-20 09:00:43.437958195 +0000 UTC m=+0.074970469 container died 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step3, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, container_name=nova_virtqemud, managed_by=tripleo_ansible, tcib_managed=true)
Feb 20 09:00:43 np0005625204.localdomain podman[111742]: 2026-02-20 09:00:43.462853514 +0000 UTC m=+0.099865728 container cleanup 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=nova_virtqemud, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, maintainer=OpenStack TripleO Team)
Feb 20 09:00:43 np0005625204.localdomain podman[111742]: nova_virtqemud
Feb 20 09:00:43 np0005625204.localdomain podman[111755]: 2026-02-20 09:00:43.51575483 +0000 UTC m=+0.064700611 container cleanup 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, container_name=nova_virtqemud, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt)
Feb 20 09:00:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260-merged.mount: Deactivated successfully.
Feb 20 09:00:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69-userdata-shm.mount: Deactivated successfully.
Feb 20 09:00:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59488 DF PROTO=TCP SPT=39078 DPT=9102 SEQ=3151336319 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59816B680000000001030307) 
Feb 20 09:00:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37037 DF PROTO=TCP SPT=38760 DPT=9882 SEQ=89716857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598172DA0000000001030307) 
Feb 20 09:00:50 np0005625204.localdomain sudo[111770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:00:50 np0005625204.localdomain sudo[111770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:00:50 np0005625204.localdomain sudo[111770]: pam_unix(sudo:session): session closed for user root
Feb 20 09:00:50 np0005625204.localdomain sudo[111785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:00:50 np0005625204.localdomain sudo[111785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:00:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37039 DF PROTO=TCP SPT=38760 DPT=9882 SEQ=89716857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59817EE80000000001030307) 
Feb 20 09:00:51 np0005625204.localdomain sudo[111785]: pam_unix(sudo:session): session closed for user root
Feb 20 09:00:53 np0005625204.localdomain sshd[111831]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:00:54 np0005625204.localdomain sudo[111833]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:00:54 np0005625204.localdomain sudo[111833]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:00:54 np0005625204.localdomain sudo[111833]: pam_unix(sudo:session): session closed for user root
Feb 20 09:00:54 np0005625204.localdomain sshd[111831]: Invalid user claude from 27.112.79.3 port 36340
Feb 20 09:00:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37040 DF PROTO=TCP SPT=38760 DPT=9882 SEQ=89716857 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59818EA80000000001030307) 
Feb 20 09:00:54 np0005625204.localdomain sshd[111831]: Received disconnect from 27.112.79.3 port 36340:11: Bye Bye [preauth]
Feb 20 09:00:54 np0005625204.localdomain sshd[111831]: Disconnected from invalid user claude 27.112.79.3 port 36340 [preauth]
Feb 20 09:00:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52129 DF PROTO=TCP SPT=45840 DPT=9105 SEQ=470536505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598199680000000001030307) 
Feb 20 09:00:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52130 DF PROTO=TCP SPT=45840 DPT=9105 SEQ=470536505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981A1690000000001030307) 
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: tmp-crun.OKadpE.mount: Deactivated successfully.
Feb 20 09:00:59 np0005625204.localdomain podman[111849]: 2026-02-20 09:00:59.913113079 +0000 UTC m=+0.097346801 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 09:00:59 np0005625204.localdomain podman[111849]: 2026-02-20 09:00:59.934071737 +0000 UTC m=+0.118305529 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 09:00:59 np0005625204.localdomain podman[111849]: unhealthy
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: tmp-crun.FsmNeX.mount: Deactivated successfully.
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 09:00:59 np0005625204.localdomain podman[111848]: 2026-02-20 09:00:59.955285603 +0000 UTC m=+0.139870175 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4)
Feb 20 09:00:59 np0005625204.localdomain podman[111848]: 2026-02-20 09:00:59.969860204 +0000 UTC m=+0.154444776 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, 
name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 09:00:59 np0005625204.localdomain podman[111848]: unhealthy
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:00:59 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 09:01:01 np0005625204.localdomain CROND[111890]: (root) CMD (run-parts /etc/cron.hourly)
Feb 20 09:01:01 np0005625204.localdomain run-parts[111893]: (/etc/cron.hourly) starting 0anacron
Feb 20 09:01:01 np0005625204.localdomain run-parts[111899]: (/etc/cron.hourly) finished 0anacron
Feb 20 09:01:01 np0005625204.localdomain CROND[111889]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 20 09:01:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19335 DF PROTO=TCP SPT=53684 DPT=9100 SEQ=2478429227 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981AD680000000001030307) 
Feb 20 09:01:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5605 DF PROTO=TCP SPT=42846 DPT=9101 SEQ=3351586781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981BB840000000001030307) 
Feb 20 09:01:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5607 DF PROTO=TCP SPT=42846 DPT=9101 SEQ=3351586781 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981C7A80000000001030307) 
Feb 20 09:01:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52132 DF PROTO=TCP SPT=45840 DPT=9105 SEQ=470536505 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981D1680000000001030307) 
Feb 20 09:01:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=319 DF PROTO=TCP SPT=34466 DPT=9100 SEQ=2865431081 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981E1680000000001030307) 
Feb 20 09:01:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2780 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981E80A0000000001030307) 
Feb 20 09:01:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2782 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5981F4280000000001030307) 
Feb 20 09:01:24 np0005625204.localdomain sshd[111900]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:01:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2783 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598203E80000000001030307) 
Feb 20 09:01:25 np0005625204.localdomain sshd[111902]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:01:25 np0005625204.localdomain sshd[111902]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:01:26 np0005625204.localdomain sshd[111900]: Invalid user n8n from 103.157.25.4 port 43278
Feb 20 09:01:26 np0005625204.localdomain sshd[111900]: Received disconnect from 103.157.25.4 port 43278:11: Bye Bye [preauth]
Feb 20 09:01:26 np0005625204.localdomain sshd[111900]: Disconnected from invalid user n8n 103.157.25.4 port 43278 [preauth]
Feb 20 09:01:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18927 DF PROTO=TCP SPT=49522 DPT=9105 SEQ=155240626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59820EA80000000001030307) 
Feb 20 09:01:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18928 DF PROTO=TCP SPT=49522 DPT=9105 SEQ=155240626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598216A80000000001030307) 
Feb 20 09:01:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 09:01:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 09:01:30 np0005625204.localdomain podman[111904]: 2026-02-20 09:01:30.158005061 +0000 UTC m=+0.090653273 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 09:01:30 np0005625204.localdomain podman[111904]: 2026-02-20 09:01:30.172466358 +0000 UTC m=+0.105114570 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, version=17.1.13, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 20 09:01:30 np0005625204.localdomain podman[111904]: unhealthy
Feb 20 09:01:30 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:01:30 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 09:01:30 np0005625204.localdomain podman[111905]: 2026-02-20 09:01:30.260587063 +0000 UTC m=+0.189024755 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 20 09:01:30 np0005625204.localdomain podman[111905]: 2026-02-20 09:01:30.275065861 +0000 UTC m=+0.203503533 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4)
Feb 20 09:01:30 np0005625204.localdomain podman[111905]: unhealthy
Feb 20 09:01:30 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:01:30 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 09:01:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2784 DF PROTO=TCP SPT=55480 DPT=9882 SEQ=2817686025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598223680000000001030307) 
Feb 20 09:01:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62345 DF PROTO=TCP SPT=48546 DPT=9101 SEQ=1091139209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598230B40000000001030307) 
Feb 20 09:01:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62347 DF PROTO=TCP SPT=48546 DPT=9101 SEQ=1091139209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59823CA80000000001030307) 
Feb 20 09:01:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18930 DF PROTO=TCP SPT=49522 DPT=9105 SEQ=155240626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598247680000000001030307) 
Feb 20 09:01:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61083 DF PROTO=TCP SPT=44978 DPT=9102 SEQ=1875276320 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598257680000000001030307) 
Feb 20 09:01:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5633 DF PROTO=TCP SPT=44844 DPT=9882 SEQ=2640199640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59825D3A0000000001030307) 
Feb 20 09:01:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5635 DF PROTO=TCP SPT=44844 DPT=9882 SEQ=2640199640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598269280000000001030307) 
Feb 20 09:01:54 np0005625204.localdomain sudo[111943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:01:54 np0005625204.localdomain sudo[111943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:01:54 np0005625204.localdomain sudo[111943]: pam_unix(sudo:session): session closed for user root
Feb 20 09:01:54 np0005625204.localdomain sudo[111958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 09:01:54 np0005625204.localdomain sudo[111958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:01:54 np0005625204.localdomain sudo[111958]: pam_unix(sudo:session): session closed for user root
Feb 20 09:01:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5636 DF PROTO=TCP SPT=44844 DPT=9882 SEQ=2640199640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598278E80000000001030307) 
Feb 20 09:01:54 np0005625204.localdomain sudo[111994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:01:54 np0005625204.localdomain sudo[111994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:01:54 np0005625204.localdomain sudo[111994]: pam_unix(sudo:session): session closed for user root
Feb 20 09:01:54 np0005625204.localdomain sudo[112009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:01:54 np0005625204.localdomain sudo[112009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:01:55 np0005625204.localdomain sudo[112009]: pam_unix(sudo:session): session closed for user root
Feb 20 09:01:56 np0005625204.localdomain sudo[112056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:01:56 np0005625204.localdomain sudo[112056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:01:56 np0005625204.localdomain sudo[112056]: pam_unix(sudo:session): session closed for user root
Feb 20 09:01:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51558 DF PROTO=TCP SPT=39554 DPT=9105 SEQ=2002411649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598283E80000000001030307) 
Feb 20 09:01:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51559 DF PROTO=TCP SPT=39554 DPT=9105 SEQ=2002411649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59828BE80000000001030307) 
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: tmp-crun.mQoQL5.mount: Deactivated successfully.
Feb 20 09:02:00 np0005625204.localdomain podman[112072]: 2026-02-20 09:02:00.653618986 +0000 UTC m=+0.086107294 container health_status 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git)
Feb 20 09:02:00 np0005625204.localdomain podman[112072]: 2026-02-20 09:02:00.700138143 +0000 UTC m=+0.132626531 container exec_died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 09:02:00 np0005625204.localdomain podman[112072]: unhealthy
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed with result 'exit-code'.
Feb 20 09:02:00 np0005625204.localdomain podman[112071]: 2026-02-20 09:02:00.743522175 +0000 UTC m=+0.178113338 container health_status 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, release=1766032510, container_name=ovn_controller, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 20 09:02:00 np0005625204.localdomain podman[112071]: 2026-02-20 09:02:00.78798197 +0000 UTC m=+0.222573143 container exec_died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4)
Feb 20 09:02:00 np0005625204.localdomain podman[112071]: unhealthy
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:02:00 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed with result 'exit-code'.
Feb 20 09:02:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63526 DF PROTO=TCP SPT=58314 DPT=9100 SEQ=3974477460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598297680000000001030307) 
Feb 20 09:02:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14847 DF PROTO=TCP SPT=39812 DPT=9101 SEQ=709151425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982A5E70000000001030307) 
Feb 20 09:02:07 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud.service: State 'stop-sigterm' timed out. Killing.
Feb 20 09:02:07 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud.service: Killing process 63001 (conmon) with signal SIGKILL.
Feb 20 09:02:07 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud.service: Main process exited, code=killed, status=9/KILL
Feb 20 09:02:07 np0005625204.localdomain systemd[1]: libpod-conmon-0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69.scope: Deactivated successfully.
Feb 20 09:02:07 np0005625204.localdomain podman[112124]: error opening file `/run/crun/0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69/status`: No such file or directory
Feb 20 09:02:07 np0005625204.localdomain podman[112111]: 2026-02-20 09:02:07.609687539 +0000 UTC m=+0.049602635 container cleanup 0ab0f62f93ce19c96a67772669c1678b354b0e5f8befa3ec687f29c3f3472a69 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 20 09:02:07 np0005625204.localdomain podman[112111]: nova_virtqemud
Feb 20 09:02:07 np0005625204.localdomain systemd[1]: tripleo_nova_virtqemud.service: Failed with result 'timeout'.
Feb 20 09:02:07 np0005625204.localdomain systemd[1]: Stopped nova_virtqemud container.
Feb 20 09:02:07 np0005625204.localdomain sudo[111699]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:08 np0005625204.localdomain sudo[112215]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uteqjvagijvbcojvjngkknnbhayqypyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578127.802588-110-59677814641642/AnsiballZ_systemd_service.py
Feb 20 09:02:08 np0005625204.localdomain sudo[112215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:02:08 np0005625204.localdomain python3.9[112217]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:02:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14849 DF PROTO=TCP SPT=39812 DPT=9101 SEQ=709151425 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982B1E80000000001030307) 
Feb 20 09:02:09 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:02:09 np0005625204.localdomain systemd-rc-local-generator[112244]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:02:09 np0005625204.localdomain systemd-sysv-generator[112249]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:02:09 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:02:09 np0005625204.localdomain sshd[112255]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:02:09 np0005625204.localdomain sudo[112215]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:09 np0005625204.localdomain sshd[112255]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:02:10 np0005625204.localdomain sudo[112346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shurwgqnscneekwsphywlhssulzdscns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578129.901649-110-199479037534552/AnsiballZ_systemd_service.py
Feb 20 09:02:10 np0005625204.localdomain sudo[112346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:02:10 np0005625204.localdomain python3.9[112348]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:02:10 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:02:10 np0005625204.localdomain systemd-rc-local-generator[112372]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:02:10 np0005625204.localdomain systemd-sysv-generator[112376]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:02:10 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:02:10 np0005625204.localdomain systemd[1]: Stopping nova_virtsecretd container...
Feb 20 09:02:10 np0005625204.localdomain systemd[1]: tmp-crun.hwfvmZ.mount: Deactivated successfully.
Feb 20 09:02:10 np0005625204.localdomain systemd[1]: libpod-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2.scope: Deactivated successfully.
Feb 20 09:02:10 np0005625204.localdomain podman[112389]: 2026-02-20 09:02:10.934434943 +0000 UTC m=+0.079496129 container died c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtsecretd)
Feb 20 09:02:10 np0005625204.localdomain podman[112389]: 2026-02-20 09:02:10.975351648 +0000 UTC m=+0.120412814 container cleanup c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, batch=17.1_20260112.1)
Feb 20 09:02:10 np0005625204.localdomain podman[112389]: nova_virtsecretd
Feb 20 09:02:10 np0005625204.localdomain podman[112402]: 2026-02-20 09:02:10.994310764 +0000 UTC m=+0.051864664 container cleanup c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, release=1766032510, container_name=nova_virtsecretd, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, build-date=2026-01-12T23:31:49Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64)
Feb 20 09:02:11 np0005625204.localdomain systemd[1]: libpod-conmon-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2.scope: Deactivated successfully.
Feb 20 09:02:11 np0005625204.localdomain podman[112429]: error opening file `/run/crun/c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2/status`: No such file or directory
Feb 20 09:02:11 np0005625204.localdomain podman[112418]: 2026-02-20 09:02:11.08926535 +0000 UTC m=+0.063898047 container cleanup c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, container_name=nova_virtsecretd, io.buildah.version=1.41.5, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 09:02:11 np0005625204.localdomain podman[112418]: nova_virtsecretd
Feb 20 09:02:11 np0005625204.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Feb 20 09:02:11 np0005625204.localdomain systemd[1]: Stopped nova_virtsecretd container.
Feb 20 09:02:11 np0005625204.localdomain sudo[112346]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:11 np0005625204.localdomain sudo[112522]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qubacugvggghufwzijbfcgqwgwcqtkid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578131.2279892-110-255558859025654/AnsiballZ_systemd_service.py
Feb 20 09:02:11 np0005625204.localdomain sudo[112522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:02:11 np0005625204.localdomain python3.9[112524]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:02:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51561 DF PROTO=TCP SPT=39554 DPT=9105 SEQ=2002411649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982BB680000000001030307) 
Feb 20 09:02:11 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:02:11 np0005625204.localdomain systemd-rc-local-generator[112551]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:02:11 np0005625204.localdomain systemd-sysv-generator[112556]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95-merged.mount: Deactivated successfully.
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c8bc816de19434ded1ec1bf0b75428c1b31cd0ca585f42623a6edebefa3517b2-userdata-shm.mount: Deactivated successfully.
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: Stopping nova_virtstoraged container...
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: libpod-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108.scope: Deactivated successfully.
Feb 20 09:02:12 np0005625204.localdomain podman[112565]: 2026-02-20 09:02:12.261401889 +0000 UTC m=+0.077580249 container died 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtstoraged, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=)
Feb 20 09:02:12 np0005625204.localdomain podman[112565]: 2026-02-20 09:02:12.302568642 +0000 UTC m=+0.118746992 container cleanup 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, architecture=x86_64, container_name=nova_virtstoraged, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3)
Feb 20 09:02:12 np0005625204.localdomain podman[112565]: nova_virtstoraged
Feb 20 09:02:12 np0005625204.localdomain podman[112582]: 2026-02-20 09:02:12.342891349 +0000 UTC m=+0.070406908 container cleanup 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', 
'/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, container_name=nova_virtstoraged)
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: libpod-conmon-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108.scope: Deactivated successfully.
Feb 20 09:02:12 np0005625204.localdomain podman[112609]: error opening file `/run/crun/025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108/status`: No such file or directory
Feb 20 09:02:12 np0005625204.localdomain podman[112598]: 2026-02-20 09:02:12.443333044 +0000 UTC m=+0.068759396 container cleanup 025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=nova_virtstoraged, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, config_id=tripleo_step3, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '6f2a8ada21c5a8beb0844e05e372be87'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, com.redhat.component=openstack-nova-libvirt-container)
Feb 20 09:02:12 np0005625204.localdomain podman[112598]: nova_virtstoraged
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Feb 20 09:02:12 np0005625204.localdomain systemd[1]: Stopped nova_virtstoraged container.
Feb 20 09:02:12 np0005625204.localdomain sudo[112522]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:12 np0005625204.localdomain sudo[112700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfpltiadeosgrypuvjjfxjcoxninwcjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578132.5910807-110-198507702770462/AnsiballZ_systemd_service.py
Feb 20 09:02:12 np0005625204.localdomain sudo[112700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: tmp-crun.4g6Y7y.mount: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5-merged.mount: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-025ebe68be8a7e031c777f7c63ec239d5b5cc52b19910121acc0793bb59ae108-userdata-shm.mount: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain python3.9[112702]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:02:13 np0005625204.localdomain systemd-rc-local-generator[112732]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:02:13 np0005625204.localdomain systemd-sysv-generator[112735]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: Stopping ovn_controller container...
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: libpod-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: libpod-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope: Consumed 2.699s CPU time.
Feb 20 09:02:13 np0005625204.localdomain podman[112744]: 2026-02-20 09:02:13.641286552 +0000 UTC m=+0.080740407 container died 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13)
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: No such file or directory
Feb 20 09:02:13 np0005625204.localdomain podman[112744]: 2026-02-20 09:02:13.680183985 +0000 UTC m=+0.119637820 container cleanup 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, container_name=ovn_controller, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 20 09:02:13 np0005625204.localdomain podman[112744]: ovn_controller
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: No such file or directory
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: No such file or directory
Feb 20 09:02:13 np0005625204.localdomain podman[112758]: 2026-02-20 09:02:13.698200222 +0000 UTC m=+0.050882234 container cleanup 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public)
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: libpod-conmon-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.scope: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.timer: No such file or directory
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: Failed to open /run/systemd/transient/0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850.service: No such file or directory
Feb 20 09:02:13 np0005625204.localdomain podman[112773]: 2026-02-20 09:02:13.786233694 +0000 UTC m=+0.060419449 container cleanup 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-type=git, description=Red Hat OpenStack Platform 
17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 20 09:02:13 np0005625204.localdomain podman[112773]: ovn_controller
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Feb 20 09:02:13 np0005625204.localdomain systemd[1]: Stopped ovn_controller container.
Feb 20 09:02:13 np0005625204.localdomain sudo[112700]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-9d2a27c37c1e0aa5be6fdab947882ef1f426e5cc1bd21c037426b7439e8b098c-merged.mount: Deactivated successfully.
Feb 20 09:02:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850-userdata-shm.mount: Deactivated successfully.
Feb 20 09:02:14 np0005625204.localdomain sudo[112875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhavxkzashqcxkumbkryhyatnvrsjupn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578133.9232285-110-190113021165914/AnsiballZ_systemd_service.py
Feb 20 09:02:14 np0005625204.localdomain sudo[112875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:02:14 np0005625204.localdomain python3.9[112877]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:02:14 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:02:14 np0005625204.localdomain systemd-rc-local-generator[112905]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:02:14 np0005625204.localdomain systemd-sysv-generator[112910]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:02:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:02:14 np0005625204.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: libpod-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope: Deactivated successfully.
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: libpod-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope: Consumed 11.444s CPU time.
Feb 20 09:02:15 np0005625204.localdomain podman[112917]: 2026-02-20 09:02:15.26491023 +0000 UTC m=+0.369471813 container died 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: tmp-crun.UqVjfM.mount: Deactivated successfully.
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: Deactivated successfully.
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: No such file or directory
Feb 20 09:02:15 np0005625204.localdomain sshd[112936]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59-userdata-shm.mount: Deactivated successfully.
Feb 20 09:02:15 np0005625204.localdomain podman[112917]: 2026-02-20 09:02:15.339991112 +0000 UTC m=+0.444552645 container cleanup 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=ovn_metadata_agent, tcib_managed=true, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 09:02:15 np0005625204.localdomain podman[112917]: ovn_metadata_agent
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: No such file or directory
Feb 20 09:02:15 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: No such file or directory
Feb 20 09:02:15 np0005625204.localdomain podman[112929]: 2026-02-20 09:02:15.364140269 +0000 UTC m=+0.094545465 container cleanup 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., version=17.1.13, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 20 09:02:15 np0005625204.localdomain sshd[112936]: Invalid user brandon from 96.78.175.36 port 45380
Feb 20 09:02:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13410 DF PROTO=TCP SPT=54536 DPT=9102 SEQ=1882484095 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982CB680000000001030307) 
Feb 20 09:02:15 np0005625204.localdomain sshd[112936]: Received disconnect from 96.78.175.36 port 45380:11: Bye Bye [preauth]
Feb 20 09:02:15 np0005625204.localdomain sshd[112936]: Disconnected from invalid user brandon 96.78.175.36 port 45380 [preauth]
Feb 20 09:02:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d98c062b21764b21e0b6595874844668fb8ff8886b054dd456077eeaff5c7e50-merged.mount: Deactivated successfully.
Feb 20 09:02:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25599 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982D26A0000000001030307) 
Feb 20 09:02:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25601 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982DE680000000001030307) 
Feb 20 09:02:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25602 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982EE280000000001030307) 
Feb 20 09:02:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15112 DF PROTO=TCP SPT=55516 DPT=9105 SEQ=3645250186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5982F9280000000001030307) 
Feb 20 09:02:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15113 DF PROTO=TCP SPT=55516 DPT=9105 SEQ=3645250186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598301280000000001030307) 
Feb 20 09:02:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25603 DF PROTO=TCP SPT=59982 DPT=9882 SEQ=1976116075 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59830F680000000001030307) 
Feb 20 09:02:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19912 DF PROTO=TCP SPT=50874 DPT=9101 SEQ=1703013491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59831B150000000001030307) 
Feb 20 09:02:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19914 DF PROTO=TCP SPT=50874 DPT=9101 SEQ=1703013491 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598327280000000001030307) 
Feb 20 09:02:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15115 DF PROTO=TCP SPT=55516 DPT=9105 SEQ=3645250186 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598331680000000001030307) 
Feb 20 09:02:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28307 DF PROTO=TCP SPT=33502 DPT=9100 SEQ=1954362177 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598341680000000001030307) 
Feb 20 09:02:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61440 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983479B0000000001030307) 
Feb 20 09:02:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61442 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598353A80000000001030307) 
Feb 20 09:02:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61443 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598363690000000001030307) 
Feb 20 09:02:55 np0005625204.localdomain sshd[112950]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:02:55 np0005625204.localdomain sshd[112950]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:02:56 np0005625204.localdomain sudo[112952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:02:56 np0005625204.localdomain sudo[112952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:02:56 np0005625204.localdomain sudo[112952]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:56 np0005625204.localdomain sudo[112967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:02:56 np0005625204.localdomain sudo[112967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:02:57 np0005625204.localdomain sudo[112967]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:57 np0005625204.localdomain sshd[113014]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:02:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14898 DF PROTO=TCP SPT=33676 DPT=9105 SEQ=2657743949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59836E290000000001030307) 
Feb 20 09:02:57 np0005625204.localdomain sudo[113016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:02:57 np0005625204.localdomain sudo[113016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:02:57 np0005625204.localdomain sudo[113016]: pam_unix(sudo:session): session closed for user root
Feb 20 09:02:58 np0005625204.localdomain sshd[113014]: Invalid user sol from 45.148.10.240 port 53222
Feb 20 09:02:58 np0005625204.localdomain sshd[113014]: Connection closed by invalid user sol 45.148.10.240 port 53222 [preauth]
Feb 20 09:02:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14899 DF PROTO=TCP SPT=33676 DPT=9105 SEQ=2657743949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598376280000000001030307) 
Feb 20 09:03:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61444 DF PROTO=TCP SPT=34920 DPT=9882 SEQ=1211461637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598383680000000001030307) 
Feb 20 09:03:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61647 DF PROTO=TCP SPT=47776 DPT=9101 SEQ=1524675634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598390450000000001030307) 
Feb 20 09:03:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61649 DF PROTO=TCP SPT=47776 DPT=9101 SEQ=1524675634 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59839C680000000001030307) 
Feb 20 09:03:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14901 DF PROTO=TCP SPT=33676 DPT=9105 SEQ=2657743949 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983A5680000000001030307) 
Feb 20 09:03:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32104 DF PROTO=TCP SPT=55654 DPT=9102 SEQ=654630034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983B5680000000001030307) 
Feb 20 09:03:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6701 DF PROTO=TCP SPT=39064 DPT=9882 SEQ=1798188793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983BCCA0000000001030307) 
Feb 20 09:03:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6703 DF PROTO=TCP SPT=39064 DPT=9882 SEQ=1798188793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983C8E80000000001030307) 
Feb 20 09:03:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6704 DF PROTO=TCP SPT=39064 DPT=9882 SEQ=1798188793 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983D8A80000000001030307) 
Feb 20 09:03:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57140 DF PROTO=TCP SPT=34470 DPT=9105 SEQ=2681173294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983E3680000000001030307) 
Feb 20 09:03:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57141 DF PROTO=TCP SPT=34470 DPT=9105 SEQ=2681173294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983EB680000000001030307) 
Feb 20 09:03:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24433 DF PROTO=TCP SPT=41248 DPT=9100 SEQ=437128031 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5983F7690000000001030307) 
Feb 20 09:03:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27182 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3234130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598405750000000001030307) 
Feb 20 09:03:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27184 DF PROTO=TCP SPT=45110 DPT=9101 SEQ=3234130462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598411680000000001030307) 
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: State 'stop-sigterm' timed out. Killing.
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Killing process 72703 (conmon) with signal SIGKILL.
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Main process exited, code=killed, status=9/KILL
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: libpod-conmon-8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.scope: Deactivated successfully.
Feb 20 09:03:39 np0005625204.localdomain podman[113044]: error opening file `/run/crun/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59/status`: No such file or directory
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.timer: No such file or directory
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: Failed to open /run/systemd/transient/8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59.service: No such file or directory
Feb 20 09:03:39 np0005625204.localdomain podman[113031]: 2026-02-20 09:03:39.658677334 +0000 UTC m=+0.087925999 container cleanup 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4)
Feb 20 09:03:39 np0005625204.localdomain podman[113031]: ovn_metadata_agent
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Failed with result 'timeout'.
Feb 20 09:03:39 np0005625204.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Feb 20 09:03:39 np0005625204.localdomain sudo[112875]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:40 np0005625204.localdomain sudo[113137]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfpstmnqyepwnhylkiiqaavwhrzklsas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578219.822592-110-197619337902758/AnsiballZ_systemd_service.py
Feb 20 09:03:40 np0005625204.localdomain sudo[113137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:40 np0005625204.localdomain python3.9[113139]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:03:40 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:03:40 np0005625204.localdomain systemd-rc-local-generator[113169]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:03:40 np0005625204.localdomain systemd-sysv-generator[113172]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:03:40 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:03:40 np0005625204.localdomain sudo[113137]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:41 np0005625204.localdomain sshd[113192]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:03:41 np0005625204.localdomain sshd[113192]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:03:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57143 DF PROTO=TCP SPT=34470 DPT=9105 SEQ=2681173294 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59841B680000000001030307) 
Feb 20 09:03:42 np0005625204.localdomain sudo[113269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cywtruybuutoqmbmpnvvjvkltihmajky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578221.7540731-560-118139527102611/AnsiballZ_file.py
Feb 20 09:03:42 np0005625204.localdomain sudo[113269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:42 np0005625204.localdomain python3.9[113271]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:42 np0005625204.localdomain sudo[113269]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:42 np0005625204.localdomain sudo[113361]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpqoawgrhykskwlvtajbzcxldallppdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578222.5111425-560-3057849777538/AnsiballZ_file.py
Feb 20 09:03:42 np0005625204.localdomain sudo[113361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:43 np0005625204.localdomain python3.9[113363]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:43 np0005625204.localdomain sudo[113361]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:43 np0005625204.localdomain sudo[113453]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrnbgjkwibnvncegicsimmymbxadvjab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578223.2601304-560-12468806686390/AnsiballZ_file.py
Feb 20 09:03:43 np0005625204.localdomain sudo[113453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:43 np0005625204.localdomain python3.9[113455]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:43 np0005625204.localdomain sudo[113453]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:44 np0005625204.localdomain sudo[113545]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukirqqabosijdjzhzmtupliwwozydmar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578223.8496-560-84228005813404/AnsiballZ_file.py
Feb 20 09:03:44 np0005625204.localdomain sudo[113545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:44 np0005625204.localdomain python3.9[113547]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:44 np0005625204.localdomain sudo[113545]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:44 np0005625204.localdomain sudo[113637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oupdvadzjjtrokogybzvtdhpombfetof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578224.4465125-560-175420457209037/AnsiballZ_file.py
Feb 20 09:03:44 np0005625204.localdomain sudo[113637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:44 np0005625204.localdomain python3.9[113639]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:44 np0005625204.localdomain sudo[113637]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:45 np0005625204.localdomain sudo[113729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htdrxujoysubgaxtwmmzhqdvidwxlpcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578225.058506-560-81051778133006/AnsiballZ_file.py
Feb 20 09:03:45 np0005625204.localdomain sudo[113729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:45 np0005625204.localdomain python3.9[113731]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:45 np0005625204.localdomain sudo[113729]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:45 np0005625204.localdomain sudo[113821]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkurzdfjqaxxekkmekutpuawmpavnbcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578225.6166062-560-126778965377795/AnsiballZ_file.py
Feb 20 09:03:45 np0005625204.localdomain sudo[113821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52426 DF PROTO=TCP SPT=42758 DPT=9102 SEQ=1735802678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59842B680000000001030307) 
Feb 20 09:03:46 np0005625204.localdomain python3.9[113823]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:46 np0005625204.localdomain sudo[113821]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:46 np0005625204.localdomain sudo[113913]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jiliuiwelzecstpesunmomjryksdzpaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578226.1718059-560-153991921906367/AnsiballZ_file.py
Feb 20 09:03:46 np0005625204.localdomain sudo[113913]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:46 np0005625204.localdomain python3.9[113915]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:46 np0005625204.localdomain sudo[113913]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:46 np0005625204.localdomain sudo[114005]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-refocatvntnuibyccddpdmbabthdrbzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578226.7533984-560-109015713861585/AnsiballZ_file.py
Feb 20 09:03:46 np0005625204.localdomain sudo[114005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:47 np0005625204.localdomain python3.9[114007]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:47 np0005625204.localdomain sudo[114005]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:47 np0005625204.localdomain sudo[114097]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtqrxmymslnsqqnqthuzcmwgzjhendql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578227.2647693-560-185828707647831/AnsiballZ_file.py
Feb 20 09:03:47 np0005625204.localdomain sudo[114097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15244 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598431FA0000000001030307) 
Feb 20 09:03:47 np0005625204.localdomain python3.9[114099]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:47 np0005625204.localdomain sudo[114097]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:48 np0005625204.localdomain sudo[114189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zogfejkyqvmvlkbamlnzxaqklpzfuwzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578227.8390121-560-60530792770452/AnsiballZ_file.py
Feb 20 09:03:48 np0005625204.localdomain sudo[114189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:48 np0005625204.localdomain python3.9[114191]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:48 np0005625204.localdomain sudo[114189]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:48 np0005625204.localdomain sudo[114281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ibuzlplqnmaiodcwhlaocohmerphbunx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578228.4240663-560-6295727150087/AnsiballZ_file.py
Feb 20 09:03:48 np0005625204.localdomain sudo[114281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:48 np0005625204.localdomain python3.9[114283]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:48 np0005625204.localdomain sudo[114281]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:49 np0005625204.localdomain sudo[114373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjinzkaqmwisfvyrzmxvbnnrmmppkcgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578229.0684083-560-269555717252957/AnsiballZ_file.py
Feb 20 09:03:49 np0005625204.localdomain sudo[114373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:49 np0005625204.localdomain python3.9[114375]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:49 np0005625204.localdomain sudo[114373]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:49 np0005625204.localdomain sudo[114465]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgxvvpdrboxfomxuxemvoiwqzbpyayaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578229.6548765-560-281473879533425/AnsiballZ_file.py
Feb 20 09:03:49 np0005625204.localdomain sudo[114465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:50 np0005625204.localdomain python3.9[114467]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:50 np0005625204.localdomain sudo[114465]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:50 np0005625204.localdomain sudo[114557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwuiqwuvhtpbkvzkjpqgzhjgcwhyseca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578230.246126-560-5509025510345/AnsiballZ_file.py
Feb 20 09:03:50 np0005625204.localdomain sudo[114557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:50 np0005625204.localdomain python3.9[114559]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:50 np0005625204.localdomain sudo[114557]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15246 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59843DE90000000001030307) 
Feb 20 09:03:51 np0005625204.localdomain sudo[114649]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cticfglcdyfbjlauoubwmokhvsjbqlog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578230.8043144-560-41039137282938/AnsiballZ_file.py
Feb 20 09:03:51 np0005625204.localdomain sudo[114649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:51 np0005625204.localdomain python3.9[114651]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:51 np0005625204.localdomain sudo[114649]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:51 np0005625204.localdomain sudo[114741]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxkwbsfjpcxogtounzaukdcnolmwpihs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578231.3823948-560-236843618803208/AnsiballZ_file.py
Feb 20 09:03:51 np0005625204.localdomain sudo[114741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:51 np0005625204.localdomain python3.9[114743]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:51 np0005625204.localdomain sudo[114741]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:52 np0005625204.localdomain sudo[114833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyvvpgybelkxaopcmzaujnxihyiiasix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578232.011096-560-111638125058967/AnsiballZ_file.py
Feb 20 09:03:52 np0005625204.localdomain sudo[114833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:52 np0005625204.localdomain python3.9[114835]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:52 np0005625204.localdomain sudo[114833]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:52 np0005625204.localdomain sudo[114925]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nickkgbycxtjilxilypnifcrvmyksgti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578232.5564609-560-246564996723877/AnsiballZ_file.py
Feb 20 09:03:52 np0005625204.localdomain sudo[114925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:53 np0005625204.localdomain python3.9[114927]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:53 np0005625204.localdomain sudo[114925]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:53 np0005625204.localdomain sudo[115017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyoawommgsmnughxvzmzmerraxnubloe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578233.1382134-560-3213230614844/AnsiballZ_file.py
Feb 20 09:03:53 np0005625204.localdomain sudo[115017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:53 np0005625204.localdomain python3.9[115019]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:53 np0005625204.localdomain sudo[115017]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:53 np0005625204.localdomain sudo[115109]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jflqiqbctrbctwlgluqujjbqbtjewpck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578233.7533445-560-242830284772422/AnsiballZ_file.py
Feb 20 09:03:53 np0005625204.localdomain sudo[115109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:54 np0005625204.localdomain python3.9[115111]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:54 np0005625204.localdomain sudo[115109]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:54 np0005625204.localdomain sudo[115201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-niaqjkjpovhgszjzxlqctlovshzrlzog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578234.4073439-1011-226988834616899/AnsiballZ_file.py
Feb 20 09:03:54 np0005625204.localdomain auditd[725]: Audit daemon rotating log files
Feb 20 09:03:54 np0005625204.localdomain sudo[115201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15247 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59844DA90000000001030307) 
Feb 20 09:03:54 np0005625204.localdomain python3.9[115203]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:54 np0005625204.localdomain sudo[115201]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:55 np0005625204.localdomain sudo[115293]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cayblupbvgqwbanllnmtsbrotinyfawp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578234.9832077-1011-219120995833301/AnsiballZ_file.py
Feb 20 09:03:55 np0005625204.localdomain sudo[115293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:55 np0005625204.localdomain python3.9[115295]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:55 np0005625204.localdomain sudo[115293]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:55 np0005625204.localdomain sudo[115385]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xsvtdeivitgpnopruoffnergihenajab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578235.5872788-1011-245961785205283/AnsiballZ_file.py
Feb 20 09:03:55 np0005625204.localdomain sudo[115385]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:56 np0005625204.localdomain python3.9[115387]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:56 np0005625204.localdomain sudo[115385]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:56 np0005625204.localdomain sudo[115477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzfzxlydxzsgcpjzkaylpdysqvmsdqkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578236.194913-1011-145389176871516/AnsiballZ_file.py
Feb 20 09:03:56 np0005625204.localdomain sudo[115477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:56 np0005625204.localdomain python3.9[115479]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:56 np0005625204.localdomain sudo[115477]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:57 np0005625204.localdomain sudo[115569]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-whdawmzbcweetkvcpllidegjsyxisasb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578236.8265803-1011-183625842518595/AnsiballZ_file.py
Feb 20 09:03:57 np0005625204.localdomain sudo[115569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:57 np0005625204.localdomain python3.9[115571]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:57 np0005625204.localdomain sudo[115569]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33357 DF PROTO=TCP SPT=41010 DPT=9105 SEQ=1801412553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598458A90000000001030307) 
Feb 20 09:03:57 np0005625204.localdomain sudo[115661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyqmheefqgqvjijymgiloboshmlefabo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578237.4036446-1011-46896804739060/AnsiballZ_file.py
Feb 20 09:03:57 np0005625204.localdomain sudo[115661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:57 np0005625204.localdomain python3.9[115663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:57 np0005625204.localdomain sudo[115661]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:57 np0005625204.localdomain sudo[115664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:03:57 np0005625204.localdomain sudo[115664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:03:57 np0005625204.localdomain sudo[115664]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:57 np0005625204.localdomain sudo[115693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:03:57 np0005625204.localdomain sudo[115693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:03:58 np0005625204.localdomain sudo[115783]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvsjuxjdmelkumfwvzzqafobhxcikoik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578237.9774964-1011-80671165375437/AnsiballZ_file.py
Feb 20 09:03:58 np0005625204.localdomain sudo[115783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:58 np0005625204.localdomain python3.9[115785]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:58 np0005625204.localdomain sudo[115783]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:58 np0005625204.localdomain systemd[1]: tmp-crun.DPTcTv.mount: Deactivated successfully.
Feb 20 09:03:58 np0005625204.localdomain podman[115906]: 2026-02-20 09:03:58.764173752 +0000 UTC m=+0.103862472 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Feb 20 09:03:58 np0005625204.localdomain sudo[115968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amfsskwckorzxzhjxifgtvjqyrahfhlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578238.5918152-1011-245165882296293/AnsiballZ_file.py
Feb 20 09:03:58 np0005625204.localdomain sudo[115968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:58 np0005625204.localdomain podman[115906]: 2026-02-20 09:03:58.865045801 +0000 UTC m=+0.204734511 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, RELEASE=main, distribution-scope=public, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, release=1770267347, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:03:59 np0005625204.localdomain python3.9[115970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:59 np0005625204.localdomain sudo[115968]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:59 np0005625204.localdomain sudo[115693]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:59 np0005625204.localdomain sudo[116035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:03:59 np0005625204.localdomain sudo[116035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:03:59 np0005625204.localdomain sudo[116035]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:59 np0005625204.localdomain sudo[116067]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:03:59 np0005625204.localdomain sudo[116067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:03:59 np0005625204.localdomain sudo[116140]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-henjczkaizdfcydbvqyolppxkvskmtcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578239.198186-1011-144703791975175/AnsiballZ_file.py
Feb 20 09:03:59 np0005625204.localdomain sudo[116140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:03:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33358 DF PROTO=TCP SPT=41010 DPT=9105 SEQ=1801412553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598460A90000000001030307) 
Feb 20 09:03:59 np0005625204.localdomain python3.9[116142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:03:59 np0005625204.localdomain sudo[116140]: pam_unix(sudo:session): session closed for user root
Feb 20 09:03:59 np0005625204.localdomain sudo[116067]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:00 np0005625204.localdomain sudo[116264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tetxbldiqmqpnnlywcbzrnjwoaqralww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578239.7949283-1011-139135379237885/AnsiballZ_file.py
Feb 20 09:04:00 np0005625204.localdomain sudo[116264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:00 np0005625204.localdomain python3.9[116266]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:00 np0005625204.localdomain sudo[116264]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:00 np0005625204.localdomain sudo[116326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:04:00 np0005625204.localdomain sudo[116326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:04:00 np0005625204.localdomain sudo[116326]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:00 np0005625204.localdomain sudo[116371]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skfurgkiyljvhhnxknzpveceejupkspy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578240.346727-1011-7252797634176/AnsiballZ_file.py
Feb 20 09:04:00 np0005625204.localdomain sudo[116371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:00 np0005625204.localdomain python3.9[116373]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:00 np0005625204.localdomain sudo[116371]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:01 np0005625204.localdomain sudo[116463]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbshkxzlcwcjxankcvkbhhizfhebqego ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578240.8888555-1011-280924619751097/AnsiballZ_file.py
Feb 20 09:04:01 np0005625204.localdomain sudo[116463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:01 np0005625204.localdomain python3.9[116465]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:01 np0005625204.localdomain sudo[116463]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:01 np0005625204.localdomain sudo[116555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqjeahhpsboqbbswtdmreiofbvzevype ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578241.5508966-1011-206150692854727/AnsiballZ_file.py
Feb 20 09:04:01 np0005625204.localdomain sudo[116555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:01 np0005625204.localdomain python3.9[116557]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:02 np0005625204.localdomain sudo[116555]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:02 np0005625204.localdomain sudo[116647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krbkntsiimwcuicakscvgiwkrdziphzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578242.1059546-1011-138450551991864/AnsiballZ_file.py
Feb 20 09:04:02 np0005625204.localdomain sudo[116647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:02 np0005625204.localdomain python3.9[116649]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:02 np0005625204.localdomain sudo[116647]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15248 DF PROTO=TCP SPT=48172 DPT=9882 SEQ=1798228547 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59846D680000000001030307) 
Feb 20 09:04:02 np0005625204.localdomain sudo[116739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dadxglmcauduqqevndlrdqgnwiqbvwqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578242.6865478-1011-172076354134755/AnsiballZ_file.py
Feb 20 09:04:02 np0005625204.localdomain sudo[116739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:03 np0005625204.localdomain python3.9[116741]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:03 np0005625204.localdomain sudo[116739]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:03 np0005625204.localdomain sudo[116831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-febbrlyxikuvfvegxvbgtllhlfwboxqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578243.3171756-1011-206634905003460/AnsiballZ_file.py
Feb 20 09:04:03 np0005625204.localdomain sudo[116831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:03 np0005625204.localdomain python3.9[116833]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:03 np0005625204.localdomain sudo[116831]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:04 np0005625204.localdomain sudo[116923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imhtignjuvfaocqbvtamaqdraupnuggw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578243.9059162-1011-228773034928145/AnsiballZ_file.py
Feb 20 09:04:04 np0005625204.localdomain sudo[116923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:04 np0005625204.localdomain python3.9[116925]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:04 np0005625204.localdomain sudo[116923]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:04 np0005625204.localdomain sudo[117015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwkzpwsncvdovxwvcyqyylqvgnsblefs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578244.4915175-1011-122752636719841/AnsiballZ_file.py
Feb 20 09:04:04 np0005625204.localdomain sudo[117015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:04 np0005625204.localdomain python3.9[117017]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:04 np0005625204.localdomain sudo[117015]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:05 np0005625204.localdomain sudo[117107]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgrutqarmahvizvfktklhsgkqmtkrfmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578245.0797908-1011-227429251231923/AnsiballZ_file.py
Feb 20 09:04:05 np0005625204.localdomain sudo[117107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:05 np0005625204.localdomain python3.9[117109]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:05 np0005625204.localdomain sudo[117107]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:05 np0005625204.localdomain sudo[117199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxkoouvsvogguzimflrfcvfenrsxtaoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578245.5682786-1011-136305081053120/AnsiballZ_file.py
Feb 20 09:04:05 np0005625204.localdomain sudo[117199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:06 np0005625204.localdomain python3.9[117201]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:06 np0005625204.localdomain sudo[117199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13768 DF PROTO=TCP SPT=51760 DPT=9101 SEQ=2817866390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59847AA50000000001030307) 
Feb 20 09:04:06 np0005625204.localdomain sudo[117291]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbzpcquecurtoqaqphazxespioptztdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578246.1062481-1011-227927178404382/AnsiballZ_file.py
Feb 20 09:04:06 np0005625204.localdomain sudo[117291]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:06 np0005625204.localdomain python3.9[117293]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:06 np0005625204.localdomain sudo[117291]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:07 np0005625204.localdomain sudo[117383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaaoaweaitmbshiecklddnqfyndmkpui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578246.8469286-1458-151289604026946/AnsiballZ_command.py
Feb 20 09:04:07 np0005625204.localdomain sudo[117383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:07 np0005625204.localdomain python3.9[117385]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:07 np0005625204.localdomain sudo[117383]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:08 np0005625204.localdomain python3.9[117477]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:04:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13770 DF PROTO=TCP SPT=51760 DPT=9101 SEQ=2817866390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598486A90000000001030307) 
Feb 20 09:04:10 np0005625204.localdomain sudo[117567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoyebqlusygkxqgfoagkvpnjigdbcvxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578248.921052-1511-102785527310403/AnsiballZ_systemd_service.py
Feb 20 09:04:10 np0005625204.localdomain sudo[117567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:10 np0005625204.localdomain python3.9[117569]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:04:10 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:04:10 np0005625204.localdomain systemd-rc-local-generator[117596]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:04:10 np0005625204.localdomain systemd-sysv-generator[117599]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:04:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:04:11 np0005625204.localdomain sudo[117567]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:11 np0005625204.localdomain sudo[117694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnnzhbfjyidbooomagfrsetbwiwevqtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578251.3592634-1536-235793025278486/AnsiballZ_command.py
Feb 20 09:04:11 np0005625204.localdomain sudo[117694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:11 np0005625204.localdomain python3.9[117696]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:11 np0005625204.localdomain sudo[117694]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33360 DF PROTO=TCP SPT=41010 DPT=9105 SEQ=1801412553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598491680000000001030307) 
Feb 20 09:04:12 np0005625204.localdomain sudo[117787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjeojnvufiriezzhyxnbdlgaezquwzbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578251.9104693-1536-48265207893892/AnsiballZ_command.py
Feb 20 09:04:12 np0005625204.localdomain sudo[117787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:12 np0005625204.localdomain python3.9[117789]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:12 np0005625204.localdomain sudo[117787]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:12 np0005625204.localdomain sudo[117880]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlbdmsqwflsyzneymcczgqwequaelstt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578252.5231388-1536-196581065110789/AnsiballZ_command.py
Feb 20 09:04:12 np0005625204.localdomain sudo[117880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:12 np0005625204.localdomain python3.9[117882]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:12 np0005625204.localdomain sudo[117880]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:13 np0005625204.localdomain sudo[117973]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efoffmddodoaxgdgexymbqmqpyzblbha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578253.0600805-1536-166827669447164/AnsiballZ_command.py
Feb 20 09:04:13 np0005625204.localdomain sudo[117973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:13 np0005625204.localdomain python3.9[117975]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:13 np0005625204.localdomain sudo[117973]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:13 np0005625204.localdomain sudo[118066]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayaotjuyruhaatgpzrmjizwqwnksjfil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578253.6654906-1536-48528403204275/AnsiballZ_command.py
Feb 20 09:04:13 np0005625204.localdomain sudo[118066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:14 np0005625204.localdomain python3.9[118068]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:14 np0005625204.localdomain sudo[118066]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:14 np0005625204.localdomain sudo[118159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnpbxfksiosnuvgfqnjxezvllnvjakzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578254.2729838-1536-215047602374132/AnsiballZ_command.py
Feb 20 09:04:14 np0005625204.localdomain sudo[118159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:14 np0005625204.localdomain python3.9[118161]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:14 np0005625204.localdomain sudo[118159]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:15 np0005625204.localdomain sudo[118252]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtmxsyjbgjdbxmnvamdejneeiikmefui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578254.9126644-1536-134675440794339/AnsiballZ_command.py
Feb 20 09:04:15 np0005625204.localdomain sudo[118252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:15 np0005625204.localdomain python3.9[118254]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:15 np0005625204.localdomain sudo[118252]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:15 np0005625204.localdomain sudo[118345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaahccsidnqdmgqgqaxifcaymjwohczn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578255.6087837-1536-15953835688798/AnsiballZ_command.py
Feb 20 09:04:15 np0005625204.localdomain sudo[118345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:16 np0005625204.localdomain python3.9[118347]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:16 np0005625204.localdomain sudo[118345]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16550 DF PROTO=TCP SPT=45740 DPT=9100 SEQ=2847017574 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984A1680000000001030307) 
Feb 20 09:04:16 np0005625204.localdomain sudo[118438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlrqiegsftxvrpuldzqofmnowrkzfmad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578256.1935956-1536-226005578595873/AnsiballZ_command.py
Feb 20 09:04:16 np0005625204.localdomain sudo[118438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:16 np0005625204.localdomain python3.9[118440]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:16 np0005625204.localdomain sudo[118438]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:17 np0005625204.localdomain sudo[118531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtskcipussuyotdyzctjksskrqdepuou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578256.8205657-1536-79639757006145/AnsiballZ_command.py
Feb 20 09:04:17 np0005625204.localdomain sudo[118531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:17 np0005625204.localdomain python3.9[118533]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:17 np0005625204.localdomain sudo[118531]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:17 np0005625204.localdomain sudo[118624]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upblhgsstmnqomllmkzqwlncitdsiqri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578257.41041-1536-71274877329843/AnsiballZ_command.py
Feb 20 09:04:17 np0005625204.localdomain sudo[118624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49483 DF PROTO=TCP SPT=46936 DPT=9882 SEQ=3980037504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984A72A0000000001030307) 
Feb 20 09:04:17 np0005625204.localdomain python3.9[118626]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:17 np0005625204.localdomain sudo[118624]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:18 np0005625204.localdomain sudo[118717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpxbblmctwaocwnsfhnpfemiyixvnfwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578257.9648852-1536-27729671729622/AnsiballZ_command.py
Feb 20 09:04:18 np0005625204.localdomain sudo[118717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:18 np0005625204.localdomain python3.9[118719]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:18 np0005625204.localdomain sudo[118717]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:18 np0005625204.localdomain sudo[118810]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-efaagdpuxgcbsonnceogsgdxqwdyszob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578258.537239-1536-111516451886304/AnsiballZ_command.py
Feb 20 09:04:18 np0005625204.localdomain sudo[118810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:19 np0005625204.localdomain python3.9[118812]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:19 np0005625204.localdomain sudo[118810]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:19 np0005625204.localdomain sudo[118903]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iznbqglymtuqgxcnmfjjcmherocwasdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578259.251593-1536-108464290688446/AnsiballZ_command.py
Feb 20 09:04:19 np0005625204.localdomain sudo[118903]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:19 np0005625204.localdomain python3.9[118905]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:19 np0005625204.localdomain sudo[118903]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:20 np0005625204.localdomain sudo[118996]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqhttmkdbszfakxjadrzsxugwytyqazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578259.9862747-1536-155467524600382/AnsiballZ_command.py
Feb 20 09:04:20 np0005625204.localdomain sudo[118996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:20 np0005625204.localdomain python3.9[118998]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:20 np0005625204.localdomain sudo[118996]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49485 DF PROTO=TCP SPT=46936 DPT=9882 SEQ=3980037504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984B3290000000001030307) 
Feb 20 09:04:20 np0005625204.localdomain sudo[119089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-latgowexlcyxqwbxduhaghwnpxncbcgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578260.6087933-1536-167528895808960/AnsiballZ_command.py
Feb 20 09:04:20 np0005625204.localdomain sudo[119089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:21 np0005625204.localdomain python3.9[119091]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:21 np0005625204.localdomain sudo[119089]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:21 np0005625204.localdomain sudo[119182]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcuzqbgfolwaitomstoioxeudszqrrjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578261.2024248-1536-1542706958732/AnsiballZ_command.py
Feb 20 09:04:21 np0005625204.localdomain sudo[119182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:21 np0005625204.localdomain python3.9[119184]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:21 np0005625204.localdomain sudo[119182]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:22 np0005625204.localdomain sudo[119275]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egaoallhfoyqiqpoapzfpkocxvrqfrmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578261.8459225-1536-94747372009863/AnsiballZ_command.py
Feb 20 09:04:22 np0005625204.localdomain sudo[119275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:22 np0005625204.localdomain python3.9[119277]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:22 np0005625204.localdomain sudo[119275]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:22 np0005625204.localdomain sudo[119368]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptnooqvzpdmdctuwrbvqstdaeylpymki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578262.604936-1536-4443314320310/AnsiballZ_command.py
Feb 20 09:04:22 np0005625204.localdomain sudo[119368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:23 np0005625204.localdomain python3.9[119370]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:24 np0005625204.localdomain sudo[119368]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:24 np0005625204.localdomain sudo[119461]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqipubhghapichontvuxxkdrguoinybf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578264.2374034-1536-236895651623780/AnsiballZ_command.py
Feb 20 09:04:24 np0005625204.localdomain sudo[119461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:24 np0005625204.localdomain python3.9[119463]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:24 np0005625204.localdomain sudo[119461]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49486 DF PROTO=TCP SPT=46936 DPT=9882 SEQ=3980037504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984C2E80000000001030307) 
Feb 20 09:04:25 np0005625204.localdomain sudo[119554]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxjgnwmmqailrygwsvspadlqpptgueop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578264.803367-1536-64168433590241/AnsiballZ_command.py
Feb 20 09:04:25 np0005625204.localdomain sudo[119554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:25 np0005625204.localdomain python3.9[119556]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:25 np0005625204.localdomain sudo[119554]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:26 np0005625204.localdomain sshd[107563]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:04:26 np0005625204.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Feb 20 09:04:26 np0005625204.localdomain systemd[1]: session-38.scope: Consumed 48.546s CPU time.
Feb 20 09:04:26 np0005625204.localdomain systemd-logind[759]: Session 38 logged out. Waiting for processes to exit.
Feb 20 09:04:26 np0005625204.localdomain systemd-logind[759]: Removed session 38.
Feb 20 09:04:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61815 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=4136402762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984CDE80000000001030307) 
Feb 20 09:04:28 np0005625204.localdomain sshd[119572]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:04:28 np0005625204.localdomain sshd[119572]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:04:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61816 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=4136402762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984D5E90000000001030307) 
Feb 20 09:04:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24810 DF PROTO=TCP SPT=44300 DPT=9102 SEQ=980417637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984E1680000000001030307) 
Feb 20 09:04:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59185 DF PROTO=TCP SPT=37732 DPT=9101 SEQ=15335578 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984EFD50000000001030307) 
Feb 20 09:04:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59187 DF PROTO=TCP SPT=37732 DPT=9101 SEQ=15335578 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5984FBE90000000001030307) 
Feb 20 09:04:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61818 DF PROTO=TCP SPT=53196 DPT=9105 SEQ=4136402762 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598505680000000001030307) 
Feb 20 09:04:44 np0005625204.localdomain sshd[119574]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:04:44 np0005625204.localdomain sshd[119574]: Accepted publickey for zuul from 192.168.122.30 port 45988 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:04:44 np0005625204.localdomain systemd-logind[759]: New session 39 of user zuul.
Feb 20 09:04:44 np0005625204.localdomain systemd[1]: Started Session 39 of User zuul.
Feb 20 09:04:44 np0005625204.localdomain sshd[119574]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:04:45 np0005625204.localdomain python3.9[119667]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 20 09:04:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49466 DF PROTO=TCP SPT=35674 DPT=9100 SEQ=1302452893 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598515680000000001030307) 
Feb 20 09:04:46 np0005625204.localdomain python3.9[119771]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:04:47 np0005625204.localdomain sudo[119861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypwatlnwbyypaodrkassrpgacysxtisc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578286.6239538-90-186306922692304/AnsiballZ_command.py
Feb 20 09:04:47 np0005625204.localdomain sudo[119861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:47 np0005625204.localdomain python3.9[119863]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:47 np0005625204.localdomain sudo[119861]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27068 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59851C5A0000000001030307) 
Feb 20 09:04:48 np0005625204.localdomain sudo[119954]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyjgtlyhepljhjaggtlpyhehijtdknea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578287.6088452-127-206237528467637/AnsiballZ_stat.py
Feb 20 09:04:48 np0005625204.localdomain sudo[119954]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:48 np0005625204.localdomain python3.9[119956]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:04:48 np0005625204.localdomain sudo[119954]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:48 np0005625204.localdomain sudo[120046]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpkciksyqqrkpwbbqqmbpyvusrdqngyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578288.38725-150-222948398156274/AnsiballZ_file.py
Feb 20 09:04:48 np0005625204.localdomain sudo[120046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:49 np0005625204.localdomain python3.9[120048]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:49 np0005625204.localdomain sudo[120046]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:49 np0005625204.localdomain sudo[120138]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltecsjczkjvtnmznqzdmgnfeaiifjggt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578289.2424505-175-50825165301189/AnsiballZ_stat.py
Feb 20 09:04:49 np0005625204.localdomain sudo[120138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:49 np0005625204.localdomain python3.9[120140]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:04:49 np0005625204.localdomain sudo[120138]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:50 np0005625204.localdomain sudo[120211]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paquhxfvurazkxybeucamhlyiqwxgnrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578289.2424505-175-50825165301189/AnsiballZ_copy.py
Feb 20 09:04:50 np0005625204.localdomain sudo[120211]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:50 np0005625204.localdomain python3.9[120213]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578289.2424505-175-50825165301189/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:50 np0005625204.localdomain sudo[120211]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27070 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598528680000000001030307) 
Feb 20 09:04:50 np0005625204.localdomain sudo[120303]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrilnjsjzwdbwuexyiarvycbmbddflkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578290.6891563-219-269220055040833/AnsiballZ_setup.py
Feb 20 09:04:50 np0005625204.localdomain sudo[120303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:51 np0005625204.localdomain python3.9[120305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:04:51 np0005625204.localdomain sudo[120303]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:52 np0005625204.localdomain sudo[120399]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqpywbwxjahdikkztkuigogdvpuspdrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578291.7594318-244-114425313293637/AnsiballZ_file.py
Feb 20 09:04:52 np0005625204.localdomain sudo[120399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:52 np0005625204.localdomain python3.9[120401]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:04:52 np0005625204.localdomain sudo[120399]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:52 np0005625204.localdomain sudo[120491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puvrcdjpmcipqfdbpnbgmreqdfspsyww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578292.5082839-270-255609172482431/AnsiballZ_file.py
Feb 20 09:04:52 np0005625204.localdomain sudo[120491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:53 np0005625204.localdomain python3.9[120493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:04:53 np0005625204.localdomain sudo[120491]: pam_unix(sudo:session): session closed for user root
Feb 20 09:04:53 np0005625204.localdomain python3.9[120583]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:04:53 np0005625204.localdomain network[120600]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:04:53 np0005625204.localdomain network[120601]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:04:53 np0005625204.localdomain network[120602]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:04:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27071 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598538280000000001030307) 
Feb 20 09:04:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:04:56 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42826 DF PROTO=TCP SPT=49320 DPT=9105 SEQ=1002855727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59853F070000000001030307) 
Feb 20 09:04:57 np0005625204.localdomain python3.9[120799]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:04:58 np0005625204.localdomain python3.9[120889]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:04:58 np0005625204.localdomain sshd[120894]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:04:59 np0005625204.localdomain sudo[120985]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxngffjivcvnarxwxwebkockypmzafvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578299.0628498-373-264452269568849/AnsiballZ_command.py
Feb 20 09:04:59 np0005625204.localdomain sudo[120985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:04:59 np0005625204.localdomain python3.9[120987]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:04:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42828 DF PROTO=TCP SPT=49320 DPT=9105 SEQ=1002855727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59854B280000000001030307) 
Feb 20 09:05:00 np0005625204.localdomain sshd[120894]: Invalid user n8n from 27.112.79.3 port 44014
Feb 20 09:05:00 np0005625204.localdomain sshd[120894]: Received disconnect from 27.112.79.3 port 44014:11: Bye Bye [preauth]
Feb 20 09:05:00 np0005625204.localdomain sshd[120894]: Disconnected from invalid user n8n 27.112.79.3 port 44014 [preauth]
Feb 20 09:05:00 np0005625204.localdomain sudo[120998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:05:00 np0005625204.localdomain sudo[120998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:05:00 np0005625204.localdomain sudo[120998]: pam_unix(sudo:session): session closed for user root
Feb 20 09:05:00 np0005625204.localdomain sudo[121013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:05:00 np0005625204.localdomain sudo[121013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:05:01 np0005625204.localdomain sudo[121013]: pam_unix(sudo:session): session closed for user root
Feb 20 09:05:02 np0005625204.localdomain sudo[121062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:05:02 np0005625204.localdomain sudo[121062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:05:02 np0005625204.localdomain sudo[121062]: pam_unix(sudo:session): session closed for user root
Feb 20 09:05:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27072 DF PROTO=TCP SPT=55792 DPT=9882 SEQ=3018468552 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598559680000000001030307) 
Feb 20 09:05:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62550 DF PROTO=TCP SPT=43976 DPT=9101 SEQ=160469416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598565050000000001030307) 
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 20 09:05:09 np0005625204.localdomain sshd[46205]: Received signal 15; terminating.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: sshd.service: Consumed 27.099s CPU time.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 20 09:05:09 np0005625204.localdomain sshd[121107]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:09 np0005625204.localdomain sshd[121107]: Server listening on 0.0.0.0 port 22.
Feb 20 09:05:09 np0005625204.localdomain sshd[121107]: Server listening on :: port 22.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:05:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62552 DF PROTO=TCP SPT=43976 DPT=9101 SEQ=160469416 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598571280000000001030307) 
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: run-rbbb8798f96e34674bbaf104827d20089.service: Deactivated successfully.
Feb 20 09:05:09 np0005625204.localdomain systemd[1]: run-r5986b363c93241878ab93cc953588cd2.service: Deactivated successfully.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 20 09:05:10 np0005625204.localdomain sshd[121107]: Received signal 15; terminating.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 20 09:05:10 np0005625204.localdomain sshd[121278]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:10 np0005625204.localdomain sshd[121278]: Server listening on 0.0.0.0 port 22.
Feb 20 09:05:10 np0005625204.localdomain sshd[121278]: Server listening on :: port 22.
Feb 20 09:05:10 np0005625204.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 20 09:05:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42830 DF PROTO=TCP SPT=49320 DPT=9105 SEQ=1002855727 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59857B680000000001030307) 
Feb 20 09:05:14 np0005625204.localdomain sshd[121284]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:14 np0005625204.localdomain sshd[121284]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:05:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55633 DF PROTO=TCP SPT=42926 DPT=9100 SEQ=775551466 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59858B680000000001030307) 
Feb 20 09:05:16 np0005625204.localdomain sshd[121287]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:16 np0005625204.localdomain sshd[121287]: Invalid user n8n from 96.78.175.36 port 38354
Feb 20 09:05:17 np0005625204.localdomain sshd[121287]: Received disconnect from 96.78.175.36 port 38354:11: Bye Bye [preauth]
Feb 20 09:05:17 np0005625204.localdomain sshd[121287]: Disconnected from invalid user n8n 96.78.175.36 port 38354 [preauth]
Feb 20 09:05:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1739 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985918A0000000001030307) 
Feb 20 09:05:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1741 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59859DA80000000001030307) 
Feb 20 09:05:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1742 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985AD680000000001030307) 
Feb 20 09:05:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48393 DF PROTO=TCP SPT=53078 DPT=9105 SEQ=2937509130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985B8280000000001030307) 
Feb 20 09:05:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48394 DF PROTO=TCP SPT=53078 DPT=9105 SEQ=2937509130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985C0290000000001030307) 
Feb 20 09:05:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1743 DF PROTO=TCP SPT=58590 DPT=9882 SEQ=3775899828 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985CD680000000001030307) 
Feb 20 09:05:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56670 DF PROTO=TCP SPT=44272 DPT=9101 SEQ=1716440593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985DA340000000001030307) 
Feb 20 09:05:36 np0005625204.localdomain sshd[121408]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:37 np0005625204.localdomain sshd[121408]: Invalid user solana from 45.148.10.240 port 47698
Feb 20 09:05:37 np0005625204.localdomain sshd[121408]: Connection closed by invalid user solana 45.148.10.240 port 47698 [preauth]
Feb 20 09:05:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56672 DF PROTO=TCP SPT=44272 DPT=9101 SEQ=1716440593 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985E6280000000001030307) 
Feb 20 09:05:41 np0005625204.localdomain sshd[121423]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:41 np0005625204.localdomain sshd[121423]: Invalid user proxyuser from 54.36.99.29 port 58538
Feb 20 09:05:41 np0005625204.localdomain sshd[121423]: Received disconnect from 54.36.99.29 port 58538:11: Bye Bye [preauth]
Feb 20 09:05:41 np0005625204.localdomain sshd[121423]: Disconnected from invalid user proxyuser 54.36.99.29 port 58538 [preauth]
Feb 20 09:05:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48396 DF PROTO=TCP SPT=53078 DPT=9105 SEQ=2937509130 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985EF680000000001030307) 
Feb 20 09:05:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10518 DF PROTO=TCP SPT=57060 DPT=9102 SEQ=1822393694 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5985FF680000000001030307) 
Feb 20 09:05:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25107 DF PROTO=TCP SPT=42584 DPT=9882 SEQ=4293515916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598606BA0000000001030307) 
Feb 20 09:05:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25109 DF PROTO=TCP SPT=42584 DPT=9882 SEQ=4293515916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598612A90000000001030307) 
Feb 20 09:05:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25110 DF PROTO=TCP SPT=42584 DPT=9882 SEQ=4293515916 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598622680000000001030307) 
Feb 20 09:05:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20179 DF PROTO=TCP SPT=34736 DPT=9105 SEQ=1894680188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59862D690000000001030307) 
Feb 20 09:05:59 np0005625204.localdomain sshd[121597]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:05:59 np0005625204.localdomain sshd[121597]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:05:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20180 DF PROTO=TCP SPT=34736 DPT=9105 SEQ=1894680188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598635690000000001030307) 
Feb 20 09:06:02 np0005625204.localdomain sudo[121621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:06:02 np0005625204.localdomain sudo[121621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:06:02 np0005625204.localdomain sudo[121621]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:02 np0005625204.localdomain sudo[121636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:06:02 np0005625204.localdomain sudo[121636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:06:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20390 DF PROTO=TCP SPT=57532 DPT=9100 SEQ=1464944850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598641690000000001030307) 
Feb 20 09:06:02 np0005625204.localdomain sudo[121636]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:03 np0005625204.localdomain sudo[121688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:06:03 np0005625204.localdomain sudo[121688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:06:03 np0005625204.localdomain sudo[121688]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47571 DF PROTO=TCP SPT=58468 DPT=9101 SEQ=1646320255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59864F640000000001030307) 
Feb 20 09:06:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47573 DF PROTO=TCP SPT=58468 DPT=9101 SEQ=1646320255 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59865B680000000001030307) 
Feb 20 09:06:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20182 DF PROTO=TCP SPT=34736 DPT=9105 SEQ=1894680188 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598665680000000001030307) 
Feb 20 09:06:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64594 DF PROTO=TCP SPT=48798 DPT=9100 SEQ=1833037813 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598675680000000001030307) 
Feb 20 09:06:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=774 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59867BEA0000000001030307) 
Feb 20 09:06:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=776 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598687E90000000001030307) 
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  Converting 2754 SID table entries...
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:06:21 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:06:23 np0005625204.localdomain sudo[120985]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:23 np0005625204.localdomain sudo[121933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfsjutufcfqmukxzwnwlvfxhnvsipkbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578383.2917643-400-8522037188734/AnsiballZ_file.py
Feb 20 09:06:23 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Feb 20 09:06:23 np0005625204.localdomain sudo[121933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:23 np0005625204.localdomain python3.9[121935]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:06:23 np0005625204.localdomain sudo[121933]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:24 np0005625204.localdomain sudo[122025]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itnjuiclsmvfvtakbjnfnvuqhynjvpjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578383.9552515-423-137880703546724/AnsiballZ_stat.py
Feb 20 09:06:24 np0005625204.localdomain sudo[122025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:24 np0005625204.localdomain python3.9[122027]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:06:24 np0005625204.localdomain sudo[122025]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:24 np0005625204.localdomain sudo[122098]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtdztaekcqsvnuvjsenhjvhayibvkwpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578383.9552515-423-137880703546724/AnsiballZ_copy.py
Feb 20 09:06:24 np0005625204.localdomain sudo[122098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=777 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598697A80000000001030307) 
Feb 20 09:06:24 np0005625204.localdomain python3.9[122100]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578383.9552515-423-137880703546724/.source.fact _original_basename=.ss4t8t6i follow=False checksum=d686dccd4d8cd0883f3e3bc0a6f664c73290ba68 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:06:24 np0005625204.localdomain sudo[122098]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:25 np0005625204.localdomain python3.9[122190]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:06:26 np0005625204.localdomain sudo[122286]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icsqmdsjjiukcfthhyapbxxjvawzmlce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578386.354304-499-161329765234660/AnsiballZ_setup.py
Feb 20 09:06:26 np0005625204.localdomain sudo[122286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:26 np0005625204.localdomain python3.9[122288]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:06:27 np0005625204.localdomain sudo[122286]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:27 np0005625204.localdomain sudo[122340]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wswaaypvntmvnpydigltmuxiruvvogue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578386.354304-499-161329765234660/AnsiballZ_dnf.py
Feb 20 09:06:27 np0005625204.localdomain sudo[122340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57769 DF PROTO=TCP SPT=46492 DPT=9105 SEQ=999787407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986A2A80000000001030307) 
Feb 20 09:06:27 np0005625204.localdomain python3.9[122342]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:06:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57770 DF PROTO=TCP SPT=46492 DPT=9105 SEQ=999787407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986AAA80000000001030307) 
Feb 20 09:06:31 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:06:31 np0005625204.localdomain systemd-rc-local-generator[122379]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:06:31 np0005625204.localdomain systemd-sysv-generator[122382]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:06:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:06:31 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 09:06:32 np0005625204.localdomain sudo[122340]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=778 DF PROTO=TCP SPT=33430 DPT=9882 SEQ=4245260675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986B7680000000001030307) 
Feb 20 09:06:33 np0005625204.localdomain sudo[122481]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edocdzggbodtuttoiuigofafbhsqexcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578392.8428044-535-32674419654040/AnsiballZ_command.py
Feb 20 09:06:33 np0005625204.localdomain sudo[122481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:33 np0005625204.localdomain python3.9[122483]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:06:34 np0005625204.localdomain sudo[122481]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:34 np0005625204.localdomain sudo[122720]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oljwkviizuaihvqlwgmyztkdzvfxrubq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578394.3050644-558-32198901914559/AnsiballZ_selinux.py
Feb 20 09:06:34 np0005625204.localdomain sudo[122720]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:35 np0005625204.localdomain python3.9[122722]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 20 09:06:35 np0005625204.localdomain sudo[122720]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:35 np0005625204.localdomain sudo[122812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqqjcltrmsejtulsbscjlofmuxtfazjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578395.640388-592-167452944085896/AnsiballZ_command.py
Feb 20 09:06:35 np0005625204.localdomain sudo[122812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:36 np0005625204.localdomain python3.9[122814]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 20 09:06:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21677 DF PROTO=TCP SPT=38876 DPT=9101 SEQ=3683062109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986C4950000000001030307) 
Feb 20 09:06:36 np0005625204.localdomain sudo[122812]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:36 np0005625204.localdomain sudo[122905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkcrarolbeizbdeexzwwotyzyvmvpfvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578396.7598436-615-125770516147348/AnsiballZ_file.py
Feb 20 09:06:36 np0005625204.localdomain sudo[122905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:37 np0005625204.localdomain python3.9[122907]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:06:37 np0005625204.localdomain sudo[122905]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:37 np0005625204.localdomain sudo[122997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfmgduvblhcskctgktopjdubypclamfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578397.4059112-639-186698508509354/AnsiballZ_mount.py
Feb 20 09:06:37 np0005625204.localdomain sudo[122997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:38 np0005625204.localdomain python3.9[122999]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 20 09:06:38 np0005625204.localdomain sudo[122997]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:39 np0005625204.localdomain sudo[123089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfdggnpvbekqycjypgwlxotnciomrxnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578399.059852-724-9534375081798/AnsiballZ_file.py
Feb 20 09:06:39 np0005625204.localdomain sudo[123089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21679 DF PROTO=TCP SPT=38876 DPT=9101 SEQ=3683062109 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986D0A80000000001030307) 
Feb 20 09:06:39 np0005625204.localdomain python3.9[123091]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:06:39 np0005625204.localdomain sudo[123089]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:40 np0005625204.localdomain sudo[123181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzkmsrvawcggeeehjlyauomudqmoaogn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578399.774818-748-52794336280186/AnsiballZ_stat.py
Feb 20 09:06:40 np0005625204.localdomain sudo[123181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:40 np0005625204.localdomain python3.9[123183]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:06:40 np0005625204.localdomain sudo[123181]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:40 np0005625204.localdomain sudo[123254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adotbqgjzfgzgbbnjuyitqkgvjcnmngy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578399.774818-748-52794336280186/AnsiballZ_copy.py
Feb 20 09:06:40 np0005625204.localdomain sudo[123254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:40 np0005625204.localdomain python3.9[123256]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578399.774818-748-52794336280186/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:06:40 np0005625204.localdomain sudo[123254]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:41 np0005625204.localdomain sudo[123346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahibhdgmbchhcxnrywmtieamowobtbbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578401.508106-820-236915220897468/AnsiballZ_stat.py
Feb 20 09:06:41 np0005625204.localdomain sudo[123346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:41 np0005625204.localdomain python3.9[123348]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:06:41 np0005625204.localdomain sudo[123346]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57772 DF PROTO=TCP SPT=46492 DPT=9105 SEQ=999787407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986DB680000000001030307) 
Feb 20 09:06:42 np0005625204.localdomain sudo[123440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuzxbcwrvgasrcyuixjibahhtutpslid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578402.5576682-858-25585996801829/AnsiballZ_getent.py
Feb 20 09:06:42 np0005625204.localdomain sudo[123440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:43 np0005625204.localdomain python3.9[123442]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 20 09:06:43 np0005625204.localdomain sudo[123440]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:43 np0005625204.localdomain sudo[123533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yeptxnuukugwscdhzzmklpotdshajvlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578403.6279857-889-25940619955710/AnsiballZ_getent.py
Feb 20 09:06:43 np0005625204.localdomain sudo[123533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:44 np0005625204.localdomain sshd[123536]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:06:44 np0005625204.localdomain sshd[123536]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:06:44 np0005625204.localdomain python3.9[123535]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 20 09:06:44 np0005625204.localdomain sudo[123533]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:44 np0005625204.localdomain sudo[123628]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pujwjgzwextheriskrphlnzfffcptofe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578404.5596786-913-153095619367301/AnsiballZ_group.py
Feb 20 09:06:44 np0005625204.localdomain sudo[123628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:45 np0005625204.localdomain python3.9[123630]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 20 09:06:45 np0005625204.localdomain groupmod[123631]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Feb 20 09:06:45 np0005625204.localdomain groupmod[123631]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Feb 20 09:06:45 np0005625204.localdomain sudo[123628]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:45 np0005625204.localdomain sudo[123726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgllwrsbnkrezmxgsjbohwbuvjrhbjsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578405.4636762-939-134214619599027/AnsiballZ_file.py
Feb 20 09:06:45 np0005625204.localdomain sudo[123726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:45 np0005625204.localdomain python3.9[123728]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 20 09:06:45 np0005625204.localdomain sudo[123726]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51570 DF PROTO=TCP SPT=38710 DPT=9100 SEQ=4250352083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986EB680000000001030307) 
Feb 20 09:06:46 np0005625204.localdomain sudo[123818]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfgdhphklrulsmrzyllyvxqakhithrfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578406.3648982-972-200364446082894/AnsiballZ_dnf.py
Feb 20 09:06:46 np0005625204.localdomain sudo[123818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:46 np0005625204.localdomain python3.9[123820]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:06:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34187 DF PROTO=TCP SPT=48446 DPT=9882 SEQ=991171327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986F11A0000000001030307) 
Feb 20 09:06:50 np0005625204.localdomain sudo[123818]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34189 DF PROTO=TCP SPT=48446 DPT=9882 SEQ=991171327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5986FD280000000001030307) 
Feb 20 09:06:50 np0005625204.localdomain sudo[123912]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qefwnsqoajyzmtqtfqtngcaakapdyper ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578410.582783-998-59766355438245/AnsiballZ_file.py
Feb 20 09:06:50 np0005625204.localdomain sudo[123912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:51 np0005625204.localdomain python3.9[123914]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:06:51 np0005625204.localdomain sudo[123912]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:51 np0005625204.localdomain sudo[124004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubyzptvtkpyhhnzvwvbfmyzedakepfjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578411.7413259-1021-277731476419526/AnsiballZ_stat.py
Feb 20 09:06:51 np0005625204.localdomain sudo[124004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:52 np0005625204.localdomain python3.9[124006]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:06:52 np0005625204.localdomain sudo[124004]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:52 np0005625204.localdomain sudo[124077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhryopoduerlzaxpnmlsjjrutvczqplw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578411.7413259-1021-277731476419526/AnsiballZ_copy.py
Feb 20 09:06:52 np0005625204.localdomain sudo[124077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:52 np0005625204.localdomain python3.9[124079]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578411.7413259-1021-277731476419526/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:06:52 np0005625204.localdomain sudo[124077]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:53 np0005625204.localdomain sudo[124169]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppmvgwewktzrbdzrtucxferbdrjogijd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578412.9660885-1066-59067834648276/AnsiballZ_systemd.py
Feb 20 09:06:53 np0005625204.localdomain sudo[124169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:53 np0005625204.localdomain python3.9[124171]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:06:53 np0005625204.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 09:06:53 np0005625204.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 20 09:06:53 np0005625204.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 20 09:06:53 np0005625204.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 20 09:06:53 np0005625204.localdomain systemd-modules-load[124175]: Module 'msr' is built in
Feb 20 09:06:53 np0005625204.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 20 09:06:53 np0005625204.localdomain sudo[124169]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:54 np0005625204.localdomain sudo[124265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txbsdxannglxznrnwruhxapfuupcedsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578414.026495-1090-122631989914202/AnsiballZ_stat.py
Feb 20 09:06:54 np0005625204.localdomain sudo[124265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:54 np0005625204.localdomain python3.9[124267]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:06:54 np0005625204.localdomain sudo[124265]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34190 DF PROTO=TCP SPT=48446 DPT=9882 SEQ=991171327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59870CE80000000001030307) 
Feb 20 09:06:54 np0005625204.localdomain sudo[124338]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cckntoaxxidyxqcjvvlqdyeyssorogbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578414.026495-1090-122631989914202/AnsiballZ_copy.py
Feb 20 09:06:54 np0005625204.localdomain sudo[124338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:55 np0005625204.localdomain python3.9[124340]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578414.026495-1090-122631989914202/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:06:55 np0005625204.localdomain sudo[124338]: pam_unix(sudo:session): session closed for user root
Feb 20 09:06:55 np0005625204.localdomain sudo[124430]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btqtbjpxgzueuozwslefzgryaaxhmlmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578415.5880094-1144-239159361007490/AnsiballZ_dnf.py
Feb 20 09:06:55 np0005625204.localdomain sudo[124430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:06:56 np0005625204.localdomain python3.9[124432]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:06:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15041 DF PROTO=TCP SPT=45392 DPT=9105 SEQ=1214227235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598717A90000000001030307) 
Feb 20 09:06:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15042 DF PROTO=TCP SPT=45392 DPT=9105 SEQ=1214227235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59871FA80000000001030307) 
Feb 20 09:07:02 np0005625204.localdomain sudo[124430]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30894 DF PROTO=TCP SPT=52968 DPT=9102 SEQ=84378607 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59872B680000000001030307) 
Feb 20 09:07:03 np0005625204.localdomain sudo[124482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:07:03 np0005625204.localdomain sudo[124482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:07:03 np0005625204.localdomain sudo[124482]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:03 np0005625204.localdomain sudo[124513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:07:03 np0005625204.localdomain sudo[124513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:07:04 np0005625204.localdomain python3.9[124555]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:07:04 np0005625204.localdomain sudo[124513]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:05 np0005625204.localdomain python3.9[124679]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 20 09:07:05 np0005625204.localdomain sudo[124680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:07:05 np0005625204.localdomain sudo[124680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:07:05 np0005625204.localdomain sudo[124680]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:05 np0005625204.localdomain python3.9[124784]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:07:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27950 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=516693089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598739C50000000001030307) 
Feb 20 09:07:06 np0005625204.localdomain sudo[124874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jahkenayhsyjtyzfgksqcfzbbjkyydyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578426.161123-1267-20797495291289/AnsiballZ_systemd.py
Feb 20 09:07:06 np0005625204.localdomain sudo[124874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:06 np0005625204.localdomain python3.9[124876]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:07:07 np0005625204.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 20 09:07:07 np0005625204.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 20 09:07:07 np0005625204.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 20 09:07:07 np0005625204.localdomain systemd[1]: tuned.service: Consumed 1.845s CPU time, no IO.
Feb 20 09:07:07 np0005625204.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 20 09:07:09 np0005625204.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 20 09:07:09 np0005625204.localdomain sudo[124874]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27952 DF PROTO=TCP SPT=40632 DPT=9101 SEQ=516693089 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598745E90000000001030307) 
Feb 20 09:07:09 np0005625204.localdomain python3.9[124979]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 20 09:07:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15044 DF PROTO=TCP SPT=45392 DPT=9105 SEQ=1214227235 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59874F680000000001030307) 
Feb 20 09:07:12 np0005625204.localdomain sudo[125069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rnjajigvrarpecgnyulimmhfxubpdnup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578432.7140024-1437-71357165136569/AnsiballZ_systemd.py
Feb 20 09:07:12 np0005625204.localdomain sudo[125069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:13 np0005625204.localdomain python3.9[125071]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:07:13 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:07:13 np0005625204.localdomain systemd-rc-local-generator[125100]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:07:13 np0005625204.localdomain systemd-sysv-generator[125104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:07:13 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:07:13 np0005625204.localdomain sudo[125069]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:13 np0005625204.localdomain sudo[125199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jigjunvzygpllbhelhvxyibrmxiwaama ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578433.6909003-1437-67624012344063/AnsiballZ_systemd.py
Feb 20 09:07:13 np0005625204.localdomain sudo[125199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:14 np0005625204.localdomain python3.9[125201]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:07:15 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:07:15 np0005625204.localdomain systemd-rc-local-generator[125227]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:07:15 np0005625204.localdomain systemd-sysv-generator[125233]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:07:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:07:15 np0005625204.localdomain sudo[125199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52830 DF PROTO=TCP SPT=57070 DPT=9100 SEQ=209070533 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59875F680000000001030307) 
Feb 20 09:07:16 np0005625204.localdomain sudo[125329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncznqfzcbmrjjdzdpjyjnwrmbouqmiif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578436.0927868-1486-40580684668751/AnsiballZ_command.py
Feb 20 09:07:16 np0005625204.localdomain sudo[125329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:16 np0005625204.localdomain python3.9[125331]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:07:16 np0005625204.localdomain sudo[125329]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:17 np0005625204.localdomain sudo[125422]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mowzpfccjouqtgckocnzpprimlqyrvtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578436.8914692-1511-180258118437911/AnsiballZ_command.py
Feb 20 09:07:17 np0005625204.localdomain sudo[125422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:17 np0005625204.localdomain python3.9[125424]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:07:17 np0005625204.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Feb 20 09:07:17 np0005625204.localdomain sudo[125422]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29473 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987664A0000000001030307) 
Feb 20 09:07:17 np0005625204.localdomain sudo[125515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmxdvygimydmkyerasrwnvihtvmcivey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578437.5764976-1534-230573564760167/AnsiballZ_command.py
Feb 20 09:07:17 np0005625204.localdomain sudo[125515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:17 np0005625204.localdomain python3.9[125517]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:07:19 np0005625204.localdomain sudo[125515]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:19 np0005625204.localdomain sudo[125614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtfmvovsbonxpqgmpjgpqgokghrltlec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578439.3276918-1558-126755076718670/AnsiballZ_command.py
Feb 20 09:07:19 np0005625204.localdomain sudo[125614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:19 np0005625204.localdomain python3.9[125616]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:07:19 np0005625204.localdomain sudo[125614]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:20 np0005625204.localdomain sudo[125707]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvxruvkeugleromspbckbenqddikzggo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578439.9893296-1582-106026276050055/AnsiballZ_systemd.py
Feb 20 09:07:20 np0005625204.localdomain sudo[125707]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:20 np0005625204.localdomain python3.9[125709]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:07:20 np0005625204.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 20 09:07:20 np0005625204.localdomain systemd[1]: Stopped Apply Kernel Variables.
Feb 20 09:07:20 np0005625204.localdomain systemd[1]: Stopping Apply Kernel Variables...
Feb 20 09:07:20 np0005625204.localdomain systemd[1]: Starting Apply Kernel Variables...
Feb 20 09:07:20 np0005625204.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 20 09:07:20 np0005625204.localdomain systemd[1]: Finished Apply Kernel Variables.
Feb 20 09:07:20 np0005625204.localdomain sudo[125707]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29475 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598772680000000001030307) 
Feb 20 09:07:21 np0005625204.localdomain sshd[119574]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:07:21 np0005625204.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Feb 20 09:07:21 np0005625204.localdomain systemd[1]: session-39.scope: Consumed 1min 59.695s CPU time.
Feb 20 09:07:21 np0005625204.localdomain systemd-logind[759]: Session 39 logged out. Waiting for processes to exit.
Feb 20 09:07:21 np0005625204.localdomain systemd-logind[759]: Removed session 39.
Feb 20 09:07:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29476 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598782280000000001030307) 
Feb 20 09:07:26 np0005625204.localdomain sshd[125729]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:07:26 np0005625204.localdomain sshd[125729]: Accepted publickey for zuul from 192.168.122.30 port 41460 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:07:26 np0005625204.localdomain systemd-logind[759]: New session 40 of user zuul.
Feb 20 09:07:26 np0005625204.localdomain systemd[1]: Started Session 40 of User zuul.
Feb 20 09:07:26 np0005625204.localdomain sshd[125729]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:07:26 np0005625204.localdomain sshd[125765]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:07:26 np0005625204.localdomain sshd[125765]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:07:27 np0005625204.localdomain python3.9[125824]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:07:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48171 DF PROTO=TCP SPT=60372 DPT=9105 SEQ=1810654222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59878CE80000000001030307) 
Feb 20 09:07:28 np0005625204.localdomain python3.9[125918]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:07:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48172 DF PROTO=TCP SPT=60372 DPT=9105 SEQ=1810654222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598794E80000000001030307) 
Feb 20 09:07:29 np0005625204.localdomain sudo[126012]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lucirnwlzajhfgqizbyfacyqmtcayzwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578449.460078-107-161821859327972/AnsiballZ_command.py
Feb 20 09:07:29 np0005625204.localdomain sudo[126012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:30 np0005625204.localdomain python3.9[126014]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:07:30 np0005625204.localdomain sudo[126012]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:31 np0005625204.localdomain python3.9[126105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:07:32 np0005625204.localdomain sudo[126199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkniafcpwxnttavrxrzmdyjqlkzkvyqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578451.9306524-167-74448723605185/AnsiballZ_setup.py
Feb 20 09:07:32 np0005625204.localdomain sudo[126199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:32 np0005625204.localdomain python3.9[126201]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:07:32 np0005625204.localdomain sudo[126199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:33 np0005625204.localdomain sudo[126253]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrpbwzvidupianpcpdbwwdywvnhuqchc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578451.9306524-167-74448723605185/AnsiballZ_dnf.py
Feb 20 09:07:33 np0005625204.localdomain sudo[126253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29477 DF PROTO=TCP SPT=47130 DPT=9882 SEQ=1828196097 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987A3690000000001030307) 
Feb 20 09:07:33 np0005625204.localdomain python3.9[126255]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:07:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24755 DF PROTO=TCP SPT=50880 DPT=9101 SEQ=807687321 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987AEF50000000001030307) 
Feb 20 09:07:37 np0005625204.localdomain sudo[126253]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:37 np0005625204.localdomain sudo[126347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnfvbbihcsetihuramqygysdfowubbvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578457.1586697-203-90605566596537/AnsiballZ_setup.py
Feb 20 09:07:37 np0005625204.localdomain sudo[126347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:37 np0005625204.localdomain python3.9[126349]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:07:38 np0005625204.localdomain sudo[126347]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:39 np0005625204.localdomain sudo[126502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtuqeigstoglymvudjjqackiqtvamwsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578458.5974388-236-203049626348464/AnsiballZ_file.py
Feb 20 09:07:39 np0005625204.localdomain sudo[126502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:39 np0005625204.localdomain python3.9[126504]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:07:39 np0005625204.localdomain sudo[126502]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24757 DF PROTO=TCP SPT=50880 DPT=9101 SEQ=807687321 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987BAE80000000001030307) 
Feb 20 09:07:39 np0005625204.localdomain sudo[126594]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upzwkktbfjmdtjwsdlkrtzthhzurwutv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578459.4576368-260-16570924204572/AnsiballZ_command.py
Feb 20 09:07:39 np0005625204.localdomain sudo[126594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:39 np0005625204.localdomain python3.9[126596]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:07:40 np0005625204.localdomain sudo[126594]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:40 np0005625204.localdomain sudo[126698]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vawmpiencdxaajzqpxmokdoiutehfxkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578460.203366-284-55435558990349/AnsiballZ_stat.py
Feb 20 09:07:40 np0005625204.localdomain sudo[126698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:40 np0005625204.localdomain python3.9[126700]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:07:40 np0005625204.localdomain sudo[126698]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:41 np0005625204.localdomain sudo[126746]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjlnvtndhgcrhaodiwwrmcfdhhldmqby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578460.203366-284-55435558990349/AnsiballZ_file.py
Feb 20 09:07:41 np0005625204.localdomain sudo[126746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:41 np0005625204.localdomain python3.9[126748]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:07:41 np0005625204.localdomain sudo[126746]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:41 np0005625204.localdomain sudo[126838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwcudnelryzfjpoucwiqikeecnwunqlx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578461.4064102-320-28069435415316/AnsiballZ_stat.py
Feb 20 09:07:41 np0005625204.localdomain sudo[126838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:41 np0005625204.localdomain python3.9[126840]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:07:41 np0005625204.localdomain sudo[126838]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48174 DF PROTO=TCP SPT=60372 DPT=9105 SEQ=1810654222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987C5680000000001030307) 
Feb 20 09:07:42 np0005625204.localdomain sudo[126911]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuudgjbqzhzrrifvxbjbwhtsrasqsczk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578461.4064102-320-28069435415316/AnsiballZ_copy.py
Feb 20 09:07:42 np0005625204.localdomain sudo[126911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:42 np0005625204.localdomain python3.9[126913]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578461.4064102-320-28069435415316/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:07:42 np0005625204.localdomain sudo[126911]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:43 np0005625204.localdomain sudo[127003]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjkyszqudkwqdjkrretwhhueskjdfooq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578462.707475-368-194969973412147/AnsiballZ_ini_file.py
Feb 20 09:07:43 np0005625204.localdomain sudo[127003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:43 np0005625204.localdomain python3.9[127005]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:07:43 np0005625204.localdomain sudo[127003]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:43 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 20 09:07:43 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:07:43 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:07:43 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:07:43 np0005625204.localdomain sudo[127096]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzgeydztzxxovzqwfuhryzfpwkxodpvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578463.4476552-368-45366227229874/AnsiballZ_ini_file.py
Feb 20 09:07:43 np0005625204.localdomain sudo[127096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:43 np0005625204.localdomain python3.9[127098]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:07:43 np0005625204.localdomain sudo[127096]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:44 np0005625204.localdomain sudo[127188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrklokxmrocqruuvrpplcmvynkmgvkby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578464.089346-368-5116533417835/AnsiballZ_ini_file.py
Feb 20 09:07:44 np0005625204.localdomain sudo[127188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:44 np0005625204.localdomain python3.9[127190]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:07:44 np0005625204.localdomain sudo[127188]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:44 np0005625204.localdomain sudo[127280]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dazupgxzoscofrzrydpwzebccexmqlud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578464.6382163-368-199457755287703/AnsiballZ_ini_file.py
Feb 20 09:07:44 np0005625204.localdomain sudo[127280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:45 np0005625204.localdomain python3.9[127282]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:07:45 np0005625204.localdomain sudo[127280]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:46 np0005625204.localdomain python3.9[127372]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:07:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21097 DF PROTO=TCP SPT=34696 DPT=9102 SEQ=909875513 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987D5680000000001030307) 
Feb 20 09:07:46 np0005625204.localdomain sudo[127464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbuiirovnrfdsuwdknvffxrbbxqlpvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578466.2745054-489-232910219931743/AnsiballZ_dnf.py
Feb 20 09:07:46 np0005625204.localdomain sudo[127464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:46 np0005625204.localdomain python3.9[127466]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:07:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27106 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987DB7A0000000001030307) 
Feb 20 09:07:50 np0005625204.localdomain sudo[127464]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:50 np0005625204.localdomain sudo[127558]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmcbmgleepmypkjdvvrntnpqurfrbxsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578470.4740927-512-39376361354189/AnsiballZ_dnf.py
Feb 20 09:07:50 np0005625204.localdomain sudo[127558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27108 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987E7680000000001030307) 
Feb 20 09:07:50 np0005625204.localdomain python3.9[127560]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:07:54 np0005625204.localdomain sudo[127558]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27109 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5987F7290000000001030307) 
Feb 20 09:07:55 np0005625204.localdomain sudo[127652]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmsnzfpncktewjoyuizfrobrsvxrqifh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578474.7457817-542-148527406802319/AnsiballZ_dnf.py
Feb 20 09:07:55 np0005625204.localdomain sudo[127652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:55 np0005625204.localdomain python3.9[127654]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:07:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25660 DF PROTO=TCP SPT=57916 DPT=9105 SEQ=3359371827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598802280000000001030307) 
Feb 20 09:07:58 np0005625204.localdomain sudo[127652]: pam_unix(sudo:session): session closed for user root
Feb 20 09:07:59 np0005625204.localdomain sudo[127752]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gopmdmdvxhfxnveqwucpycnmercdleor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578479.1592305-569-228885912635084/AnsiballZ_dnf.py
Feb 20 09:07:59 np0005625204.localdomain sudo[127752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:07:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25661 DF PROTO=TCP SPT=57916 DPT=9105 SEQ=3359371827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59880A280000000001030307) 
Feb 20 09:07:59 np0005625204.localdomain python3.9[127754]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:08:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27110 DF PROTO=TCP SPT=33858 DPT=9882 SEQ=1491845009 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598817680000000001030307) 
Feb 20 09:08:03 np0005625204.localdomain sudo[127752]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:03 np0005625204.localdomain sudo[127846]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpzmcxzcqvcogozihstiwdpjiclxjtnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578483.597445-605-94817474021823/AnsiballZ_dnf.py
Feb 20 09:08:03 np0005625204.localdomain sudo[127846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:04 np0005625204.localdomain python3.9[127848]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:08:05 np0005625204.localdomain sudo[127851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:08:05 np0005625204.localdomain sudo[127851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:08:05 np0005625204.localdomain sudo[127851]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:05 np0005625204.localdomain sudo[127866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:08:05 np0005625204.localdomain sudo[127866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:08:05 np0005625204.localdomain sudo[127866]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57223 DF PROTO=TCP SPT=39340 DPT=9101 SEQ=3203763100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598824250000000001030307) 
Feb 20 09:08:07 np0005625204.localdomain sudo[127846]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:08 np0005625204.localdomain sudo[128002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzsdocnsiavgtnlndptzhsgdtmhwueni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578487.946495-632-232149690369413/AnsiballZ_dnf.py
Feb 20 09:08:08 np0005625204.localdomain sudo[128002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:08 np0005625204.localdomain python3.9[128004]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:08:09 np0005625204.localdomain sshd[128007]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:08:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57225 DF PROTO=TCP SPT=39340 DPT=9101 SEQ=3203763100 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598830280000000001030307) 
Feb 20 09:08:09 np0005625204.localdomain sshd[128007]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:08:09 np0005625204.localdomain sudo[128009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:08:09 np0005625204.localdomain sudo[128009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:08:09 np0005625204.localdomain sudo[128009]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25663 DF PROTO=TCP SPT=57916 DPT=9105 SEQ=3359371827 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598839680000000001030307) 
Feb 20 09:08:11 np0005625204.localdomain sudo[128002]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:12 np0005625204.localdomain sudo[128113]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oojzmsjiwrrecuabxmsxgvrmqziqmxhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578492.1753583-659-7163605155047/AnsiballZ_dnf.py
Feb 20 09:08:12 np0005625204.localdomain sudo[128113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:12 np0005625204.localdomain python3.9[128115]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:08:13 np0005625204.localdomain sshd[128117]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:08:14 np0005625204.localdomain sshd[128117]: Invalid user solana from 45.148.10.240 port 45948
Feb 20 09:08:14 np0005625204.localdomain sshd[128117]: Connection closed by invalid user solana 45.148.10.240 port 45948 [preauth]
Feb 20 09:08:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10558 DF PROTO=TCP SPT=33098 DPT=9102 SEQ=2659812165 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598849680000000001030307) 
Feb 20 09:08:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5348 DF PROTO=TCP SPT=33244 DPT=9882 SEQ=2025381304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598850AA0000000001030307) 
Feb 20 09:08:20 np0005625204.localdomain sshd[128130]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:08:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5350 DF PROTO=TCP SPT=33244 DPT=9882 SEQ=2025381304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59885CA80000000001030307) 
Feb 20 09:08:20 np0005625204.localdomain sshd[128130]: Invalid user claude from 96.78.175.36 port 50236
Feb 20 09:08:21 np0005625204.localdomain sshd[128130]: Received disconnect from 96.78.175.36 port 50236:11: Bye Bye [preauth]
Feb 20 09:08:21 np0005625204.localdomain sshd[128130]: Disconnected from invalid user claude 96.78.175.36 port 50236 [preauth]
Feb 20 09:08:23 np0005625204.localdomain sudo[128113]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5351 DF PROTO=TCP SPT=33244 DPT=9882 SEQ=2025381304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59886C680000000001030307) 
Feb 20 09:08:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34420 DF PROTO=TCP SPT=58272 DPT=9105 SEQ=1369561492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598877690000000001030307) 
Feb 20 09:08:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34421 DF PROTO=TCP SPT=58272 DPT=9105 SEQ=1369561492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59887F680000000001030307) 
Feb 20 09:08:31 np0005625204.localdomain sudo[128285]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdudclocpggoelbxrixjazcmwepbcybk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578511.673822-686-73578318014470/AnsiballZ_dnf.py
Feb 20 09:08:31 np0005625204.localdomain sudo[128285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:32 np0005625204.localdomain python3.9[128287]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:08:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14193 DF PROTO=TCP SPT=33326 DPT=9100 SEQ=2745313112 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59888B690000000001030307) 
Feb 20 09:08:35 np0005625204.localdomain sudo[128285]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:36 np0005625204.localdomain sudo[128380]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abfmelnmboixxnfkdkbbmcnqidvegbxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578515.929067-716-52630528399075/AnsiballZ_dnf.py
Feb 20 09:08:36 np0005625204.localdomain sudo[128380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33125 DF PROTO=TCP SPT=60058 DPT=9101 SEQ=1362994008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598899550000000001030307) 
Feb 20 09:08:36 np0005625204.localdomain python3.9[128382]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['device-mapper-multipath'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:08:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33127 DF PROTO=TCP SPT=60058 DPT=9101 SEQ=1362994008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988A5680000000001030307) 
Feb 20 09:08:39 np0005625204.localdomain sudo[128380]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:40 np0005625204.localdomain sudo[128477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qohuiylnqibqsxdrcuxtfrrjvbesince ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578520.1057332-750-209887362937251/AnsiballZ_file.py
Feb 20 09:08:40 np0005625204.localdomain sudo[128477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:40 np0005625204.localdomain python3.9[128479]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:08:40 np0005625204.localdomain sudo[128477]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:41 np0005625204.localdomain sudo[128582]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ouuraafzcnnqakhxxxybbjxyemueopob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578520.8020444-773-209109761705897/AnsiballZ_stat.py
Feb 20 09:08:41 np0005625204.localdomain sudo[128582]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:41 np0005625204.localdomain python3.9[128584]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:08:41 np0005625204.localdomain sudo[128582]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:41 np0005625204.localdomain sudo[128655]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hssrjlfxdsczxiajfbmlcjuocnzhemsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578520.8020444-773-209109761705897/AnsiballZ_copy.py
Feb 20 09:08:41 np0005625204.localdomain sudo[128655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:41 np0005625204.localdomain python3.9[128657]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1771578520.8020444-773-209109761705897/.source.json _original_basename=.cbwcwmdf follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:08:41 np0005625204.localdomain sudo[128655]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34423 DF PROTO=TCP SPT=58272 DPT=9105 SEQ=1369561492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988AF680000000001030307) 
Feb 20 09:08:43 np0005625204.localdomain sudo[128747]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azbeffgcbwxxgjvqqteqqyrdblggegpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578522.2851079-827-261789381034846/AnsiballZ_podman_image.py
Feb 20 09:08:43 np0005625204.localdomain sudo[128747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:43 np0005625204.localdomain python3.9[128749]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:08:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45207 DF PROTO=TCP SPT=54542 DPT=9102 SEQ=247792192 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988BF680000000001030307) 
Feb 20 09:08:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41054 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988C5DA0000000001030307) 
Feb 20 09:08:49 np0005625204.localdomain podman[128762]: 2026-02-20 09:08:43.466791935 +0000 UTC m=+0.048438438 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 20 09:08:50 np0005625204.localdomain sudo[128747]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:50 np0005625204.localdomain sudo[128961]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwnmrqupzekocedpjrtaahdgcojjrujk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578530.4711819-860-117273186582329/AnsiballZ_podman_image.py
Feb 20 09:08:50 np0005625204.localdomain sudo[128961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:08:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41056 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988D1E80000000001030307) 
Feb 20 09:08:50 np0005625204.localdomain python3.9[128963]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:08:51 np0005625204.localdomain sshd[128988]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:08:53 np0005625204.localdomain sshd[128999]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:08:53 np0005625204.localdomain sshd[128988]: Invalid user cma from 27.112.79.3 port 43604
Feb 20 09:08:53 np0005625204.localdomain sshd[128999]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:08:53 np0005625204.localdomain sshd[128988]: Received disconnect from 27.112.79.3 port 43604:11: Bye Bye [preauth]
Feb 20 09:08:53 np0005625204.localdomain sshd[128988]: Disconnected from invalid user cma 27.112.79.3 port 43604 [preauth]
Feb 20 09:08:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41057 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988E1A80000000001030307) 
Feb 20 09:08:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39695 DF PROTO=TCP SPT=43888 DPT=9105 SEQ=1885777334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988ECA80000000001030307) 
Feb 20 09:08:58 np0005625204.localdomain podman[128975]: 2026-02-20 09:08:51.094113643 +0000 UTC m=+0.044720713 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:08:59 np0005625204.localdomain sudo[128961]: pam_unix(sudo:session): session closed for user root
Feb 20 09:08:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39696 DF PROTO=TCP SPT=43888 DPT=9105 SEQ=1885777334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5988F4A80000000001030307) 
Feb 20 09:08:59 np0005625204.localdomain sudo[129178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mycqlqsccpsjkbtjhccnmihdbclkigxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578539.5885625-891-30061168924728/AnsiballZ_podman_image.py
Feb 20 09:08:59 np0005625204.localdomain sudo[129178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:00 np0005625204.localdomain python3.9[129180]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:09:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41058 DF PROTO=TCP SPT=43172 DPT=9882 SEQ=2963501716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598901690000000001030307) 
Feb 20 09:09:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=892 DF PROTO=TCP SPT=38760 DPT=9101 SEQ=1623084561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59890E840000000001030307) 
Feb 20 09:09:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=894 DF PROTO=TCP SPT=38760 DPT=9101 SEQ=1623084561 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59891AA80000000001030307) 
Feb 20 09:09:09 np0005625204.localdomain sudo[129756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:09:09 np0005625204.localdomain sudo[129756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:09:09 np0005625204.localdomain sudo[129756]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:10 np0005625204.localdomain sudo[129771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:09:10 np0005625204.localdomain sudo[129771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:09:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39698 DF PROTO=TCP SPT=43888 DPT=9105 SEQ=1885777334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598925680000000001030307) 
Feb 20 09:09:12 np0005625204.localdomain podman[129193]: 2026-02-20 09:09:00.185136516 +0000 UTC m=+0.045695875 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:09:12 np0005625204.localdomain sudo[129771]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:12 np0005625204.localdomain sudo[129178]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:12 np0005625204.localdomain sudo[129866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:09:12 np0005625204.localdomain sudo[129866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:09:12 np0005625204.localdomain sudo[129866]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:13 np0005625204.localdomain sudo[129895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:09:13 np0005625204.localdomain sudo[129895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:09:13 np0005625204.localdomain sudo[129895]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38454 DF PROTO=TCP SPT=55242 DPT=9102 SEQ=2644966875 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598935680000000001030307) 
Feb 20 09:09:17 np0005625204.localdomain sudo[130004]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfwirbyzpswhnkpjpvkuzkfaiqbrhgok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578556.9264476-917-100959214447793/AnsiballZ_podman_image.py
Feb 20 09:09:17 np0005625204.localdomain sudo[130004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:17 np0005625204.localdomain python3.9[130006]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:09:17 np0005625204.localdomain sudo[130032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:09:17 np0005625204.localdomain sudo[130032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:09:17 np0005625204.localdomain sudo[130032]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8363 DF PROTO=TCP SPT=56128 DPT=9882 SEQ=1571366773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59893B0B0000000001030307) 
Feb 20 09:09:19 np0005625204.localdomain podman[130019]: 2026-02-20 09:09:17.525113536 +0000 UTC m=+0.031527596 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 20 09:09:19 np0005625204.localdomain sudo[130004]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:19 np0005625204.localdomain sudo[130197]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgvyjxkhqalxrqmdwnaixxnkptimyqkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578559.6977465-944-185617823388459/AnsiballZ_podman_image.py
Feb 20 09:09:19 np0005625204.localdomain sudo[130197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:20 np0005625204.localdomain python3.9[130199]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:09:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8365 DF PROTO=TCP SPT=56128 DPT=9882 SEQ=1571366773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598947280000000001030307) 
Feb 20 09:09:21 np0005625204.localdomain podman[130213]: 2026-02-20 09:09:20.343092889 +0000 UTC m=+0.048204071 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:09:21 np0005625204.localdomain sudo[130197]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:22 np0005625204.localdomain sudo[130374]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdtzwyhjvgpunqqvsovaqwzkiasrqhzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578562.240645-971-206320424544477/AnsiballZ_podman_image.py
Feb 20 09:09:22 np0005625204.localdomain sudo[130374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:22 np0005625204.localdomain python3.9[130376]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:09:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8366 DF PROTO=TCP SPT=56128 DPT=9882 SEQ=1571366773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598956E90000000001030307) 
Feb 20 09:09:26 np0005625204.localdomain podman[130388]: 2026-02-20 09:09:22.860440432 +0000 UTC m=+0.045870869 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 20 09:09:26 np0005625204.localdomain sudo[130374]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:27 np0005625204.localdomain sudo[130563]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbjygwrqcvauccrzyozrfcrcyibsgcsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578566.8727846-971-245322875033952/AnsiballZ_podman_image.py
Feb 20 09:09:27 np0005625204.localdomain sudo[130563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:27 np0005625204.localdomain python3.9[130565]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 20 09:09:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40299 DF PROTO=TCP SPT=59724 DPT=9105 SEQ=2635809123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598961A80000000001030307) 
Feb 20 09:09:29 np0005625204.localdomain podman[130577]: 2026-02-20 09:09:27.521759129 +0000 UTC m=+0.046061545 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Feb 20 09:09:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40300 DF PROTO=TCP SPT=59724 DPT=9105 SEQ=2635809123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598969A90000000001030307) 
Feb 20 09:09:29 np0005625204.localdomain sudo[130563]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:32 np0005625204.localdomain sshd[125729]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:09:32 np0005625204.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Feb 20 09:09:32 np0005625204.localdomain systemd[1]: session-40.scope: Consumed 2min 7.898s CPU time.
Feb 20 09:09:32 np0005625204.localdomain systemd-logind[759]: Session 40 logged out. Waiting for processes to exit.
Feb 20 09:09:32 np0005625204.localdomain systemd-logind[759]: Removed session 40.
Feb 20 09:09:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45354 DF PROTO=TCP SPT=36634 DPT=9100 SEQ=2573324998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598975680000000001030307) 
Feb 20 09:09:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34568 DF PROTO=TCP SPT=42508 DPT=9101 SEQ=3887936909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598983B70000000001030307) 
Feb 20 09:09:37 np0005625204.localdomain sshd[130686]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:09:37 np0005625204.localdomain sshd[130688]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:09:37 np0005625204.localdomain sshd[130686]: Accepted publickey for zuul from 192.168.122.30 port 37866 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:09:37 np0005625204.localdomain systemd-logind[759]: New session 41 of user zuul.
Feb 20 09:09:37 np0005625204.localdomain systemd[1]: Started Session 41 of User zuul.
Feb 20 09:09:37 np0005625204.localdomain sshd[130686]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:09:37 np0005625204.localdomain sshd[130688]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:09:38 np0005625204.localdomain python3.9[130781]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:09:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34570 DF PROTO=TCP SPT=42508 DPT=9101 SEQ=3887936909 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59898FA80000000001030307) 
Feb 20 09:09:39 np0005625204.localdomain sudo[130875]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdlmwbqactwwfxjmzkdbjpdwzabhkdrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578579.3117437-68-96718944164659/AnsiballZ_getent.py
Feb 20 09:09:39 np0005625204.localdomain sudo[130875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:39 np0005625204.localdomain python3.9[130877]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 20 09:09:39 np0005625204.localdomain sudo[130875]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:40 np0005625204.localdomain sudo[130968]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujehwwdshlqwmhhrkpmxvvecqtahzedo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578580.3841605-104-1544876297117/AnsiballZ_setup.py
Feb 20 09:09:40 np0005625204.localdomain sudo[130968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:41 np0005625204.localdomain python3.9[130970]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:09:41 np0005625204.localdomain sudo[130968]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:41 np0005625204.localdomain sudo[131022]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyafislcsramispmsgjhojpbaawpjpjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578580.3841605-104-1544876297117/AnsiballZ_dnf.py
Feb 20 09:09:41 np0005625204.localdomain sudo[131022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40302 DF PROTO=TCP SPT=59724 DPT=9105 SEQ=2635809123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598999680000000001030307) 
Feb 20 09:09:42 np0005625204.localdomain python3.9[131024]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:09:45 np0005625204.localdomain sudo[131022]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63254 DF PROTO=TCP SPT=33380 DPT=9100 SEQ=2575278553 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989A9680000000001030307) 
Feb 20 09:09:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:09:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:09:47 np0005625204.localdomain sudo[131116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etxyucdlphrobrzbheeltjsqvmzsquln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578586.8567305-146-121296680594936/AnsiballZ_dnf.py
Feb 20 09:09:47 np0005625204.localdomain sudo[131116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:47 np0005625204.localdomain python3.9[131118]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:09:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13417 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989B03B0000000001030307) 
Feb 20 09:09:50 np0005625204.localdomain sudo[131116]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13419 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989BC280000000001030307) 
Feb 20 09:09:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:09:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:09:51 np0005625204.localdomain sudo[131210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztbkbffbsftpilozwvmpgmdwslnouytw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578590.8436215-170-168266420762910/AnsiballZ_systemd.py
Feb 20 09:09:51 np0005625204.localdomain sudo[131210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:51 np0005625204.localdomain python3.9[131212]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:09:51 np0005625204.localdomain sudo[131210]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:52 np0005625204.localdomain python3.9[131305]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:09:53 np0005625204.localdomain sudo[131395]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usrpdbyuucwcqruzndibyyefacsdipki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578592.90969-227-185399694652401/AnsiballZ_sefcontext.py
Feb 20 09:09:53 np0005625204.localdomain sudo[131395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:53 np0005625204.localdomain python3.9[131397]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 20 09:09:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13420 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989CBE80000000001030307) 
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  Converting 2756 SID table entries...
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:09:55 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:09:55 np0005625204.localdomain sudo[131395]: pam_unix(sudo:session): session closed for user root
Feb 20 09:09:56 np0005625204.localdomain python3.9[131809]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:09:57 np0005625204.localdomain sudo[131905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kajenqtvmqdzqrnqkdwahxaaivufmxly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578597.090866-281-90420177163591/AnsiballZ_dnf.py
Feb 20 09:09:57 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Feb 20 09:09:57 np0005625204.localdomain sudo[131905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:09:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23474 DF PROTO=TCP SPT=33482 DPT=9105 SEQ=2218267951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989D6E90000000001030307) 
Feb 20 09:09:57 np0005625204.localdomain python3.9[131907]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:09:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23475 DF PROTO=TCP SPT=33482 DPT=9105 SEQ=2218267951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989DEE80000000001030307) 
Feb 20 09:10:01 np0005625204.localdomain sudo[131905]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:01 np0005625204.localdomain sudo[131999]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjlavhwmykamhbggkhatwbaokqbuirsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578601.1775134-306-65371832359626/AnsiballZ_command.py
Feb 20 09:10:01 np0005625204.localdomain sudo[131999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:01 np0005625204.localdomain python3.9[132001]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:10:02 np0005625204.localdomain sudo[131999]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13421 DF PROTO=TCP SPT=33660 DPT=9882 SEQ=2916413918 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989EB680000000001030307) 
Feb 20 09:10:03 np0005625204.localdomain sudo[132244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xavbyckollimcajwvjsrdwlphagszhta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578602.727829-330-167583867905272/AnsiballZ_file.py
Feb 20 09:10:03 np0005625204.localdomain sudo[132244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:03 np0005625204.localdomain python3.9[132246]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 20 09:10:03 np0005625204.localdomain sudo[132244]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:04 np0005625204.localdomain python3.9[132336]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:10:05 np0005625204.localdomain sudo[132428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nurgaefvyfzpyfanlhxodkywcspnhjze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578605.2905107-383-177723127619430/AnsiballZ_dnf.py
Feb 20 09:10:05 np0005625204.localdomain sudo[132428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:05 np0005625204.localdomain python3.9[132430]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:10:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20041 DF PROTO=TCP SPT=48136 DPT=9101 SEQ=2280697435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5989F8E50000000001030307) 
Feb 20 09:10:09 np0005625204.localdomain sudo[132428]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20043 DF PROTO=TCP SPT=48136 DPT=9101 SEQ=2280697435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A04E90000000001030307) 
Feb 20 09:10:10 np0005625204.localdomain sudo[132522]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaraetglkwhvjcbdtzghnedqriijrypw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578610.347671-407-266699074987363/AnsiballZ_dnf.py
Feb 20 09:10:10 np0005625204.localdomain sudo[132522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:10 np0005625204.localdomain python3.9[132524]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:10:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23477 DF PROTO=TCP SPT=33482 DPT=9105 SEQ=2218267951 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A0F680000000001030307) 
Feb 20 09:10:14 np0005625204.localdomain sudo[132522]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:14 np0005625204.localdomain sudo[132616]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdczynghoztzrpvlcatkhzbniizckarc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578614.329483-432-188055786208956/AnsiballZ_systemd.py
Feb 20 09:10:14 np0005625204.localdomain sudo[132616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:14 np0005625204.localdomain python3.9[132618]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 09:10:14 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:10:15 np0005625204.localdomain systemd-rc-local-generator[132650]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:10:15 np0005625204.localdomain systemd-sysv-generator[132653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:10:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:10:15 np0005625204.localdomain sudo[132616]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:15 np0005625204.localdomain sudo[132748]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgbshfzpciourupoxyudenizgjqbfjwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578615.6073987-461-166421562893704/AnsiballZ_stat.py
Feb 20 09:10:15 np0005625204.localdomain sudo[132748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:16 np0005625204.localdomain python3.9[132750]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:10:16 np0005625204.localdomain sudo[132748]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44494 DF PROTO=TCP SPT=48560 DPT=9102 SEQ=3346129564 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A1F680000000001030307) 
Feb 20 09:10:16 np0005625204.localdomain sudo[132840]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izucrmaosayksmgsaypohnqpbgayjmyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578616.3387475-489-122046364769770/AnsiballZ_ini_file.py
Feb 20 09:10:16 np0005625204.localdomain sudo[132840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:16 np0005625204.localdomain python3.9[132842]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:16 np0005625204.localdomain sudo[132840]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:17 np0005625204.localdomain sudo[132934]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjvdofpwsgzselyuoeunlvqnpberranz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578617.111091-512-3244404762986/AnsiballZ_ini_file.py
Feb 20 09:10:17 np0005625204.localdomain sudo[132934]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:17 np0005625204.localdomain python3.9[132936]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:17 np0005625204.localdomain sudo[132934]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3305 DF PROTO=TCP SPT=40502 DPT=9882 SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A256A0000000001030307) 
Feb 20 09:10:17 np0005625204.localdomain sudo[132951]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:10:17 np0005625204.localdomain sudo[132951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:10:17 np0005625204.localdomain sudo[132951]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:17 np0005625204.localdomain sudo[132966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:10:17 np0005625204.localdomain sudo[132966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:10:18 np0005625204.localdomain sudo[133056]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbrdhyczwnsmnfcrmudpwgvsdrkoufrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578617.783941-536-89037973858118/AnsiballZ_ini_file.py
Feb 20 09:10:18 np0005625204.localdomain sudo[133056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:18 np0005625204.localdomain python3.9[133059]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:18 np0005625204.localdomain sudo[133056]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:18 np0005625204.localdomain sudo[132966]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:18 np0005625204.localdomain sudo[133105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:10:18 np0005625204.localdomain sudo[133105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:10:18 np0005625204.localdomain sudo[133105]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:18 np0005625204.localdomain sudo[133206]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzlnmrqahhmcucnbdcewyuxchsfpacmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578618.5026045-566-147557596304841/AnsiballZ_stat.py
Feb 20 09:10:18 np0005625204.localdomain sudo[133206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:18 np0005625204.localdomain sudo[133187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- inventory --format=json-pretty --filter-for-batch
Feb 20 09:10:18 np0005625204.localdomain sudo[133187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:10:18 np0005625204.localdomain python3.9[133211]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:10:18 np0005625204.localdomain sudo[133206]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 2026-02-20 09:10:19.384350869 +0000 UTC m=+0.081585193 container create 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:10:19 np0005625204.localdomain sudo[133337]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxdvlbtevrebjfnaqjjqghccqookpqoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578618.5026045-566-147557596304841/AnsiballZ_copy.py
Feb 20 09:10:19 np0005625204.localdomain sudo[133337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: Started libpod-conmon-31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e.scope.
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 2026-02-20 09:10:19.35431234 +0000 UTC m=+0.051546694 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 2026-02-20 09:10:19.478843421 +0000 UTC m=+0.176077725 container init 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: tmp-crun.FK723U.mount: Deactivated successfully.
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 2026-02-20 09:10:19.492956977 +0000 UTC m=+0.190191291 container start 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_BRANCH=main, version=7, build-date=2026-02-09T10:25:24Z, ceph=True)
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 2026-02-20 09:10:19.493325968 +0000 UTC m=+0.190560322 container attach 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, vcs-type=git, release=1770267347, RELEASE=main, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:10:19 np0005625204.localdomain upbeat_bouman[133341]: 167 167
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: libpod-31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e.scope: Deactivated successfully.
Feb 20 09:10:19 np0005625204.localdomain podman[133293]: 2026-02-20 09:10:19.49856067 +0000 UTC m=+0.195795004 container died 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, io.openshift.tags=rhceph ceph, release=1770267347, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, name=rhceph, vcs-type=git, GIT_BRANCH=main, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:10:19 np0005625204.localdomain podman[133346]: 2026-02-20 09:10:19.596746266 +0000 UTC m=+0.085977940 container remove 31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_bouman, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, name=rhceph, version=7, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: libpod-conmon-31b9c1ee40dc52399af5cdb1deb7899d3a56c162458879eb3454651dc3cce25e.scope: Deactivated successfully.
Feb 20 09:10:19 np0005625204.localdomain python3.9[133340]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578618.5026045-566-147557596304841/.source _original_basename=.o399jm0v follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:19 np0005625204.localdomain sudo[133337]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:19 np0005625204.localdomain podman[133381]: 
Feb 20 09:10:19 np0005625204.localdomain podman[133381]: 2026-02-20 09:10:19.830579325 +0000 UTC m=+0.077344022 container create 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, io.openshift.expose-services=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.buildah.version=1.42.2, GIT_CLEAN=True, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc.)
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: Started libpod-conmon-99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941.scope.
Feb 20 09:10:19 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:10:19 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 09:10:19 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:10:19 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 09:10:19 np0005625204.localdomain podman[133381]: 2026-02-20 09:10:19.799514825 +0000 UTC m=+0.046279532 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:10:19 np0005625204.localdomain podman[133381]: 2026-02-20 09:10:19.90186471 +0000 UTC m=+0.148629417 container init 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, GIT_CLEAN=True, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, ceph=True, distribution-scope=public, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, version=7, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64)
Feb 20 09:10:19 np0005625204.localdomain podman[133381]: 2026-02-20 09:10:19.911461406 +0000 UTC m=+0.158226113 container start 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vendor=Red Hat, Inc., distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2)
Feb 20 09:10:19 np0005625204.localdomain podman[133381]: 2026-02-20 09:10:19.911733814 +0000 UTC m=+0.158498561 container attach 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:10:20 np0005625204.localdomain sudo[133477]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxomxokuqtncglxucacxtojpoarvsdov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578619.8278656-611-236821461014508/AnsiballZ_file.py
Feb 20 09:10:20 np0005625204.localdomain sudo[133477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:20 np0005625204.localdomain python3.9[133480]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:20 np0005625204.localdomain sudo[133477]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-62dd6aaec95394fbd7fb576113f27cc341e7316b5770e2202ab8860a7447ff48-merged.mount: Deactivated successfully.
Feb 20 09:10:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3307 DF PROTO=TCP SPT=40502 DPT=9882 SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A31680000000001030307) 
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]: [
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:     {
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "available": false,
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "ceph_device": false,
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "lsm_data": {},
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "lvs": [],
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "path": "/dev/sr0",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "rejected_reasons": [
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "Has a FileSystem",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "Insufficient space (<5GB)"
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         ],
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         "sys_api": {
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "actuators": null,
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "device_nodes": "sr0",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "human_readable_size": "482.00 KB",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "id_bus": "ata",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "model": "QEMU DVD-ROM",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "nr_requests": "2",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "partitions": {},
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "path": "/dev/sr0",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "removable": "1",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "rev": "2.5+",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "ro": "0",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "rotational": "1",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "sas_address": "",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "sas_device_handle": "",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "scheduler_mode": "mq-deadline",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "sectors": 0,
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "sectorsize": "2048",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "size": 493568.0,
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "support_discard": "0",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "type": "disk",
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:             "vendor": "QEMU"
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:         }
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]:     }
Feb 20 09:10:20 np0005625204.localdomain festive_jang[133423]: ]
Feb 20 09:10:20 np0005625204.localdomain systemd[1]: libpod-99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941.scope: Deactivated successfully.
Feb 20 09:10:20 np0005625204.localdomain podman[133381]: 2026-02-20 09:10:20.81658993 +0000 UTC m=+1.063354657 container died 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7)
Feb 20 09:10:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d57bd2fe32e222dbc5c308ee093f37e64dcec9203f58e66675db0d4a2bc1dfe1-merged.mount: Deactivated successfully.
Feb 20 09:10:20 np0005625204.localdomain podman[135103]: 2026-02-20 09:10:20.934968441 +0000 UTC m=+0.102037237 container remove 99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_jang, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:10:20 np0005625204.localdomain systemd[1]: libpod-conmon-99a347983ea28ec9fd8f2591a368a16f8c5ecd5a0a428c62c6989e1788e88941.scope: Deactivated successfully.
Feb 20 09:10:20 np0005625204.localdomain sudo[135147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tobxccqtcusvmvpmhpigsvhpmtxuqocg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578620.5032485-635-135478944591006/AnsiballZ_edpm_os_net_config_mappings.py
Feb 20 09:10:20 np0005625204.localdomain sudo[135147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:20 np0005625204.localdomain sudo[133187]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:21 np0005625204.localdomain python3.9[135149]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 20 09:10:21 np0005625204.localdomain sudo[135147]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:21 np0005625204.localdomain sudo[135164]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:10:21 np0005625204.localdomain sudo[135164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:10:21 np0005625204.localdomain sudo[135164]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:21 np0005625204.localdomain sudo[135254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiteywdcgailhrvznkamgksqrmnqpxxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578621.5652497-662-246530072482506/AnsiballZ_file.py
Feb 20 09:10:21 np0005625204.localdomain sudo[135254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:21 np0005625204.localdomain python3.9[135256]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:22 np0005625204.localdomain sudo[135254]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:22 np0005625204.localdomain sudo[135346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uskdbiddlsvlgggobvelevozfxwepowy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578622.3412344-692-239749894350298/AnsiballZ_stat.py
Feb 20 09:10:22 np0005625204.localdomain sudo[135346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:22 np0005625204.localdomain python3.9[135348]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:10:22 np0005625204.localdomain sudo[135346]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:22 np0005625204.localdomain sshd[135349]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:10:23 np0005625204.localdomain sshd[135349]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:10:23 np0005625204.localdomain sudo[135421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbxmhsuhiwwqchirkpaixellbccmkngu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578622.3412344-692-239749894350298/AnsiballZ_copy.py
Feb 20 09:10:23 np0005625204.localdomain sudo[135421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:23 np0005625204.localdomain python3.9[135423]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578622.3412344-692-239749894350298/.source.yaml _original_basename=.wwkg5hhx follow=False checksum=4c28d1662755c608a6ffaa942e27a2488c0a78a3 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:23 np0005625204.localdomain sudo[135421]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:23 np0005625204.localdomain sudo[135513]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmyuyclrvsxpwzizcbnawvpjszufqeca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578623.583747-737-158684465425097/AnsiballZ_slurp.py
Feb 20 09:10:23 np0005625204.localdomain sudo[135513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:24 np0005625204.localdomain python3.9[135515]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 20 09:10:24 np0005625204.localdomain sudo[135513]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3308 DF PROTO=TCP SPT=40502 DPT=9882 SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A41280000000001030307) 
Feb 20 09:10:25 np0005625204.localdomain sudo[135618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wnnpdtpowvosyuuyzxkticjvvrntxoqz ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578624.5583212-764-118499776117482/async_wrapper.py j558637067917 300 /home/zuul/.ansible/tmp/ansible-tmp-1771578624.5583212-764-118499776117482/AnsiballZ_edpm_os_net_config.py _
Feb 20 09:10:25 np0005625204.localdomain sudo[135618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:25 np0005625204.localdomain ansible-async_wrapper.py[135620]: Invoked with j558637067917 300 /home/zuul/.ansible/tmp/ansible-tmp-1771578624.5583212-764-118499776117482/AnsiballZ_edpm_os_net_config.py _
Feb 20 09:10:25 np0005625204.localdomain ansible-async_wrapper.py[135623]: Starting module and watcher
Feb 20 09:10:25 np0005625204.localdomain ansible-async_wrapper.py[135623]: Start watching 135624 (300)
Feb 20 09:10:25 np0005625204.localdomain ansible-async_wrapper.py[135624]: Start module (135624)
Feb 20 09:10:25 np0005625204.localdomain ansible-async_wrapper.py[135620]: Return async_wrapper task started.
Feb 20 09:10:25 np0005625204.localdomain sudo[135618]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:25 np0005625204.localdomain python3.9[135625]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True remove_config=False safe_defaults=False use_nmstate=False purge_provider=
Feb 20 09:10:26 np0005625204.localdomain ansible-async_wrapper.py[135624]: Module complete (135624)
Feb 20 09:10:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45410 DF PROTO=TCP SPT=47560 DPT=9105 SEQ=2513380928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A4C290000000001030307) 
Feb 20 09:10:28 np0005625204.localdomain sudo[135727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcqffuxyzyrkhvmvzpxpmglmbqcnlxhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578628.550157-764-93324610570867/AnsiballZ_async_status.py
Feb 20 09:10:28 np0005625204.localdomain sudo[135727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:29 np0005625204.localdomain python3.9[135729]: ansible-ansible.legacy.async_status Invoked with jid=j558637067917.135620 mode=status _async_dir=/root/.ansible_async
Feb 20 09:10:29 np0005625204.localdomain sudo[135727]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:29 np0005625204.localdomain sudo[135786]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgzesivzhortqohudhqeiaqdpkteusmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578628.550157-764-93324610570867/AnsiballZ_async_status.py
Feb 20 09:10:29 np0005625204.localdomain sudo[135786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:29 np0005625204.localdomain python3.9[135788]: ansible-ansible.legacy.async_status Invoked with jid=j558637067917.135620 mode=cleanup _async_dir=/root/.ansible_async
Feb 20 09:10:29 np0005625204.localdomain sudo[135786]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45411 DF PROTO=TCP SPT=47560 DPT=9105 SEQ=2513380928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A54280000000001030307) 
Feb 20 09:10:30 np0005625204.localdomain ansible-async_wrapper.py[135623]: Done in kid B.
Feb 20 09:10:30 np0005625204.localdomain sudo[135878]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prrnfynlcocvfdtqbmftcibmkaeaqsbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578630.4778092-830-267898728335266/AnsiballZ_stat.py
Feb 20 09:10:30 np0005625204.localdomain sudo[135878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:30 np0005625204.localdomain python3.9[135880]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:10:30 np0005625204.localdomain sudo[135878]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:31 np0005625204.localdomain sudo[135951]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syudtmiegubdcrgldwdflgbyaojbmkzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578630.4778092-830-267898728335266/AnsiballZ_copy.py
Feb 20 09:10:31 np0005625204.localdomain sudo[135951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:31 np0005625204.localdomain python3.9[135953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578630.4778092-830-267898728335266/.source.returncode _original_basename=.05itn0s6 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:31 np0005625204.localdomain sudo[135951]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:32 np0005625204.localdomain sudo[136043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qifznpysgkbxuyyyhauocmrtnjftxtjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578631.7530682-878-121067531517744/AnsiballZ_stat.py
Feb 20 09:10:32 np0005625204.localdomain sudo[136043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:32 np0005625204.localdomain python3.9[136045]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:10:32 np0005625204.localdomain sudo[136043]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:32 np0005625204.localdomain sudo[136116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkffuwcmjdafpaqlsxpgburwbrtfokka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578631.7530682-878-121067531517744/AnsiballZ_copy.py
Feb 20 09:10:32 np0005625204.localdomain sudo[136116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:32 np0005625204.localdomain python3.9[136118]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578631.7530682-878-121067531517744/.source.cfg _original_basename=.k9psz1h3 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:10:32 np0005625204.localdomain sudo[136116]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3309 DF PROTO=TCP SPT=40502 DPT=9882 SEQ=3880154289 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A61680000000001030307) 
Feb 20 09:10:33 np0005625204.localdomain sudo[136208]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvluehilqejvaeqfaaosuzsojhfsmtpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578633.0681314-924-104540694144569/AnsiballZ_systemd.py
Feb 20 09:10:33 np0005625204.localdomain sudo[136208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:33 np0005625204.localdomain python3.9[136210]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:10:33 np0005625204.localdomain systemd[1]: Reloading Network Manager...
Feb 20 09:10:33 np0005625204.localdomain NetworkManager[5988]: <info>  [1771578633.6925] audit: op="reload" arg="0" pid=136214 uid=0 result="success"
Feb 20 09:10:33 np0005625204.localdomain NetworkManager[5988]: <info>  [1771578633.6934] config: signal: SIGHUP (no changes from disk)
Feb 20 09:10:33 np0005625204.localdomain systemd[1]: Reloaded Network Manager.
Feb 20 09:10:33 np0005625204.localdomain sudo[136208]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:34 np0005625204.localdomain sshd[130686]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:10:34 np0005625204.localdomain systemd-logind[759]: Session 41 logged out. Waiting for processes to exit.
Feb 20 09:10:34 np0005625204.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Feb 20 09:10:34 np0005625204.localdomain systemd[1]: session-41.scope: Consumed 36.534s CPU time.
Feb 20 09:10:34 np0005625204.localdomain systemd-logind[759]: Removed session 41.
Feb 20 09:10:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8425 DF PROTO=TCP SPT=44590 DPT=9101 SEQ=3992755531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A6E150000000001030307) 
Feb 20 09:10:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8427 DF PROTO=TCP SPT=44590 DPT=9101 SEQ=3992755531 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A7A280000000001030307) 
Feb 20 09:10:39 np0005625204.localdomain sshd[136229]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:10:39 np0005625204.localdomain sshd[136229]: Accepted publickey for zuul from 192.168.122.30 port 36244 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:10:39 np0005625204.localdomain systemd-logind[759]: New session 42 of user zuul.
Feb 20 09:10:39 np0005625204.localdomain systemd[1]: Started Session 42 of User zuul.
Feb 20 09:10:39 np0005625204.localdomain sshd[136229]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:10:40 np0005625204.localdomain python3.9[136322]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:10:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45413 DF PROTO=TCP SPT=47560 DPT=9105 SEQ=2513380928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A83680000000001030307) 
Feb 20 09:10:43 np0005625204.localdomain python3.9[136416]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:10:44 np0005625204.localdomain python3.9[136569]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:10:45 np0005625204.localdomain sshd[136229]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:10:45 np0005625204.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Feb 20 09:10:45 np0005625204.localdomain systemd[1]: session-42.scope: Consumed 2.069s CPU time.
Feb 20 09:10:45 np0005625204.localdomain systemd-logind[759]: Session 42 logged out. Waiting for processes to exit.
Feb 20 09:10:45 np0005625204.localdomain systemd-logind[759]: Removed session 42.
Feb 20 09:10:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33727 DF PROTO=TCP SPT=52134 DPT=9102 SEQ=3206208422 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A93690000000001030307) 
Feb 20 09:10:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44480 DF PROTO=TCP SPT=37062 DPT=9882 SEQ=1333836637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598A9A9A0000000001030307) 
Feb 20 09:10:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44482 DF PROTO=TCP SPT=37062 DPT=9882 SEQ=1333836637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AA6A80000000001030307) 
Feb 20 09:10:50 np0005625204.localdomain sshd[136585]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:10:51 np0005625204.localdomain sshd[136587]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:10:51 np0005625204.localdomain sshd[136587]: Accepted publickey for zuul from 192.168.122.30 port 45484 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:10:51 np0005625204.localdomain systemd-logind[759]: New session 43 of user zuul.
Feb 20 09:10:51 np0005625204.localdomain systemd[1]: Started Session 43 of User zuul.
Feb 20 09:10:51 np0005625204.localdomain sshd[136587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:10:51 np0005625204.localdomain sshd[136585]: Invalid user solana from 45.148.10.240 port 40998
Feb 20 09:10:51 np0005625204.localdomain sshd[136585]: Connection closed by invalid user solana 45.148.10.240 port 40998 [preauth]
Feb 20 09:10:52 np0005625204.localdomain python3.9[136680]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:10:53 np0005625204.localdomain python3.9[136774]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:10:54 np0005625204.localdomain sudo[136868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzpyjprigvxgrndmfsbuvxuqkgvraxao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578653.8193045-77-166227512538682/AnsiballZ_setup.py
Feb 20 09:10:54 np0005625204.localdomain sudo[136868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:54 np0005625204.localdomain python3.9[136870]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:10:54 np0005625204.localdomain sudo[136868]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44483 DF PROTO=TCP SPT=37062 DPT=9882 SEQ=1333836637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AB6680000000001030307) 
Feb 20 09:10:55 np0005625204.localdomain sudo[136922]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssatpbiajkdyefmgapfkynrvzzurkifi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578653.8193045-77-166227512538682/AnsiballZ_dnf.py
Feb 20 09:10:55 np0005625204.localdomain sudo[136922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:55 np0005625204.localdomain python3.9[136924]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:10:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59712 DF PROTO=TCP SPT=49296 DPT=9105 SEQ=2544659311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AC1680000000001030307) 
Feb 20 09:10:58 np0005625204.localdomain sudo[136922]: pam_unix(sudo:session): session closed for user root
Feb 20 09:10:59 np0005625204.localdomain sudo[137016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvckjszlvbmufmqcvsoljkjecjxgrluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578658.9574218-113-69155884867297/AnsiballZ_setup.py
Feb 20 09:10:59 np0005625204.localdomain sudo[137016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:10:59 np0005625204.localdomain python3.9[137018]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:10:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59713 DF PROTO=TCP SPT=49296 DPT=9105 SEQ=2544659311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AC9690000000001030307) 
Feb 20 09:10:59 np0005625204.localdomain sudo[137016]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:00 np0005625204.localdomain sudo[137171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbrgedntuxtdatmrmjymppprrhyzbwyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578660.4100206-146-117063538773685/AnsiballZ_file.py
Feb 20 09:11:00 np0005625204.localdomain sudo[137171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:00 np0005625204.localdomain python3.9[137173]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:00 np0005625204.localdomain sudo[137171]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:01 np0005625204.localdomain sudo[137263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqidqwhidruwsxidiculbzvauptzvzgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578661.0939696-170-236589150354992/AnsiballZ_command.py
Feb 20 09:11:01 np0005625204.localdomain sudo[137263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:01 np0005625204.localdomain python3.9[137265]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:11:01 np0005625204.localdomain sudo[137263]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:02 np0005625204.localdomain sudo[137367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmnrnxgbirhhuwqoaxghmvoxwtbusvgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578662.0452816-194-246411448431649/AnsiballZ_stat.py
Feb 20 09:11:02 np0005625204.localdomain sudo[137367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:02 np0005625204.localdomain python3.9[137369]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:02 np0005625204.localdomain sudo[137367]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51232 DF PROTO=TCP SPT=41524 DPT=9100 SEQ=138677609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AD5680000000001030307) 
Feb 20 09:11:02 np0005625204.localdomain sudo[137415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmgurnutdiexagccucjtzcdajpxtwike ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578662.0452816-194-246411448431649/AnsiballZ_file.py
Feb 20 09:11:02 np0005625204.localdomain sudo[137415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:03 np0005625204.localdomain python3.9[137417]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:03 np0005625204.localdomain sudo[137415]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:03 np0005625204.localdomain sudo[137507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztyuubrcfagvisbqmywtzwhsykgliuzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578663.3826957-230-14596400509870/AnsiballZ_stat.py
Feb 20 09:11:03 np0005625204.localdomain sudo[137507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:03 np0005625204.localdomain python3.9[137509]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:03 np0005625204.localdomain sudo[137507]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:04 np0005625204.localdomain sudo[137555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfteikfnmozzwypgkxthwiqxxmzsobvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578663.3826957-230-14596400509870/AnsiballZ_file.py
Feb 20 09:11:04 np0005625204.localdomain sudo[137555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:04 np0005625204.localdomain python3.9[137557]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:04 np0005625204.localdomain sudo[137555]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:05 np0005625204.localdomain sudo[137647]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wruvrgcwzgzsvajtkoiufmjvsdomwmsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578664.8896134-269-225744652967795/AnsiballZ_ini_file.py
Feb 20 09:11:05 np0005625204.localdomain sudo[137647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:05 np0005625204.localdomain python3.9[137649]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:05 np0005625204.localdomain sudo[137647]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:06 np0005625204.localdomain sudo[137739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snqcvwrnixcbjedgrnzcnmenupuaahtp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578665.8992205-269-52444307910405/AnsiballZ_ini_file.py
Feb 20 09:11:06 np0005625204.localdomain sudo[137739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43856 DF PROTO=TCP SPT=40846 DPT=9101 SEQ=1695473391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AE3450000000001030307) 
Feb 20 09:11:06 np0005625204.localdomain python3.9[137741]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:06 np0005625204.localdomain sudo[137739]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:06 np0005625204.localdomain sudo[137831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrelzqggrhzqbiynduiheijygosdyudy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578666.4923923-269-45400553850786/AnsiballZ_ini_file.py
Feb 20 09:11:06 np0005625204.localdomain sudo[137831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:06 np0005625204.localdomain python3.9[137833]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:06 np0005625204.localdomain sudo[137831]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:07 np0005625204.localdomain sudo[137923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwfuzytopavljfcovgpclgbzjsbexpat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578667.1062505-269-144778118752448/AnsiballZ_ini_file.py
Feb 20 09:11:07 np0005625204.localdomain sudo[137923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:07 np0005625204.localdomain python3.9[137925]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:07 np0005625204.localdomain sudo[137923]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:08 np0005625204.localdomain sudo[138015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lspqvuijjtcbdselpveyveafyzwbywki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578667.8845718-362-70755621361045/AnsiballZ_dnf.py
Feb 20 09:11:08 np0005625204.localdomain sudo[138015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:08 np0005625204.localdomain python3.9[138017]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:11:08 np0005625204.localdomain sshd[138020]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:11:09 np0005625204.localdomain sshd[138020]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:11:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43858 DF PROTO=TCP SPT=40846 DPT=9101 SEQ=1695473391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AEF680000000001030307) 
Feb 20 09:11:11 np0005625204.localdomain sudo[138015]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59715 DF PROTO=TCP SPT=49296 DPT=9105 SEQ=2544659311 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598AF9680000000001030307) 
Feb 20 09:11:12 np0005625204.localdomain sudo[138111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joibxzrfmpfzpntfzhzyxxwxrwkpbxib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578672.2326589-395-102488946893273/AnsiballZ_setup.py
Feb 20 09:11:12 np0005625204.localdomain sudo[138111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:12 np0005625204.localdomain python3.9[138113]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:11:12 np0005625204.localdomain sudo[138111]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:13 np0005625204.localdomain sudo[138205]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybvxqjimqzgvuthdqnrvdwtvtybriqjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578673.021562-419-238196516648433/AnsiballZ_stat.py
Feb 20 09:11:13 np0005625204.localdomain sudo[138205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:13 np0005625204.localdomain python3.9[138207]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:11:13 np0005625204.localdomain sudo[138205]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:14 np0005625204.localdomain sudo[138297]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfjodnixmtwpzxfnkzvkoookhhlrxtwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578673.7083182-446-159367510739236/AnsiballZ_stat.py
Feb 20 09:11:14 np0005625204.localdomain sudo[138297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:14 np0005625204.localdomain python3.9[138299]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:11:14 np0005625204.localdomain sudo[138297]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:14 np0005625204.localdomain sudo[138389]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aetmoinigmrltqvjbmevgeznhhlviceg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578674.600125-477-24872405135185/AnsiballZ_command.py
Feb 20 09:11:14 np0005625204.localdomain sudo[138389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:15 np0005625204.localdomain python3.9[138391]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:11:15 np0005625204.localdomain sudo[138389]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:15 np0005625204.localdomain sudo[138482]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yivyfffavuyjzctcwpberrydeouxnduo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578675.3606312-506-66207154774005/AnsiballZ_service_facts.py
Feb 20 09:11:15 np0005625204.localdomain sudo[138482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:16 np0005625204.localdomain python3.9[138484]: ansible-service_facts Invoked
Feb 20 09:11:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46028 DF PROTO=TCP SPT=47372 DPT=9100 SEQ=2546420612 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B09680000000001030307) 
Feb 20 09:11:17 np0005625204.localdomain network[138501]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:11:17 np0005625204.localdomain network[138502]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:11:17 np0005625204.localdomain network[138503]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:11:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45950 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B0FCA0000000001030307) 
Feb 20 09:11:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:11:20 np0005625204.localdomain sudo[138482]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45952 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B1BE90000000001030307) 
Feb 20 09:11:21 np0005625204.localdomain sudo[138693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:11:21 np0005625204.localdomain sudo[138693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:11:21 np0005625204.localdomain sudo[138693]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:21 np0005625204.localdomain sudo[138730]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkrmpxavyghhwzuvevxvzawytffxhidg ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1771578681.3181312-552-128012586820146/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1771578681.3181312-552-128012586820146/args
Feb 20 09:11:21 np0005625204.localdomain sudo[138730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:21 np0005625204.localdomain sudo[138731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:11:21 np0005625204.localdomain sudo[138731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:11:21 np0005625204.localdomain sudo[138730]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:22 np0005625204.localdomain sudo[138876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icjztodulqmakjrzoyjxmaooijazilvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578682.0511093-585-145688619971834/AnsiballZ_dnf.py
Feb 20 09:11:22 np0005625204.localdomain sudo[138876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:22 np0005625204.localdomain sudo[138731]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:22 np0005625204.localdomain python3.9[138887]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:11:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45953 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B2BA90000000001030307) 
Feb 20 09:11:25 np0005625204.localdomain sudo[138876]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:25 np0005625204.localdomain sudo[138896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:11:25 np0005625204.localdomain sudo[138896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:11:25 np0005625204.localdomain sudo[138896]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:26 np0005625204.localdomain sudo[138994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czhjqteyljafnxdazssmqsqprxgkzmdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578686.331184-623-65769196660731/AnsiballZ_package_facts.py
Feb 20 09:11:26 np0005625204.localdomain sudo[138994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:27 np0005625204.localdomain python3.9[138996]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 20 09:11:27 np0005625204.localdomain sudo[138994]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34525 DF PROTO=TCP SPT=51776 DPT=9105 SEQ=3904820507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B36680000000001030307) 
Feb 20 09:11:28 np0005625204.localdomain sudo[139086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sdnkqdhxxqizcmsqftqayziwlknjhxur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578688.2537355-654-205181155832597/AnsiballZ_stat.py
Feb 20 09:11:28 np0005625204.localdomain sudo[139086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:28 np0005625204.localdomain python3.9[139088]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:28 np0005625204.localdomain sudo[139086]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:29 np0005625204.localdomain sudo[139161]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkgywpvdvhhemcqizkbxgaisbqdqhupn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578688.2537355-654-205181155832597/AnsiballZ_copy.py
Feb 20 09:11:29 np0005625204.localdomain sudo[139161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:29 np0005625204.localdomain python3.9[139163]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578688.2537355-654-205181155832597/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:29 np0005625204.localdomain sudo[139161]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34526 DF PROTO=TCP SPT=51776 DPT=9105 SEQ=3904820507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B3E680000000001030307) 
Feb 20 09:11:30 np0005625204.localdomain sudo[139255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckasgmgerkrkdbzzvyofyudeaonhowzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578689.7892437-701-261540160807018/AnsiballZ_stat.py
Feb 20 09:11:30 np0005625204.localdomain sudo[139255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:30 np0005625204.localdomain python3.9[139257]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:30 np0005625204.localdomain sudo[139255]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:30 np0005625204.localdomain sudo[139330]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aenduirrthhzfkjvfjhctbvjllhraddl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578689.7892437-701-261540160807018/AnsiballZ_copy.py
Feb 20 09:11:30 np0005625204.localdomain sudo[139330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:30 np0005625204.localdomain sshd[139333]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:11:30 np0005625204.localdomain python3.9[139332]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578689.7892437-701-261540160807018/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:30 np0005625204.localdomain sudo[139330]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:31 np0005625204.localdomain sshd[139333]: Invalid user cma from 96.78.175.36 port 51088
Feb 20 09:11:31 np0005625204.localdomain sshd[139333]: Received disconnect from 96.78.175.36 port 51088:11: Bye Bye [preauth]
Feb 20 09:11:31 np0005625204.localdomain sshd[139333]: Disconnected from invalid user cma 96.78.175.36 port 51088 [preauth]
Feb 20 09:11:32 np0005625204.localdomain sudo[139426]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlosewopaszamolhbjbgbitmyvcszgjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578691.7969284-762-82750221265468/AnsiballZ_lineinfile.py
Feb 20 09:11:32 np0005625204.localdomain sudo[139426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:32 np0005625204.localdomain python3.9[139428]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:32 np0005625204.localdomain sudo[139426]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45954 DF PROTO=TCP SPT=56276 DPT=9882 SEQ=2520786034 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B4B680000000001030307) 
Feb 20 09:11:33 np0005625204.localdomain sudo[139520]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvpbxghncfcdwlyhktqtkzpdspzimcyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578693.479727-809-42815479860394/AnsiballZ_setup.py
Feb 20 09:11:33 np0005625204.localdomain sudo[139520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:34 np0005625204.localdomain python3.9[139522]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:11:34 np0005625204.localdomain sudo[139520]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:34 np0005625204.localdomain sudo[139574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sknzgrkhzeqridvkdwxyxbxuhewlncjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578693.479727-809-42815479860394/AnsiballZ_systemd.py
Feb 20 09:11:34 np0005625204.localdomain sudo[139574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:35 np0005625204.localdomain python3.9[139576]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:11:35 np0005625204.localdomain sudo[139574]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3021 DF PROTO=TCP SPT=51478 DPT=9101 SEQ=329870299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B58740000000001030307) 
Feb 20 09:11:37 np0005625204.localdomain sudo[139668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kriiutwnplikuaaecqlnxiqxevtflooq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578697.0344787-856-13675348735853/AnsiballZ_setup.py
Feb 20 09:11:37 np0005625204.localdomain sudo[139668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:37 np0005625204.localdomain python3.9[139670]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:11:37 np0005625204.localdomain sudo[139668]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:38 np0005625204.localdomain sudo[139722]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysduiqjidiklqmcpgsgyfwljromwiirr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578697.0344787-856-13675348735853/AnsiballZ_systemd.py
Feb 20 09:11:38 np0005625204.localdomain sudo[139722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:38 np0005625204.localdomain python3.9[139724]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:11:38 np0005625204.localdomain chronyd[26351]: chronyd exiting
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: Stopping NTP client/server...
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: Stopped NTP client/server.
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: Starting NTP client/server...
Feb 20 09:11:38 np0005625204.localdomain chronyd[139732]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 20 09:11:38 np0005625204.localdomain chronyd[139732]: Frequency -30.463 +/- 0.395 ppm read from /var/lib/chrony/drift
Feb 20 09:11:38 np0005625204.localdomain chronyd[139732]: Loaded seccomp filter (level 2)
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: Started NTP client/server.
Feb 20 09:11:38 np0005625204.localdomain sudo[139722]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:38 np0005625204.localdomain sshd[136587]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Feb 20 09:11:38 np0005625204.localdomain systemd[1]: session-43.scope: Consumed 28.846s CPU time.
Feb 20 09:11:38 np0005625204.localdomain systemd-logind[759]: Session 43 logged out. Waiting for processes to exit.
Feb 20 09:11:38 np0005625204.localdomain systemd-logind[759]: Removed session 43.
Feb 20 09:11:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3023 DF PROTO=TCP SPT=51478 DPT=9101 SEQ=329870299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B64680000000001030307) 
Feb 20 09:11:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34528 DF PROTO=TCP SPT=51776 DPT=9105 SEQ=3904820507 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B6F680000000001030307) 
Feb 20 09:11:44 np0005625204.localdomain sshd[139748]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:11:44 np0005625204.localdomain sshd[139748]: Accepted publickey for zuul from 192.168.122.30 port 39504 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:11:44 np0005625204.localdomain systemd-logind[759]: New session 44 of user zuul.
Feb 20 09:11:44 np0005625204.localdomain systemd[1]: Started Session 44 of User zuul.
Feb 20 09:11:44 np0005625204.localdomain sshd[139748]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:11:45 np0005625204.localdomain python3.9[139841]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:11:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63451 DF PROTO=TCP SPT=35400 DPT=9100 SEQ=1136106703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B7F680000000001030307) 
Feb 20 09:11:46 np0005625204.localdomain sudo[139935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlxxfsumdhsadnpqcmxnxjndbfkkwgih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578706.1647012-56-138192224152460/AnsiballZ_file.py
Feb 20 09:11:46 np0005625204.localdomain sudo[139935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:46 np0005625204.localdomain python3.9[139937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:46 np0005625204.localdomain sudo[139935]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:47 np0005625204.localdomain sudo[140040]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-guyyqwigetxzbuvmhtccnszzeexesqph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578706.934987-80-119052987509498/AnsiballZ_stat.py
Feb 20 09:11:47 np0005625204.localdomain sudo[140040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:47 np0005625204.localdomain python3.9[140042]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:47 np0005625204.localdomain sudo[140040]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22224 DF PROTO=TCP SPT=46684 DPT=9882 SEQ=2265295752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B84FA0000000001030307) 
Feb 20 09:11:47 np0005625204.localdomain sudo[140088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbpiuljpkkzzstkmkahyeqxbygfptjxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578706.934987-80-119052987509498/AnsiballZ_file.py
Feb 20 09:11:47 np0005625204.localdomain sudo[140088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:48 np0005625204.localdomain python3.9[140090]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.fum15ngs recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:48 np0005625204.localdomain sudo[140088]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:48 np0005625204.localdomain sudo[140180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otvlnfhhblbafvjxfxvmntvjtbporyjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578708.4834752-141-148363443541352/AnsiballZ_stat.py
Feb 20 09:11:48 np0005625204.localdomain sudo[140180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:48 np0005625204.localdomain python3.9[140182]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:48 np0005625204.localdomain sudo[140180]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:49 np0005625204.localdomain sudo[140255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnjvphxqqfqgszlnmelkkknrvsjlkrxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578708.4834752-141-148363443541352/AnsiballZ_copy.py
Feb 20 09:11:49 np0005625204.localdomain sudo[140255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:49 np0005625204.localdomain python3.9[140257]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578708.4834752-141-148363443541352/.source _original_basename=.7mi9d_a3 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:49 np0005625204.localdomain sudo[140255]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:50 np0005625204.localdomain sudo[140347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwwfouqertcolbdlevgglqorhfulmloo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578709.8857715-188-260273962584713/AnsiballZ_file.py
Feb 20 09:11:50 np0005625204.localdomain sudo[140347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:50 np0005625204.localdomain python3.9[140349]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:50 np0005625204.localdomain sudo[140347]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22226 DF PROTO=TCP SPT=46684 DPT=9882 SEQ=2265295752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598B90E80000000001030307) 
Feb 20 09:11:50 np0005625204.localdomain sudo[140439]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciorsfnbteognoprejttidioawvflnag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578710.5190637-212-13051073380919/AnsiballZ_stat.py
Feb 20 09:11:50 np0005625204.localdomain sudo[140439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:51 np0005625204.localdomain python3.9[140441]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:51 np0005625204.localdomain sudo[140439]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:51 np0005625204.localdomain sudo[140512]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilwuhsfgyzrvvahjvuqpuahiwosvpktc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578710.5190637-212-13051073380919/AnsiballZ_copy.py
Feb 20 09:11:51 np0005625204.localdomain sudo[140512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:51 np0005625204.localdomain python3.9[140514]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578710.5190637-212-13051073380919/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:51 np0005625204.localdomain sudo[140512]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:52 np0005625204.localdomain sudo[140604]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qznyxqtkgewxyrnencagaxqhazdlgzdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578711.7641823-212-183008077849754/AnsiballZ_stat.py
Feb 20 09:11:52 np0005625204.localdomain sudo[140604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:52 np0005625204.localdomain python3.9[140606]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:52 np0005625204.localdomain sudo[140604]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:52 np0005625204.localdomain sudo[140677]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agrtklvtfnufriwimrraxzumasllhnyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578711.7641823-212-183008077849754/AnsiballZ_copy.py
Feb 20 09:11:52 np0005625204.localdomain sudo[140677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:52 np0005625204.localdomain python3.9[140679]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578711.7641823-212-183008077849754/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:11:52 np0005625204.localdomain sudo[140677]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:53 np0005625204.localdomain sudo[140769]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhvbogqebzueeuumemuwzhnjnmjwyrvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578712.906452-299-258374791307052/AnsiballZ_file.py
Feb 20 09:11:53 np0005625204.localdomain sudo[140769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:53 np0005625204.localdomain python3.9[140771]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:53 np0005625204.localdomain sudo[140769]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:53 np0005625204.localdomain sudo[140861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-endmnyuvyrlgbxshuabihtjfcascflgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578713.5401254-323-55671354908258/AnsiballZ_stat.py
Feb 20 09:11:53 np0005625204.localdomain sudo[140861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:54 np0005625204.localdomain python3.9[140863]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:54 np0005625204.localdomain sudo[140861]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:54 np0005625204.localdomain sshd[140885]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:11:54 np0005625204.localdomain sshd[140885]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:11:54 np0005625204.localdomain sudo[140936]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mywktakalzvcjhqrkmanyprkhbbeitvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578713.5401254-323-55671354908258/AnsiballZ_copy.py
Feb 20 09:11:54 np0005625204.localdomain sudo[140936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:54 np0005625204.localdomain python3.9[140938]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578713.5401254-323-55671354908258/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:54 np0005625204.localdomain sudo[140936]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22227 DF PROTO=TCP SPT=46684 DPT=9882 SEQ=2265295752 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BA0A80000000001030307) 
Feb 20 09:11:55 np0005625204.localdomain sudo[141028]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enpqbfgrilkioiykrwaldhfbmxglrbgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578714.7940207-369-253669147733664/AnsiballZ_stat.py
Feb 20 09:11:55 np0005625204.localdomain sudo[141028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:55 np0005625204.localdomain python3.9[141030]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:55 np0005625204.localdomain sudo[141028]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:55 np0005625204.localdomain sudo[141101]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcbiwdjdntxemzdlxhlqnwvqhgyaijvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578714.7940207-369-253669147733664/AnsiballZ_copy.py
Feb 20 09:11:55 np0005625204.localdomain sudo[141101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:55 np0005625204.localdomain python3.9[141103]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578714.7940207-369-253669147733664/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:55 np0005625204.localdomain sudo[141101]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:56 np0005625204.localdomain sudo[141193]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhmazfztdhivunotscrbfvndzwjlucrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578716.1012244-413-252184522846992/AnsiballZ_systemd.py
Feb 20 09:11:56 np0005625204.localdomain sudo[141193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:56 np0005625204.localdomain python3.9[141195]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:11:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:11:57 np0005625204.localdomain systemd-rc-local-generator[141223]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:11:57 np0005625204.localdomain systemd-sysv-generator[141227]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:11:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:11:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:11:57 np0005625204.localdomain systemd-rc-local-generator[141256]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:11:57 np0005625204.localdomain systemd-sysv-generator[141259]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:11:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:11:57 np0005625204.localdomain systemd[1]: Starting EDPM Container Shutdown...
Feb 20 09:11:57 np0005625204.localdomain systemd[1]: Finished EDPM Container Shutdown.
Feb 20 09:11:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40835 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BABA80000000001030307) 
Feb 20 09:11:57 np0005625204.localdomain sudo[141193]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:58 np0005625204.localdomain sudo[141362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwkqaukkjrcjnkgyuaemcdlzjbhciwwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578717.8041644-437-212172924880594/AnsiballZ_stat.py
Feb 20 09:11:58 np0005625204.localdomain sudo[141362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:58 np0005625204.localdomain python3.9[141364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:58 np0005625204.localdomain sudo[141362]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:58 np0005625204.localdomain sudo[141435]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rissxifmwqllyodccpjvvnnfljclxocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578717.8041644-437-212172924880594/AnsiballZ_copy.py
Feb 20 09:11:58 np0005625204.localdomain sudo[141435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:58 np0005625204.localdomain python3.9[141437]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578717.8041644-437-212172924880594/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:11:58 np0005625204.localdomain sudo[141435]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:59 np0005625204.localdomain sudo[141527]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxcirhpkhbhnfdismqgxgxdxjqmyeujj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578719.0774527-483-209021327415134/AnsiballZ_stat.py
Feb 20 09:11:59 np0005625204.localdomain sudo[141527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:11:59 np0005625204.localdomain python3.9[141529]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:11:59 np0005625204.localdomain sudo[141527]: pam_unix(sudo:session): session closed for user root
Feb 20 09:11:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40836 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BB3A90000000001030307) 
Feb 20 09:11:59 np0005625204.localdomain sudo[141600]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsgzkcexgusqltuxmviwcnjjoxmkhdbg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578719.0774527-483-209021327415134/AnsiballZ_copy.py
Feb 20 09:11:59 np0005625204.localdomain sudo[141600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:00 np0005625204.localdomain python3.9[141602]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578719.0774527-483-209021327415134/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:00 np0005625204.localdomain sudo[141600]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:00 np0005625204.localdomain sudo[141692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urfwpldvodeekyfvslwllqzwlukuxgwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578720.2591047-527-39341679311357/AnsiballZ_systemd.py
Feb 20 09:12:00 np0005625204.localdomain sudo[141692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:00 np0005625204.localdomain python3.9[141694]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:12:00 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:12:00 np0005625204.localdomain systemd-rc-local-generator[141717]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:12:00 np0005625204.localdomain systemd-sysv-generator[141721]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:12:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:12:02 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 09:12:02 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 09:12:02 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 09:12:02 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 09:12:02 np0005625204.localdomain sudo[141692]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63452 DF PROTO=TCP SPT=35400 DPT=9100 SEQ=1136106703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BBF680000000001030307) 
Feb 20 09:12:02 np0005625204.localdomain python3.9[141826]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:12:02 np0005625204.localdomain network[141843]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:12:02 np0005625204.localdomain network[141844]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:12:03 np0005625204.localdomain network[141845]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:12:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:12:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24885 DF PROTO=TCP SPT=39744 DPT=9101 SEQ=1925110667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BCDA60000000001030307) 
Feb 20 09:12:07 np0005625204.localdomain sudo[142044]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-semvezbdviozkbrxvkmkixxvhnpceymi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578727.584969-606-127817048547378/AnsiballZ_stat.py
Feb 20 09:12:07 np0005625204.localdomain sudo[142044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:08 np0005625204.localdomain python3.9[142046]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:08 np0005625204.localdomain sudo[142044]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:08 np0005625204.localdomain sudo[142119]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuucpxvpuxnvhbzpgncjlxvlyxplttha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578727.584969-606-127817048547378/AnsiballZ_copy.py
Feb 20 09:12:08 np0005625204.localdomain sudo[142119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:08 np0005625204.localdomain python3.9[142121]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578727.584969-606-127817048547378/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:08 np0005625204.localdomain sudo[142119]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:09 np0005625204.localdomain sudo[142212]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gllfhrqhluepibzxoyynndoweupqaief ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578728.9344873-651-189922046110016/AnsiballZ_systemd.py
Feb 20 09:12:09 np0005625204.localdomain sudo[142212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24887 DF PROTO=TCP SPT=39744 DPT=9101 SEQ=1925110667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BD9A80000000001030307) 
Feb 20 09:12:09 np0005625204.localdomain python3.9[142214]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:12:09 np0005625204.localdomain systemd[1]: Reloading OpenSSH server daemon...
Feb 20 09:12:09 np0005625204.localdomain sshd[121278]: Received SIGHUP; restarting.
Feb 20 09:12:09 np0005625204.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Feb 20 09:12:09 np0005625204.localdomain sshd[121278]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:12:09 np0005625204.localdomain sshd[121278]: Server listening on 0.0.0.0 port 22.
Feb 20 09:12:09 np0005625204.localdomain sshd[121278]: Server listening on :: port 22.
Feb 20 09:12:09 np0005625204.localdomain sudo[142212]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:10 np0005625204.localdomain sudo[142308]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crjelxnkqnlcthhhdnsvlpvuzutdpsxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578729.791022-675-217102823307065/AnsiballZ_file.py
Feb 20 09:12:10 np0005625204.localdomain sudo[142308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:10 np0005625204.localdomain python3.9[142310]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:10 np0005625204.localdomain sudo[142308]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:10 np0005625204.localdomain sudo[142400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgvfifrnklvfhlptaxqljfuhkywfeaeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578730.4907484-700-142983149238182/AnsiballZ_stat.py
Feb 20 09:12:10 np0005625204.localdomain sudo[142400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:10 np0005625204.localdomain python3.9[142402]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:10 np0005625204.localdomain sudo[142400]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:11 np0005625204.localdomain sudo[142473]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljgpgewhgxneygkhjkznapeharfsdzzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578730.4907484-700-142983149238182/AnsiballZ_copy.py
Feb 20 09:12:11 np0005625204.localdomain sudo[142473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:11 np0005625204.localdomain python3.9[142475]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578730.4907484-700-142983149238182/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:11 np0005625204.localdomain sudo[142473]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40838 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BE3680000000001030307) 
Feb 20 09:12:12 np0005625204.localdomain sudo[142565]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjyjhwceoeljxbufjvwopdxgrrbnayjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578731.8464873-752-62813704957711/AnsiballZ_timezone.py
Feb 20 09:12:12 np0005625204.localdomain sudo[142565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:12 np0005625204.localdomain python3.9[142567]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 20 09:12:12 np0005625204.localdomain systemd[1]: Starting Time & Date Service...
Feb 20 09:12:12 np0005625204.localdomain systemd[1]: Started Time & Date Service.
Feb 20 09:12:12 np0005625204.localdomain sudo[142565]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:13 np0005625204.localdomain sudo[142661]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bszbwyifznlxgiekmymlgfqtrhbvlpkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578732.8705554-780-218017129754058/AnsiballZ_file.py
Feb 20 09:12:13 np0005625204.localdomain sudo[142661]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:13 np0005625204.localdomain python3.9[142663]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:13 np0005625204.localdomain sudo[142661]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:13 np0005625204.localdomain sudo[142753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nffqkifdcutugkqxvhijotvcsnrawsps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578733.5220916-804-205492178996494/AnsiballZ_stat.py
Feb 20 09:12:13 np0005625204.localdomain sudo[142753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:13 np0005625204.localdomain python3.9[142755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:13 np0005625204.localdomain sudo[142753]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:14 np0005625204.localdomain sudo[142826]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpadtqtrualrylwsmncmhvqdxnycnffj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578733.5220916-804-205492178996494/AnsiballZ_copy.py
Feb 20 09:12:14 np0005625204.localdomain sudo[142826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:14 np0005625204.localdomain python3.9[142828]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578733.5220916-804-205492178996494/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:14 np0005625204.localdomain sudo[142826]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:14 np0005625204.localdomain sudo[142918]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uclrwbrduvkqnwyysfsuajwcrnpfqjfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578734.6781616-849-28773571617355/AnsiballZ_stat.py
Feb 20 09:12:14 np0005625204.localdomain sudo[142918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:15 np0005625204.localdomain python3.9[142920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:15 np0005625204.localdomain sudo[142918]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:15 np0005625204.localdomain sudo[142991]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgygcojefokpdjjcscyfrwpwkhzvnvur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578734.6781616-849-28773571617355/AnsiballZ_copy.py
Feb 20 09:12:15 np0005625204.localdomain sudo[142991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:15 np0005625204.localdomain python3.9[142993]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578734.6781616-849-28773571617355/.source.yaml _original_basename=.gvis7vl3 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:15 np0005625204.localdomain sudo[142991]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8635 DF PROTO=TCP SPT=48194 DPT=9100 SEQ=3024423624 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BF3680000000001030307) 
Feb 20 09:12:16 np0005625204.localdomain sudo[143083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kabzzjkqbkcmzjqkimgpzezebwzrnqtv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578735.9233341-894-237230867035291/AnsiballZ_stat.py
Feb 20 09:12:16 np0005625204.localdomain sudo[143083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:16 np0005625204.localdomain python3.9[143085]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:16 np0005625204.localdomain sudo[143083]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:16 np0005625204.localdomain sudo[143158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-askvxanfjoumkyaoihrlxvivcsxwwnep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578735.9233341-894-237230867035291/AnsiballZ_copy.py
Feb 20 09:12:16 np0005625204.localdomain sudo[143158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:16 np0005625204.localdomain python3.9[143160]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578735.9233341-894-237230867035291/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:16 np0005625204.localdomain sudo[143158]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:17 np0005625204.localdomain sudo[143250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdlcdxgjqxvkialdthusswvpexpnehbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578737.1894238-939-36878444538427/AnsiballZ_command.py
Feb 20 09:12:17 np0005625204.localdomain sudo[143250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10460 DF PROTO=TCP SPT=42346 DPT=9882 SEQ=3856575038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598BFA2A0000000001030307) 
Feb 20 09:12:17 np0005625204.localdomain python3.9[143252]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:12:17 np0005625204.localdomain sudo[143250]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:18 np0005625204.localdomain sudo[143343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgrqstlzwyhnvqkhqqpsipmwwjvlyuut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578738.0443308-963-37337197113186/AnsiballZ_command.py
Feb 20 09:12:18 np0005625204.localdomain sudo[143343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:18 np0005625204.localdomain python3.9[143345]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:12:18 np0005625204.localdomain sudo[143343]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:19 np0005625204.localdomain sudo[143436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btcokpjfkkmegddtjhruscfxpzmgulin ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771578738.753177-986-134122520453655/AnsiballZ_edpm_nftables_from_files.py
Feb 20 09:12:19 np0005625204.localdomain sudo[143436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:19 np0005625204.localdomain python3[143438]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 20 09:12:19 np0005625204.localdomain sudo[143436]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:20 np0005625204.localdomain sudo[143528]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjpwwpimtyhqneyotibnrxsvemcbfjxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578739.996921-1011-264038146915696/AnsiballZ_stat.py
Feb 20 09:12:20 np0005625204.localdomain sudo[143528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:20 np0005625204.localdomain python3.9[143530]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:20 np0005625204.localdomain sudo[143528]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:20 np0005625204.localdomain sudo[143601]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxqeqgiqkjcsnbjonseqkklnvdaffeig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578739.996921-1011-264038146915696/AnsiballZ_copy.py
Feb 20 09:12:20 np0005625204.localdomain sudo[143601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:21 np0005625204.localdomain python3.9[143603]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578739.996921-1011-264038146915696/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:21 np0005625204.localdomain sudo[143601]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:21 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24889 DF PROTO=TCP SPT=39744 DPT=9101 SEQ=1925110667 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C09690000000001030307) 
Feb 20 09:12:21 np0005625204.localdomain sudo[143693]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihlbkfziddkxpwnbazzzpfyxmafxghfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578741.5345569-1056-280498703820595/AnsiballZ_stat.py
Feb 20 09:12:21 np0005625204.localdomain sudo[143693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:22 np0005625204.localdomain python3.9[143695]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:22 np0005625204.localdomain sudo[143693]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:22 np0005625204.localdomain sudo[143766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdlgwpyiytdgqzlbqirlculxlnkxjqof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578741.5345569-1056-280498703820595/AnsiballZ_copy.py
Feb 20 09:12:22 np0005625204.localdomain sudo[143766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:22 np0005625204.localdomain python3.9[143768]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578741.5345569-1056-280498703820595/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:22 np0005625204.localdomain sudo[143766]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:22 np0005625204.localdomain sudo[143858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtddlogfscdziivmuwwzoauuzwyahkka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578742.7351081-1101-165052859746299/AnsiballZ_stat.py
Feb 20 09:12:23 np0005625204.localdomain sudo[143858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:23 np0005625204.localdomain python3.9[143860]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:23 np0005625204.localdomain sudo[143858]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:23 np0005625204.localdomain sudo[143931]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hlotgyjemeoazaialafuwrrvyvmtqdku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578742.7351081-1101-165052859746299/AnsiballZ_copy.py
Feb 20 09:12:23 np0005625204.localdomain sudo[143931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:23 np0005625204.localdomain python3.9[143933]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578742.7351081-1101-165052859746299/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:23 np0005625204.localdomain sudo[143931]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:24 np0005625204.localdomain sudo[144023]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egapkzglhkpdcyxzhiyjuoufezharocq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578743.8979042-1145-60352927874603/AnsiballZ_stat.py
Feb 20 09:12:24 np0005625204.localdomain sudo[144023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:24 np0005625204.localdomain python3.9[144025]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:24 np0005625204.localdomain sudo[144023]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:24 np0005625204.localdomain sudo[144096]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poaqcsoyliswdrdurjsremrhpmpttdti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578743.8979042-1145-60352927874603/AnsiballZ_copy.py
Feb 20 09:12:24 np0005625204.localdomain sudo[144096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:24 np0005625204.localdomain python3.9[144098]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578743.8979042-1145-60352927874603/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:24 np0005625204.localdomain sudo[144096]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:25 np0005625204.localdomain sudo[144188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffixvukeyqfkbwaromtoznprdkzmvbnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578745.0928807-1191-2234488706410/AnsiballZ_stat.py
Feb 20 09:12:25 np0005625204.localdomain sudo[144188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:25 np0005625204.localdomain python3.9[144190]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:25 np0005625204.localdomain sudo[144188]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:25 np0005625204.localdomain sudo[144261]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymndddnqcqpkkjlikaaylrneqnyuntvc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578745.0928807-1191-2234488706410/AnsiballZ_copy.py
Feb 20 09:12:25 np0005625204.localdomain sudo[144261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:26 np0005625204.localdomain sudo[144263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:12:26 np0005625204.localdomain sudo[144263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:12:26 np0005625204.localdomain sudo[144263]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:26 np0005625204.localdomain sudo[144279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 09:12:26 np0005625204.localdomain sudo[144279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:12:26 np0005625204.localdomain python3.9[144264]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578745.0928807-1191-2234488706410/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:26 np0005625204.localdomain sudo[144261]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:26 np0005625204.localdomain sudo[144279]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:26 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8736 DF PROTO=TCP SPT=44256 DPT=9105 SEQ=1291847517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C1CD00000000001030307) 
Feb 20 09:12:26 np0005625204.localdomain sudo[144375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:12:26 np0005625204.localdomain sudo[144375]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:12:26 np0005625204.localdomain sudo[144375]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:26 np0005625204.localdomain sudo[144429]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqvwvzbnoyvvstseoxmcfhtyudncwwtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578746.4195716-1235-162522456330638/AnsiballZ_file.py
Feb 20 09:12:26 np0005625204.localdomain sudo[144429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:26 np0005625204.localdomain sudo[144412]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:12:26 np0005625204.localdomain sudo[144412]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:12:26 np0005625204.localdomain python3.9[144435]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:26 np0005625204.localdomain sudo[144429]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:27 np0005625204.localdomain sudo[144412]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:27 np0005625204.localdomain sudo[144557]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upylenjsuisazsevfyxpjgixmwwngapm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578747.117314-1260-226365364596066/AnsiballZ_command.py
Feb 20 09:12:27 np0005625204.localdomain sudo[144557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:27 np0005625204.localdomain python3.9[144559]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:12:27 np0005625204.localdomain sudo[144557]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:28 np0005625204.localdomain sudo[144652]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teguedfpptikvbmcqpufukbxkqftimot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578747.7470858-1284-68129423086890/AnsiballZ_blockinfile.py
Feb 20 09:12:28 np0005625204.localdomain sudo[144652]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:28 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40839 DF PROTO=TCP SPT=39812 DPT=9105 SEQ=4143178110 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C23680000000001030307) 
Feb 20 09:12:28 np0005625204.localdomain python3.9[144654]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:28 np0005625204.localdomain sudo[144652]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:28 np0005625204.localdomain sudo[144670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:12:28 np0005625204.localdomain sudo[144670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:12:28 np0005625204.localdomain sudo[144670]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:28 np0005625204.localdomain sudo[144760]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gppwxrzkpnunsurqgbhxrhlshykspklu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578748.6403682-1311-56336103071903/AnsiballZ_file.py
Feb 20 09:12:28 np0005625204.localdomain sudo[144760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:29 np0005625204.localdomain python3.9[144762]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:29 np0005625204.localdomain sudo[144760]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:29 np0005625204.localdomain sudo[144852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wujmovlrvkmkiybkwpacrrqrbbnlbfyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578749.1890354-1311-154185672775695/AnsiballZ_file.py
Feb 20 09:12:29 np0005625204.localdomain sudo[144852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:29 np0005625204.localdomain python3.9[144854]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:29 np0005625204.localdomain sudo[144852]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:30 np0005625204.localdomain sudo[144944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgzfuqfcwsvumtilvqfqdajvnkbyrecy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578749.9811351-1355-278668614384185/AnsiballZ_mount.py
Feb 20 09:12:30 np0005625204.localdomain sudo[144944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:30 np0005625204.localdomain python3.9[144946]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 20 09:12:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62174 DF PROTO=TCP SPT=37522 DPT=9102 SEQ=3832944715 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C2CB50000000001030307) 
Feb 20 09:12:30 np0005625204.localdomain sudo[144944]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:31 np0005625204.localdomain sudo[145037]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koywslwnrgmhkdlwcerbgovnpfimougw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578750.7541475-1355-219918658309179/AnsiballZ_mount.py
Feb 20 09:12:31 np0005625204.localdomain sudo[145037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:31 np0005625204.localdomain python3.9[145039]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 20 09:12:31 np0005625204.localdomain sudo[145037]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:31 np0005625204.localdomain sshd[139748]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:12:31 np0005625204.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Feb 20 09:12:31 np0005625204.localdomain systemd[1]: session-44.scope: Consumed 28.602s CPU time.
Feb 20 09:12:31 np0005625204.localdomain systemd-logind[759]: Session 44 logged out. Waiting for processes to exit.
Feb 20 09:12:31 np0005625204.localdomain systemd-logind[759]: Removed session 44.
Feb 20 09:12:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63453 DF PROTO=TCP SPT=35400 DPT=9100 SEQ=1136106703 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C3D680000000001030307) 
Feb 20 09:12:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14532 DF PROTO=TCP SPT=48870 DPT=9101 SEQ=3105837729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C42D50000000001030307) 
Feb 20 09:12:36 np0005625204.localdomain sshd[145055]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:12:37 np0005625204.localdomain sshd[145055]: Accepted publickey for zuul from 192.168.122.30 port 33218 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:12:37 np0005625204.localdomain systemd-logind[759]: New session 45 of user zuul.
Feb 20 09:12:37 np0005625204.localdomain systemd[1]: Started Session 45 of User zuul.
Feb 20 09:12:37 np0005625204.localdomain sshd[145055]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:12:37 np0005625204.localdomain sudo[145148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqegzfvcujisyynfnqpwwgevyyeuloef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578757.191052-23-109274115523959/AnsiballZ_tempfile.py
Feb 20 09:12:37 np0005625204.localdomain sudo[145148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:37 np0005625204.localdomain python3.9[145150]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 20 09:12:37 np0005625204.localdomain sudo[145148]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:38 np0005625204.localdomain sshd[145165]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:12:38 np0005625204.localdomain sshd[145165]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:12:39 np0005625204.localdomain sudo[145242]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gerditruqfrbljsyfhdxeavrpfpsdnln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578758.7023373-95-157003874077234/AnsiballZ_stat.py
Feb 20 09:12:39 np0005625204.localdomain sudo[145242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:39 np0005625204.localdomain python3.9[145244]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:12:39 np0005625204.localdomain sudo[145242]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:40 np0005625204.localdomain sudo[145336]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqtsflptaiyxrtmalhekwfqxzuvlgmkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578760.0676923-143-263458866154990/AnsiballZ_slurp.py
Feb 20 09:12:40 np0005625204.localdomain sudo[145336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:40 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3027 DF PROTO=TCP SPT=51478 DPT=9101 SEQ=329870299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C53680000000001030307) 
Feb 20 09:12:40 np0005625204.localdomain python3.9[145338]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Feb 20 09:12:40 np0005625204.localdomain sudo[145336]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:41 np0005625204.localdomain sudo[145428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbnyrjnzziwdfdmzqmhrcxxcylpeyabe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578761.4350631-192-223767579384344/AnsiballZ_stat.py
Feb 20 09:12:41 np0005625204.localdomain sudo[145428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:41 np0005625204.localdomain python3.9[145430]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.zwjwm229 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:12:41 np0005625204.localdomain sudo[145428]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:42 np0005625204.localdomain sudo[145503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhztniqbrfmbfvqwdgzgwkbtqtxszntq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578761.4350631-192-223767579384344/AnsiballZ_copy.py
Feb 20 09:12:42 np0005625204.localdomain sudo[145503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:42 np0005625204.localdomain python3.9[145505]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.zwjwm229 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578761.4350631-192-223767579384344/.source.zwjwm229 _original_basename=.waklz5h8 follow=False checksum=831757da1f03f9732785943fa2a05c0d9424aa2f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:42 np0005625204.localdomain sudo[145503]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:42 np0005625204.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 20 09:12:44 np0005625204.localdomain sudo[145597]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehsyfmvpitdejqbtbeqdcrennegxsctd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578763.8568454-281-202913782742301/AnsiballZ_setup.py
Feb 20 09:12:44 np0005625204.localdomain sudo[145597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:45 np0005625204.localdomain python3.9[145599]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:12:45 np0005625204.localdomain sudo[145597]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:45 np0005625204.localdomain sudo[145689]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqlifrbxwjbdruesihcrbdloavsubrke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578765.5783434-331-106144879152857/AnsiballZ_blockinfile.py
Feb 20 09:12:45 np0005625204.localdomain sudo[145689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:46 np0005625204.localdomain python3.9[145691]: ansible-ansible.builtin.blockinfile Invoked with block=np0005625201.localdomain,192.168.122.105,np0005625201* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCyGkX26ECIsvqnvJegedSF6KicDAAqjaifawEd//OuK9zdHIWqO3XmlEszZqWPsdQhPFkelfzXR+sy3gbPNv+yjT7phsw1sq7zHXeogQFlP5iOQZrf6hCnfXxVk2ckIXMT0UJVZ8FCTwsQi+HKkR/IEj08pR7EjrXGWxHkjv5wNj76spF3FJxtwycS4+KzY3UFy7gYWVn2jB0ha966YgjHMPhzQnT33W9myxGH33M1L5ZCGlfH19hLnqTUNMfzIfw3afxHkL5BFZbhthUPmIfLdLtKmZEkpSTBO/CrNA6CmMfY6xnT78hmwXytEQ+jeiRdKXdr9xQ2j6wVmPzckFKBsBYRe4DprKGt93fnKS9Z6A3Sv626DyZgDa8/NXbtAaBxtyix5Vdt872hYvCzYyB/OuSV6PR5DOq8z3fquOwgtka3rA6qL5gxhFJcO5TqtBM76DzOLd9OLM9bIO1yK9sCmbYynMojkXylzhDfcI8kytS5xs9FJEfwTElZRHkEIQE=
                                                            np0005625201.localdomain,192.168.122.105,np0005625201* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINiFV2XLGVf9PGXF0NE4rbupw+vH23sDv10vB3wGrrmN
                                                            np0005625201.localdomain,192.168.122.105,np0005625201* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM/mxytSzwSYcezRRSD4AjPi1j6Bxso/MLXC/NAewzvKThRznoUobc02vzGaO4FrwuZIZ/YHJyAHrQRbtdSPUTU=
                                                            np0005625202.localdomain,192.168.122.106,np0005625202* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDDr8sejencX7nSCX6AegGtTuiZL3yclu/L7ZVN4B6dKPdmHqVr33QJD40sEk28GHpx8BrkPU2Qj1de9H6mGtrlwhmJr7Pccg/YqzKoTCQD5rZQ4youU8H70As6YX5ZlXyulwI1SH70XjMm37x4ptKALFOjRnHg0WIXah/tAmzrY/orh+/eCcns7APVjN9B1o+MqP4r47WrWrGU/KxtsHc6dflWxZW7BWUCCNS0e3C4yWLRjy8Hhj7Qkpssv/UBcj+olVHadUUOYiaQZ5Y33MjxwIg8o1MuC7C1dNIn8eXOXXiA8jd/lJd9kImrCGUtkVqj8VQgsMh4vRYMD+0SNLYRDVwxdemOzJYgwQhgiWZ0G+cVhnTBpMmXyIws2OpOKU8R3HjTC3jz+BxvjwEvMDoQfpGgsHB9NCXnkQzs2F8EA8LpA823Ef1SMgPdDCaQzvN5oQPZkWAPMVHvq31xpN9q+KXg/bg0uDaIZXUxW2rGnem7pFS78rRUGL6MfSMn1zs=
                                                            np0005625202.localdomain,192.168.122.106,np0005625202* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHIvGY3AHSeC6TXoQUOT+qZPpfcpbcCaqWpewY2PaUdr
                                                            np0005625202.localdomain,192.168.122.106,np0005625202* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNhJMOoHTPuI+cufoglj5k5xopCSTjiletXnoJ15KnCBclkNCXy9DqMn/ZeknN3AqFVQZhJfknnRkCXvgtRg7lc=
                                                            np0005625200.localdomain,192.168.122.104,np0005625200* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDW88346W6zU6nxCpqapHtIr5nRG8Jn9LFit3r5klBfauCkmAGONb4X8IwKjo8MD9etebUVbo6aX9gBMBMSs7bSoHzsEQuMLpBDrweSbahQj+gqZ5TmQ/xvwbhws04z3/IJxapAk2xWu7khVGjvOPUE1CROkP+1LiGktQ6Xj1ar1TbLNud2Dq/R5ZalbpK0OT3+no3x0oAJT3W649tW4nmCWcNaxykPsLREsUlH2qVoceAzLEDCSde9/1TONc/URyB4acVqmEwJDHeX51bh31tpQwp/WSe0vKQ6eUw63Tmpn+dRI9xbnFhc6mgGAPcEw7cAUkM7oM6bYMSvVxYDmzMhuXUU/9i3mdMnDBkMyZ5Oed6ZSmFQIJe5k7cz3783d35ZXfl/HsYMqoZ3lmDgbeS59pQrI+BldKyv3sTnoCDahfcmzmiHssxqa7tT5KOuR444q7Nj6wJEIZMEEJEHtMlh1iSBRJZOEOaKjo7h+jV7KMe75aPRasvu9K1v0dqyG6U=
                                                            np0005625200.localdomain,192.168.122.104,np0005625200* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKZd4BJQ7FPHukFUlQ3fRSVsRqMpZA9FFzC98e6Nz+hC
                                                            np0005625200.localdomain,192.168.122.104,np0005625200* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJgelHBDBResuC/7QDQA12qTpLPW1xHX6eUvY/QfQ0s1DYziYEKuSHQhUQMzxPcUq9IVVPnxkoRvZdWPxsh2Cmk=
                                                            np0005625199.localdomain,192.168.122.103,np0005625199* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDrnsozeOPJKYg9sx2Tj6QOLRhujK5RVh5RZQ3sb0pk+DbWHQKqS1YvJUg2hV4WxbxPnNUCBtJ+RZ8lVm6RLM+hc3ffe2sOMOz5upO/hTlIpBSfJpQORkiNW+XIXdDVxgE418veFd2hASFmiCmKoFSKXsvnmFU9oTEpja1plcXSqCobFMVYKlhcRo66O0ySlGOR+o3Ar2yNJQjFErEGvZLoDEa/VlA6zreYmTaIsnlUDie0gbm5teTlsCcEYkvWcTzcfOEX2kXQRQbS5qlPtGg7c+KMv5e40rE+2QOigLmOOPVGwNYuLuhb/EHT0C8hK8otW4tiXxBlSZ5ONKY6YYQOpy7krNkWRxNXzK0LfXo2bt2apDaMzebPOvuBj1YyBiLpa6/aLvS/dtGolQNPDpFivPbP/mSpat1qTs0W3/2HyBovwWSGJDW8MMYxbZJ0Z6tnuOwdrPTdkhIibfW9wxgL7EHrDYrGx5CvA2vUM4KDKRntz/cCMGE/zKacSJ48nNk=
                                                            np0005625199.localdomain,192.168.122.103,np0005625199* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIENpQQgr9IVl8UWbQ9CANzH6ET+G2aHJkzVgu9ObE0o0
                                                            np0005625199.localdomain,192.168.122.103,np0005625199* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJUcn4Y73wlRXKxRegM8lRt5GQ//hAORn8IqrcrC5ZJyjHCZmp+wutQeuPqPsTK4OVK+uH/93l/3Av8AKvpXG3A=
                                                            np0005625203.localdomain,192.168.122.107,np0005625203* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCtf1NXQ3EGQGdpLLLxuODKBdTGwqsiHL2QZ6zcfpGAa7EhDIxuEcLboqOGjQO0FM3u+kl2gIgKF0UsY5Vjcv4mDCMp7A7srq7TVo5lE5cCppbbXr0/PH2L/naHU3W+W83aT5RE17XPJ0Acn3W51WFBoICCCc4jjWTGmkNEgurKBJmdr0n8NeIcUWZ7Abrs/N2xzNftEFIjAPwebxgEwgCx0hMbdjTFhKbB/V7CjKaCU/UjirWMW5aDQJQEfrCM9u4NHuGaWKzJgar4/shNHaRvkCDbVrRPTCyfNebE04J/R42X3yWmvww4TMZVpRROd/u6Pgg1P2tbPGfQ0XvS0rfY6W4/VnHcyRDqxILH5BoeCAbTuVFmR0hbQu9fNbNxTP+o+na9mHEbNxbhcREnkal8+M0l11YftCRkr4132JITxe7y93gN/dwxE3nJLHLXRuRskWc3GTDT2MVU2Sj64yizD9KOM3oiMBXdPbNbgZywu3hqQvpO00GVg6QRjEJoiFc=
                                                            np0005625203.localdomain,192.168.122.107,np0005625203* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPIEBJz4VBziYqCcr9UT9NnbvRxFLoAcnVJLavCpXqHm
                                                            np0005625203.localdomain,192.168.122.107,np0005625203* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9k0T2/IFyFrBAAoi3QqwBKC9bi/bemQO6MNZhrO12MSG3WZcjS1bhOFPw5LuM+f11BFCm5wNyBNY/QmALZTgE=
                                                            np0005625204.localdomain,192.168.122.108,np0005625204* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAo6exxFtNk/Y5qEGYenJyhnCsS7iZmCGsFaQtJElNSeTTX9a1P0P2EmjtHolRxnljCZ2X8HgWx/irhJvWLoS+dzF5l+KcyQy83+048h51mbnj7zV2uG9i8LkO0egs1uBBp5E+hauHMsuf0nIDFl45W86ZXuf+MfFEKCInhjB5gfE9tTjwmKwKhgO1DE7Vpx3OYy1FHkq0YDBCqQHuuhYPrLZPjfVv3vGOaHH/XCsxX3h8/ixsZbobD56dDBKF/8CFyC/guH8pNUhZHG0dEhz5BT8PcE2Q/M9pPttzmRQksfg9+q7lVy9eCoOVpzqfTgjE1cm5yISwuMZzaNxwjJKB54EWpfl5xxnkC14B+xdvowxpl1PcMNZ0q1fWofJF4TrJAwWCUYZf45aUV2yb5R8WavUT0pX32xmd4zFbXusoafiw2FcgnxoGz3N4ZgIxTPPmgUe13blr1SK44huXWPioaolFBo82xVVFHc+01vfLF3xvs86d6EpqpLH+yaCeUjE=
                                                            np0005625204.localdomain,192.168.122.108,np0005625204* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDTY+/nqIDkr9+7jl3LUu4apuQeFzQYkXiSihEezHlEw
                                                            np0005625204.localdomain,192.168.122.108,np0005625204* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPuq/q6JwPgXzS/TgJ6dhP0gZvq89Vk1r9Ou051lEnMdt+NHYUjJx2Tv1oS9A+wQXivor03/iqWU5nj5QHdvHx4=
                                                             create=True mode=0644 path=/tmp/ansible.zwjwm229 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:46 np0005625204.localdomain sudo[145689]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:47 np0005625204.localdomain sudo[145781]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odkcbkvxalylurapqmchcwxjdpsnfsgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578766.9066913-378-106008272345339/AnsiballZ_command.py
Feb 20 09:12:47 np0005625204.localdomain sudo[145781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:47 np0005625204.localdomain python3.9[145783]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.zwjwm229' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:12:47 np0005625204.localdomain sshd[145784]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:12:47 np0005625204.localdomain sudo[145781]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28396 DF PROTO=TCP SPT=45726 DPT=9882 SEQ=3482117758 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C6F5A0000000001030307) 
Feb 20 09:12:48 np0005625204.localdomain sudo[145877]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvymrgeicgbuzobfgnwbbpcqclxwccmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578768.260836-426-135691872977484/AnsiballZ_file.py
Feb 20 09:12:48 np0005625204.localdomain sudo[145877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:48 np0005625204.localdomain python3.9[145879]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.zwjwm229 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:12:48 np0005625204.localdomain sudo[145877]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:48 np0005625204.localdomain sshd[145784]: Invalid user n8n from 27.112.79.3 port 35044
Feb 20 09:12:49 np0005625204.localdomain sshd[145784]: Received disconnect from 27.112.79.3 port 35044:11: Bye Bye [preauth]
Feb 20 09:12:49 np0005625204.localdomain sshd[145784]: Disconnected from invalid user n8n 27.112.79.3 port 35044 [preauth]
Feb 20 09:12:49 np0005625204.localdomain sshd[145055]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:12:49 np0005625204.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Feb 20 09:12:49 np0005625204.localdomain systemd[1]: session-45.scope: Consumed 4.273s CPU time.
Feb 20 09:12:49 np0005625204.localdomain systemd-logind[759]: Session 45 logged out. Waiting for processes to exit.
Feb 20 09:12:49 np0005625204.localdomain systemd-logind[759]: Removed session 45.
Feb 20 09:12:55 np0005625204.localdomain sshd[145895]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:12:55 np0005625204.localdomain sshd[145895]: Accepted publickey for zuul from 192.168.122.30 port 39888 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:12:55 np0005625204.localdomain systemd-logind[759]: New session 46 of user zuul.
Feb 20 09:12:55 np0005625204.localdomain systemd[1]: Started Session 46 of User zuul.
Feb 20 09:12:55 np0005625204.localdomain sshd[145895]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:12:56 np0005625204.localdomain python3.9[145988]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:12:56 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17313 DF PROTO=TCP SPT=40530 DPT=9105 SEQ=2733664460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598C92000000000001030307) 
Feb 20 09:12:57 np0005625204.localdomain sudo[146082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paukuhgqjacogjzckscphiajlatjkomz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578776.7367847-54-140093680138895/AnsiballZ_systemd.py
Feb 20 09:12:57 np0005625204.localdomain sudo[146082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:57 np0005625204.localdomain python3.9[146084]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 09:12:57 np0005625204.localdomain sudo[146082]: pam_unix(sudo:session): session closed for user root
Feb 20 09:12:59 np0005625204.localdomain sudo[146176]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-virtrmjyjyrocnuohqcxuflloflewavl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578778.8951313-78-154661524996058/AnsiballZ_systemd.py
Feb 20 09:12:59 np0005625204.localdomain sudo[146176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:12:59 np0005625204.localdomain python3.9[146178]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:12:59 np0005625204.localdomain sudo[146176]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1414 DF PROTO=TCP SPT=49152 DPT=9102 SEQ=3230471770 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CA1EA0000000001030307) 
Feb 20 09:13:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20305 DF PROTO=TCP SPT=37448 DPT=9100 SEQ=293745995 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CA2400000000001030307) 
Feb 20 09:13:01 np0005625204.localdomain sudo[146269]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adpofvtfmtautbxbsaqvztljuyefllcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578780.7701252-104-94305350169329/AnsiballZ_command.py
Feb 20 09:13:01 np0005625204.localdomain sudo[146269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:01 np0005625204.localdomain python3.9[146271]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:13:01 np0005625204.localdomain sudo[146269]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:01 np0005625204.localdomain sudo[146362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sjcxwyjllgmceqfqwsrknausbfieelff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578781.5499532-128-11629168530954/AnsiballZ_stat.py
Feb 20 09:13:01 np0005625204.localdomain sudo[146362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:02 np0005625204.localdomain python3.9[146364]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:13:02 np0005625204.localdomain sudo[146362]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:02 np0005625204.localdomain sudo[146456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttczkqmoihctfupjeorkvezcqwuxxfak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578782.3583982-152-78898031416556/AnsiballZ_command.py
Feb 20 09:13:02 np0005625204.localdomain sudo[146456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:02 np0005625204.localdomain python3.9[146458]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:13:02 np0005625204.localdomain sudo[146456]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:03 np0005625204.localdomain sudo[146551]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgzxumhbhtsuknhvslxtbplluxvvgcrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578783.06242-176-138382868429974/AnsiballZ_file.py
Feb 20 09:13:03 np0005625204.localdomain sudo[146551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:03 np0005625204.localdomain python3.9[146553]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:03 np0005625204.localdomain sudo[146551]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:03 np0005625204.localdomain sshd[145895]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:13:03 np0005625204.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Feb 20 09:13:03 np0005625204.localdomain systemd[1]: session-46.scope: Consumed 4.132s CPU time.
Feb 20 09:13:03 np0005625204.localdomain systemd-logind[759]: Session 46 logged out. Waiting for processes to exit.
Feb 20 09:13:03 np0005625204.localdomain systemd-logind[759]: Removed session 46.
Feb 20 09:13:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26591 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CB8040000000001030307) 
Feb 20 09:13:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26592 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CBC280000000001030307) 
Feb 20 09:13:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26593 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CC4280000000001030307) 
Feb 20 09:13:09 np0005625204.localdomain sshd[146568]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:13:09 np0005625204.localdomain sshd[146568]: Accepted publickey for zuul from 192.168.122.30 port 54816 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:13:09 np0005625204.localdomain systemd-logind[759]: New session 47 of user zuul.
Feb 20 09:13:09 np0005625204.localdomain systemd[1]: Started Session 47 of User zuul.
Feb 20 09:13:09 np0005625204.localdomain sshd[146568]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:13:10 np0005625204.localdomain python3.9[146661]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:13:11 np0005625204.localdomain sudo[146755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtzhttrpubgyhomykfgpjtewrhxoxekt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578791.4672964-59-280209290488969/AnsiballZ_setup.py
Feb 20 09:13:11 np0005625204.localdomain sudo[146755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:12 np0005625204.localdomain python3.9[146757]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:13:12 np0005625204.localdomain sudo[146755]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:12 np0005625204.localdomain sudo[146809]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sohbzmpthsrefljvfgsowlaifnolcmlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578791.4672964-59-280209290488969/AnsiballZ_dnf.py
Feb 20 09:13:12 np0005625204.localdomain sudo[146809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:13 np0005625204.localdomain python3.9[146811]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 20 09:13:13 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26594 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CD3E80000000001030307) 
Feb 20 09:13:16 np0005625204.localdomain sudo[146809]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:17 np0005625204.localdomain python3.9[146903]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:13:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9274 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CE48C0000000001030307) 
Feb 20 09:13:18 np0005625204.localdomain sudo[146994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwpskoqddatkdkxjaumacyreqxkfopqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578797.9399426-122-44252738027764/AnsiballZ_file.py
Feb 20 09:13:18 np0005625204.localdomain sudo[146994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:18 np0005625204.localdomain python3.9[146996]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:18 np0005625204.localdomain sudo[146994]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:18 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9275 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CE8A80000000001030307) 
Feb 20 09:13:19 np0005625204.localdomain sudo[147086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blxrnbodlcxqdhuwmoiqtyrtitvbsqrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578798.776172-146-80777860121051/AnsiballZ_file.py
Feb 20 09:13:19 np0005625204.localdomain sudo[147086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:19 np0005625204.localdomain python3.9[147088]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:19 np0005625204.localdomain sudo[147086]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:19 np0005625204.localdomain sudo[147178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfttutbgzylevwbziegodiqbttmjewta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578799.4351232-170-275839665144530/AnsiballZ_lineinfile.py
Feb 20 09:13:19 np0005625204.localdomain sudo[147178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:20 np0005625204.localdomain python3.9[147180]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:20 np0005625204.localdomain sudo[147178]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:20 np0005625204.localdomain python3.9[147270]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:13:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9276 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CF0A80000000001030307) 
Feb 20 09:13:21 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26595 DF PROTO=TCP SPT=38650 DPT=9101 SEQ=4294763668 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598CF3690000000001030307) 
Feb 20 09:13:21 np0005625204.localdomain python3.9[147360]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:13:22 np0005625204.localdomain python3.9[147452]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:13:22 np0005625204.localdomain sshd[147469]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:13:22 np0005625204.localdomain sshd[147469]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:13:22 np0005625204.localdomain sshd[146568]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:13:22 np0005625204.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Feb 20 09:13:22 np0005625204.localdomain systemd[1]: session-47.scope: Consumed 9.130s CPU time.
Feb 20 09:13:22 np0005625204.localdomain systemd-logind[759]: Session 47 logged out. Waiting for processes to exit.
Feb 20 09:13:22 np0005625204.localdomain systemd-logind[759]: Removed session 47.
Feb 20 09:13:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9277 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D00690000000001030307) 
Feb 20 09:13:26 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1814 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D07300000000001030307) 
Feb 20 09:13:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1815 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D0B290000000001030307) 
Feb 20 09:13:28 np0005625204.localdomain sshd[147471]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:13:28 np0005625204.localdomain sshd[147471]: Accepted publickey for zuul from 192.168.122.30 port 38812 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:13:28 np0005625204.localdomain systemd-logind[759]: New session 48 of user zuul.
Feb 20 09:13:28 np0005625204.localdomain systemd[1]: Started Session 48 of User zuul.
Feb 20 09:13:28 np0005625204.localdomain sshd[147471]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:13:28 np0005625204.localdomain sudo[147473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:13:28 np0005625204.localdomain sudo[147473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:13:28 np0005625204.localdomain sudo[147473]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:28 np0005625204.localdomain sudo[147490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:13:28 np0005625204.localdomain sudo[147490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:13:29 np0005625204.localdomain sudo[147490]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1816 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D13280000000001030307) 
Feb 20 09:13:29 np0005625204.localdomain python3.9[147614]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:13:30 np0005625204.localdomain sudo[147645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:13:30 np0005625204.localdomain sudo[147645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:13:30 np0005625204.localdomain sudo[147645]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47218 DF PROTO=TCP SPT=57806 DPT=9102 SEQ=795303546 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D17140000000001030307) 
Feb 20 09:13:31 np0005625204.localdomain sshd[147692]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:13:31 np0005625204.localdomain sudo[147737]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnlernrevbodoiqmwclmmsoukmicvxhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578811.280802-156-3425980895331/AnsiballZ_file.py
Feb 20 09:13:31 np0005625204.localdomain sudo[147737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:31 np0005625204.localdomain python3.9[147739]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:31 np0005625204.localdomain sudo[147737]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:32 np0005625204.localdomain sshd[147692]: Invalid user sol from 45.148.10.240 port 35076
Feb 20 09:13:32 np0005625204.localdomain sshd[147692]: Connection closed by invalid user sol 45.148.10.240 port 35076 [preauth]
Feb 20 09:13:32 np0005625204.localdomain sudo[147829]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clhkcnjzfgrnwrbsfrmngvclrlyuffzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578812.0421615-180-164270337795508/AnsiballZ_stat.py
Feb 20 09:13:32 np0005625204.localdomain sudo[147829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:32 np0005625204.localdomain python3.9[147831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:32 np0005625204.localdomain sudo[147829]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:33 np0005625204.localdomain sudo[147902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoejwkrlxxuadermkakekizjhxtagokr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578812.0421615-180-164270337795508/AnsiballZ_copy.py
Feb 20 09:13:33 np0005625204.localdomain sudo[147902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:33 np0005625204.localdomain python3.9[147904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578812.0421615-180-164270337795508/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:33 np0005625204.localdomain sudo[147902]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9278 DF PROTO=TCP SPT=41468 DPT=9882 SEQ=486299558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D21690000000001030307) 
Feb 20 09:13:33 np0005625204.localdomain sudo[147994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inyrkbmrkjaawdfmrlakqbojwuehcjjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578813.4023876-226-65285575476330/AnsiballZ_file.py
Feb 20 09:13:33 np0005625204.localdomain sudo[147994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:33 np0005625204.localdomain python3.9[147996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:33 np0005625204.localdomain sudo[147994]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:34 np0005625204.localdomain sudo[148086]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-piibzywzyhelvwybwzuhrsqmtaqhmspq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578813.9976404-248-84902633670387/AnsiballZ_stat.py
Feb 20 09:13:34 np0005625204.localdomain sudo[148086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:34 np0005625204.localdomain python3.9[148088]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:34 np0005625204.localdomain sudo[148086]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:34 np0005625204.localdomain sudo[148159]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hztriygngrnhsbhnlqlrxbzoyhdziqpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578813.9976404-248-84902633670387/AnsiballZ_copy.py
Feb 20 09:13:34 np0005625204.localdomain sudo[148159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:34 np0005625204.localdomain python3.9[148161]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578813.9976404-248-84902633670387/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:35 np0005625204.localdomain sudo[148159]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:35 np0005625204.localdomain sudo[148251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icbjkwmufxqirptblpmrjwmtjgxrrozh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578815.1951838-295-259402045808038/AnsiballZ_file.py
Feb 20 09:13:35 np0005625204.localdomain sudo[148251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:35 np0005625204.localdomain python3.9[148253]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:35 np0005625204.localdomain sudo[148251]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:35 np0005625204.localdomain sshd[148281]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:13:36 np0005625204.localdomain sudo[148345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekbtmiqtczpjmsbykxwlwxlbhzsdixwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578815.842179-320-144077208891245/AnsiballZ_stat.py
Feb 20 09:13:36 np0005625204.localdomain sudo[148345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39658 DF PROTO=TCP SPT=53832 DPT=9101 SEQ=499422902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D2D350000000001030307) 
Feb 20 09:13:36 np0005625204.localdomain python3.9[148347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:36 np0005625204.localdomain sudo[148345]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:36 np0005625204.localdomain sshd[148281]: Received disconnect from 54.36.99.29 port 56816:11: Bye Bye [preauth]
Feb 20 09:13:36 np0005625204.localdomain sshd[148281]: Disconnected from authenticating user root 54.36.99.29 port 56816 [preauth]
Feb 20 09:13:36 np0005625204.localdomain sudo[148418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkdcstacmtmmwsnvnfkpssagjfrpgklm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578815.842179-320-144077208891245/AnsiballZ_copy.py
Feb 20 09:13:36 np0005625204.localdomain sudo[148418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:36 np0005625204.localdomain python3.9[148420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578815.842179-320-144077208891245/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:36 np0005625204.localdomain sudo[148418]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:37 np0005625204.localdomain sudo[148510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvacppifaswnrpdivjzwgryvtqqqavgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578817.1051192-372-7316155949100/AnsiballZ_file.py
Feb 20 09:13:37 np0005625204.localdomain sudo[148510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:37 np0005625204.localdomain python3.9[148512]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:37 np0005625204.localdomain sudo[148510]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:37 np0005625204.localdomain sudo[148602]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zktkbccnrcecrhbfdnbgmdtxbfvtpcox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578817.7024274-395-253454785170199/AnsiballZ_stat.py
Feb 20 09:13:37 np0005625204.localdomain sudo[148602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:38 np0005625204.localdomain python3.9[148604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:38 np0005625204.localdomain sudo[148602]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:38 np0005625204.localdomain sudo[148675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnswlmuwejgohetqruvojdtxskexovui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578817.7024274-395-253454785170199/AnsiballZ_copy.py
Feb 20 09:13:38 np0005625204.localdomain sudo[148675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:38 np0005625204.localdomain python3.9[148677]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578817.7024274-395-253454785170199/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:38 np0005625204.localdomain sudo[148675]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:39 np0005625204.localdomain sudo[148767]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxvyadcwjkydcppzxqnxksksgapmbqxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578818.9144983-445-23392937216634/AnsiballZ_file.py
Feb 20 09:13:39 np0005625204.localdomain sudo[148767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39660 DF PROTO=TCP SPT=53832 DPT=9101 SEQ=499422902 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D39280000000001030307) 
Feb 20 09:13:39 np0005625204.localdomain python3.9[148769]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:39 np0005625204.localdomain sudo[148767]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:39 np0005625204.localdomain sudo[148859]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqatbdkjliggghtebsoibhaeplqocemf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578819.5241244-471-227440666638848/AnsiballZ_stat.py
Feb 20 09:13:39 np0005625204.localdomain sudo[148859]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:39 np0005625204.localdomain python3.9[148861]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:39 np0005625204.localdomain sudo[148859]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:40 np0005625204.localdomain sudo[148932]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgwdnabenuimqvfxbrmzhdsxykyuiarp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578819.5241244-471-227440666638848/AnsiballZ_copy.py
Feb 20 09:13:40 np0005625204.localdomain sudo[148932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:40 np0005625204.localdomain python3.9[148934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578819.5241244-471-227440666638848/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:40 np0005625204.localdomain sudo[148932]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:41 np0005625204.localdomain sudo[149024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcmkkbtbeuijsmikdqgbwznvdcuopqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578820.7412229-519-279211326912589/AnsiballZ_file.py
Feb 20 09:13:41 np0005625204.localdomain sudo[149024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:41 np0005625204.localdomain python3.9[149026]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:41 np0005625204.localdomain sudo[149024]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:41 np0005625204.localdomain sudo[149116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtkfazhkkkmjkcitjetnopoeiwkdnodi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578821.411394-547-258153400884304/AnsiballZ_stat.py
Feb 20 09:13:41 np0005625204.localdomain sudo[149116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:41 np0005625204.localdomain python3.9[149118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:41 np0005625204.localdomain sudo[149116]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1818 DF PROTO=TCP SPT=38098 DPT=9105 SEQ=2797951751 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D43680000000001030307) 
Feb 20 09:13:42 np0005625204.localdomain sudo[149189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffunwejwxuaccgqgirbqlsvsjuikvzfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578821.411394-547-258153400884304/AnsiballZ_copy.py
Feb 20 09:13:42 np0005625204.localdomain sudo[149189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:42 np0005625204.localdomain python3.9[149191]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578821.411394-547-258153400884304/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:42 np0005625204.localdomain sudo[149189]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:42 np0005625204.localdomain sudo[149281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqqdzlnaxgnkzhjzqdtxfzbrohxazfwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578822.6081872-597-167925319006731/AnsiballZ_file.py
Feb 20 09:13:42 np0005625204.localdomain sudo[149281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:43 np0005625204.localdomain python3.9[149283]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:43 np0005625204.localdomain sudo[149281]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:43 np0005625204.localdomain sudo[149373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjckvvngujfcjagbycxzldxkscikbizd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578823.2096717-621-224646440991240/AnsiballZ_stat.py
Feb 20 09:13:43 np0005625204.localdomain sudo[149373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:43 np0005625204.localdomain python3.9[149375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:43 np0005625204.localdomain sudo[149373]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:44 np0005625204.localdomain sudo[149446]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyywswnxfkbofddpkydfsqaelrguegna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578823.2096717-621-224646440991240/AnsiballZ_copy.py
Feb 20 09:13:44 np0005625204.localdomain sudo[149446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:44 np0005625204.localdomain python3.9[149448]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578823.2096717-621-224646440991240/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:44 np0005625204.localdomain sudo[149446]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:44 np0005625204.localdomain sudo[149538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjjsnkzzhltqyazdghllobdflxjuyewj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578824.4703493-671-253621490344811/AnsiballZ_file.py
Feb 20 09:13:44 np0005625204.localdomain sudo[149538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:44 np0005625204.localdomain python3.9[149540]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:13:44 np0005625204.localdomain sudo[149538]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:45 np0005625204.localdomain sudo[149630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrrzerfewhoarmqamwuxoqphbswvdiro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578825.145644-695-133933484168486/AnsiballZ_stat.py
Feb 20 09:13:45 np0005625204.localdomain sudo[149630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:45 np0005625204.localdomain python3.9[149632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:45 np0005625204.localdomain sudo[149630]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:45 np0005625204.localdomain sudo[149703]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-flznlfvuydjcpdnmrwyjfyzxpzmovamh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578825.145644-695-133933484168486/AnsiballZ_copy.py
Feb 20 09:13:45 np0005625204.localdomain sudo[149703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14934 DF PROTO=TCP SPT=48324 DPT=9100 SEQ=1107177072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D53680000000001030307) 
Feb 20 09:13:46 np0005625204.localdomain python3.9[149705]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578825.145644-695-133933484168486/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=cb143134597e8d09980d1dcc1949f9a4232e36a1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:46 np0005625204.localdomain sudo[149703]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:46 np0005625204.localdomain sshd[147471]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:13:46 np0005625204.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Feb 20 09:13:46 np0005625204.localdomain systemd[1]: session-48.scope: Consumed 11.895s CPU time.
Feb 20 09:13:46 np0005625204.localdomain systemd-logind[759]: Session 48 logged out. Waiting for processes to exit.
Feb 20 09:13:46 np0005625204.localdomain systemd-logind[759]: Removed session 48.
Feb 20 09:13:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51396 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D59BA0000000001030307) 
Feb 20 09:13:47 np0005625204.localdomain chronyd[139732]: Selected source 23.133.168.246 (pool.ntp.org)
Feb 20 09:13:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51398 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D65A80000000001030307) 
Feb 20 09:13:51 np0005625204.localdomain sshd[149721]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:13:51 np0005625204.localdomain sshd[149721]: Accepted publickey for zuul from 192.168.122.30 port 36566 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:13:51 np0005625204.localdomain systemd-logind[759]: New session 49 of user zuul.
Feb 20 09:13:51 np0005625204.localdomain systemd[1]: Started Session 49 of User zuul.
Feb 20 09:13:51 np0005625204.localdomain sshd[149721]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:13:52 np0005625204.localdomain sudo[149814]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltmdgfrtlhxcntyqtalowqtesqmewaro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578832.027442-23-157056145539022/AnsiballZ_file.py
Feb 20 09:13:52 np0005625204.localdomain sudo[149814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:52 np0005625204.localdomain python3.9[149816]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:52 np0005625204.localdomain sudo[149814]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:53 np0005625204.localdomain sudo[149906]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swsjdhpbtjmbdnojrqiiznonkysyyvpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578832.8325264-59-156393497156177/AnsiballZ_stat.py
Feb 20 09:13:53 np0005625204.localdomain sudo[149906]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:53 np0005625204.localdomain python3.9[149908]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:53 np0005625204.localdomain sudo[149906]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:53 np0005625204.localdomain sudo[149979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bztaojnvmiaxnlnndyupdymagzdjwzkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578832.8325264-59-156393497156177/AnsiballZ_copy.py
Feb 20 09:13:53 np0005625204.localdomain sudo[149979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:54 np0005625204.localdomain python3.9[149981]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578832.8325264-59-156393497156177/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=8e2004121a34320613d32710ae37702da8d027e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:54 np0005625204.localdomain sudo[149979]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:54 np0005625204.localdomain sudo[150071]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idmyjorcgawgzegdunqjllifjedjzuey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578834.236101-59-16596974300586/AnsiballZ_stat.py
Feb 20 09:13:54 np0005625204.localdomain sudo[150071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:54 np0005625204.localdomain python3.9[150073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:13:54 np0005625204.localdomain sudo[150071]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51399 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D75690000000001030307) 
Feb 20 09:13:55 np0005625204.localdomain sudo[150144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqqmvdywkkawfrpfthfrevkmwbuexzyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578834.236101-59-16596974300586/AnsiballZ_copy.py
Feb 20 09:13:55 np0005625204.localdomain sudo[150144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:13:55 np0005625204.localdomain python3.9[150146]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578834.236101-59-16596974300586/.source.conf _original_basename=ceph.conf follow=False checksum=936d449f31af670125791fe297b02d275b2ba4b7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:13:55 np0005625204.localdomain sudo[150144]: pam_unix(sudo:session): session closed for user root
Feb 20 09:13:55 np0005625204.localdomain sshd[149721]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:13:55 np0005625204.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Feb 20 09:13:55 np0005625204.localdomain systemd[1]: session-49.scope: Consumed 2.304s CPU time.
Feb 20 09:13:55 np0005625204.localdomain systemd-logind[759]: Session 49 logged out. Waiting for processes to exit.
Feb 20 09:13:55 np0005625204.localdomain systemd-logind[759]: Removed session 49.
Feb 20 09:13:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42356 DF PROTO=TCP SPT=38558 DPT=9105 SEQ=1121670155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D80680000000001030307) 
Feb 20 09:13:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42357 DF PROTO=TCP SPT=38558 DPT=9105 SEQ=1121670155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D88680000000001030307) 
Feb 20 09:14:00 np0005625204.localdomain sshd[150161]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:14:00 np0005625204.localdomain sshd[150161]: Accepted publickey for zuul from 192.168.122.30 port 36578 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:14:00 np0005625204.localdomain systemd-logind[759]: New session 50 of user zuul.
Feb 20 09:14:00 np0005625204.localdomain systemd[1]: Started Session 50 of User zuul.
Feb 20 09:14:00 np0005625204.localdomain sshd[150161]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:14:01 np0005625204.localdomain python3.9[150254]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:14:02 np0005625204.localdomain sudo[150348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvdadwpjkqyyjrbobujvqpfzaupjbinv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578842.4318829-59-174371255352832/AnsiballZ_file.py
Feb 20 09:14:02 np0005625204.localdomain sudo[150348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51400 DF PROTO=TCP SPT=55106 DPT=9882 SEQ=2494164795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598D95680000000001030307) 
Feb 20 09:14:03 np0005625204.localdomain python3.9[150350]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:03 np0005625204.localdomain sudo[150348]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:03 np0005625204.localdomain sudo[150440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tawwtvwzlpdgzhcagbawqyzcmnpfsuxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578843.15858-59-59497729977959/AnsiballZ_file.py
Feb 20 09:14:03 np0005625204.localdomain sudo[150440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:03 np0005625204.localdomain python3.9[150442]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:03 np0005625204.localdomain sudo[150440]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:04 np0005625204.localdomain python3.9[150532]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:14:05 np0005625204.localdomain sudo[150622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pretqobjmxtsmauwoqhkfizvwwofdcjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578845.0329843-128-119973323046539/AnsiballZ_seboolean.py
Feb 20 09:14:05 np0005625204.localdomain sudo[150622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:05 np0005625204.localdomain python3.9[150624]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 20 09:14:05 np0005625204.localdomain sudo[150622]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44676 DF PROTO=TCP SPT=54666 DPT=9101 SEQ=1974479732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DA2650000000001030307) 
Feb 20 09:14:06 np0005625204.localdomain sudo[150714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfenrmthqetwizxpobkasqhantndijdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578846.1766274-158-130885508258220/AnsiballZ_setup.py
Feb 20 09:14:06 np0005625204.localdomain sudo[150714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:06 np0005625204.localdomain sshd[150717]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:14:06 np0005625204.localdomain python3.9[150716]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:14:06 np0005625204.localdomain sshd[150717]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:14:07 np0005625204.localdomain sudo[150714]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:07 np0005625204.localdomain sudo[150770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvvqqdpbhzgiciirmoluvtuqszsmupwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578846.1766274-158-130885508258220/AnsiballZ_dnf.py
Feb 20 09:14:07 np0005625204.localdomain sudo[150770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:07 np0005625204.localdomain python3.9[150772]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:14:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44678 DF PROTO=TCP SPT=54666 DPT=9101 SEQ=1974479732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DAE680000000001030307) 
Feb 20 09:14:11 np0005625204.localdomain sudo[150770]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42359 DF PROTO=TCP SPT=38558 DPT=9105 SEQ=1121670155 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DB9680000000001030307) 
Feb 20 09:14:12 np0005625204.localdomain sudo[150865]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryzyvsscgklvuyrdcfzldjgrhedtlaye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578851.7047763-194-160707773511680/AnsiballZ_systemd.py
Feb 20 09:14:12 np0005625204.localdomain sudo[150865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:12 np0005625204.localdomain python3.9[150867]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:14:12 np0005625204.localdomain sudo[150865]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:13 np0005625204.localdomain sudo[150960]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwbuwlmzfmtiisrnmsdzwfxoonckhgfy ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771578852.7948127-218-78319980570788/AnsiballZ_edpm_nftables_snippet.py
Feb 20 09:14:13 np0005625204.localdomain sudo[150960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:13 np0005625204.localdomain python3[150962]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 20 09:14:13 np0005625204.localdomain sudo[150960]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:13 np0005625204.localdomain sudo[151052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdyxtnbiiddroeyfhlbkhexynidlzdsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578853.6779253-245-193871251288001/AnsiballZ_file.py
Feb 20 09:14:13 np0005625204.localdomain sudo[151052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:14 np0005625204.localdomain python3.9[151054]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:14 np0005625204.localdomain sudo[151052]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:14 np0005625204.localdomain sudo[151144]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmhpinzsiuhygcqdveaiciuolpyymffn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578854.3262672-270-114763192518148/AnsiballZ_stat.py
Feb 20 09:14:14 np0005625204.localdomain sudo[151144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:14 np0005625204.localdomain python3.9[151146]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:14 np0005625204.localdomain sudo[151144]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:15 np0005625204.localdomain sudo[151192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khhgnccmghgcykdnswyalkpokobyczgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578854.3262672-270-114763192518148/AnsiballZ_file.py
Feb 20 09:14:15 np0005625204.localdomain sudo[151192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:15 np0005625204.localdomain python3.9[151194]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:15 np0005625204.localdomain sudo[151192]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:15 np0005625204.localdomain sudo[151284]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xshfhbyrdsvqyrhoqfuweofaghiqynhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578855.512276-306-206499152135238/AnsiballZ_stat.py
Feb 20 09:14:15 np0005625204.localdomain sudo[151284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:15 np0005625204.localdomain python3.9[151286]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:15 np0005625204.localdomain sudo[151284]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:16 np0005625204.localdomain sudo[151332]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvvtnigecxngbduqcarolnwnmxerfzeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578855.512276-306-206499152135238/AnsiballZ_file.py
Feb 20 09:14:16 np0005625204.localdomain sudo[151332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47665 DF PROTO=TCP SPT=55052 DPT=9100 SEQ=3674430406 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DC9680000000001030307) 
Feb 20 09:14:16 np0005625204.localdomain python3.9[151334]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.r8tqv3z1 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:16 np0005625204.localdomain sudo[151332]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:16 np0005625204.localdomain sudo[151424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtrwtghmpmwhjxlbvbfexmpidsiapwtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578856.598125-341-225414256633557/AnsiballZ_stat.py
Feb 20 09:14:16 np0005625204.localdomain sudo[151424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:17 np0005625204.localdomain python3.9[151426]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:17 np0005625204.localdomain sudo[151424]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:17 np0005625204.localdomain sudo[151472]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-przrfuhnpcqtjlezdzwivhpzubfayhjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578856.598125-341-225414256633557/AnsiballZ_file.py
Feb 20 09:14:17 np0005625204.localdomain sudo[151472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:17 np0005625204.localdomain python3.9[151474]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:17 np0005625204.localdomain sudo[151472]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33327 DF PROTO=TCP SPT=53850 DPT=9882 SEQ=2368391008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DCEEA0000000001030307) 
Feb 20 09:14:19 np0005625204.localdomain sudo[151564]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzvroqwffelxpparesljexymgnuaxuej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578858.7009041-381-189519470582513/AnsiballZ_command.py
Feb 20 09:14:19 np0005625204.localdomain sudo[151564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:19 np0005625204.localdomain python3.9[151566]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:14:19 np0005625204.localdomain sudo[151564]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:19 np0005625204.localdomain sudo[151657]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnxblzcuodlsaqujisfodinqouxxrssw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771578859.4768193-405-188188510008688/AnsiballZ_edpm_nftables_from_files.py
Feb 20 09:14:19 np0005625204.localdomain sudo[151657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:20 np0005625204.localdomain python3[151659]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 20 09:14:20 np0005625204.localdomain sudo[151657]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:20 np0005625204.localdomain sudo[151749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixeafqajtfvsdnlowzdfgoqlahbbgwsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578860.3127856-428-199159928053917/AnsiballZ_stat.py
Feb 20 09:14:20 np0005625204.localdomain sudo[151749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33329 DF PROTO=TCP SPT=53850 DPT=9882 SEQ=2368391008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DDAE80000000001030307) 
Feb 20 09:14:20 np0005625204.localdomain python3.9[151751]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:20 np0005625204.localdomain sudo[151749]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:21 np0005625204.localdomain sudo[151824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbdgaczgxclsrdphkxkmlywdzvnqivay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578860.3127856-428-199159928053917/AnsiballZ_copy.py
Feb 20 09:14:21 np0005625204.localdomain sudo[151824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:21 np0005625204.localdomain python3.9[151826]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578860.3127856-428-199159928053917/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:21 np0005625204.localdomain sudo[151824]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:21 np0005625204.localdomain sudo[151916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdgafavcvgxnsfintzgvaroczwndvqho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578861.6026163-474-244494251495405/AnsiballZ_stat.py
Feb 20 09:14:21 np0005625204.localdomain sudo[151916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:22 np0005625204.localdomain python3.9[151918]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:22 np0005625204.localdomain sudo[151916]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:22 np0005625204.localdomain sudo[151991]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncvuvoqirxejoaydqamvrunqhlyjxxcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578861.6026163-474-244494251495405/AnsiballZ_copy.py
Feb 20 09:14:22 np0005625204.localdomain sudo[151991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:22 np0005625204.localdomain python3.9[151993]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578861.6026163-474-244494251495405/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:22 np0005625204.localdomain sudo[151991]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:23 np0005625204.localdomain sudo[152083]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwhnisfycrlwsrhpvbrwsoixuhdxshce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578862.8613636-519-157750907599517/AnsiballZ_stat.py
Feb 20 09:14:23 np0005625204.localdomain sudo[152083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:23 np0005625204.localdomain python3.9[152085]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:23 np0005625204.localdomain sudo[152083]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:23 np0005625204.localdomain sudo[152158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkxkdhfhpswmnxohxhmeccoiiboankud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578862.8613636-519-157750907599517/AnsiballZ_copy.py
Feb 20 09:14:23 np0005625204.localdomain sudo[152158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:23 np0005625204.localdomain python3.9[152160]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578862.8613636-519-157750907599517/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:23 np0005625204.localdomain sudo[152158]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:24 np0005625204.localdomain sudo[152250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbnolnybuuwapuxltnneblzxsghtnpty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578864.0132675-564-35817176487239/AnsiballZ_stat.py
Feb 20 09:14:24 np0005625204.localdomain sudo[152250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:24 np0005625204.localdomain python3.9[152252]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:24 np0005625204.localdomain sudo[152250]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33330 DF PROTO=TCP SPT=53850 DPT=9882 SEQ=2368391008 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DEAA80000000001030307) 
Feb 20 09:14:24 np0005625204.localdomain sudo[152325]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmgifedrdjkjhqzauqsyoraprxotxlgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578864.0132675-564-35817176487239/AnsiballZ_copy.py
Feb 20 09:14:24 np0005625204.localdomain sudo[152325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:25 np0005625204.localdomain python3.9[152327]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578864.0132675-564-35817176487239/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:25 np0005625204.localdomain sudo[152325]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:25 np0005625204.localdomain sudo[152417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uiainrpjzrsjmofpfbtmotnmgtzvnqts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578865.2043014-608-150432252551954/AnsiballZ_stat.py
Feb 20 09:14:25 np0005625204.localdomain sudo[152417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:25 np0005625204.localdomain python3.9[152419]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:25 np0005625204.localdomain sudo[152417]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:26 np0005625204.localdomain sudo[152492]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khsgibozvnbjhpkexojmnddzhetgbpst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578865.2043014-608-150432252551954/AnsiballZ_copy.py
Feb 20 09:14:26 np0005625204.localdomain sudo[152492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:26 np0005625204.localdomain python3.9[152494]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771578865.2043014-608-150432252551954/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:26 np0005625204.localdomain sudo[152492]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:26 np0005625204.localdomain sudo[152584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pypbszpinyhdgknzatjfeozdjahxwmjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578866.6618948-653-222384743920693/AnsiballZ_file.py
Feb 20 09:14:26 np0005625204.localdomain sudo[152584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:27 np0005625204.localdomain python3.9[152586]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:27 np0005625204.localdomain sudo[152584]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:27 np0005625204.localdomain sudo[152676]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucktfywmszrzbvunqfryktzkwprizorb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578867.2889526-677-256022065333225/AnsiballZ_command.py
Feb 20 09:14:27 np0005625204.localdomain sudo[152676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55238 DF PROTO=TCP SPT=50838 DPT=9105 SEQ=3095575396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DF5A80000000001030307) 
Feb 20 09:14:27 np0005625204.localdomain python3.9[152678]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:14:27 np0005625204.localdomain sudo[152676]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:28 np0005625204.localdomain sudo[152771]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zagqogbcnwhjczptbnaynfxvimagpybm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578867.9580352-701-1683300827862/AnsiballZ_blockinfile.py
Feb 20 09:14:28 np0005625204.localdomain sudo[152771]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:28 np0005625204.localdomain python3.9[152773]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:28 np0005625204.localdomain sudo[152771]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:29 np0005625204.localdomain sudo[152863]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cziipyybszxndwerdbxwbpubsxygrvdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578868.836475-728-36978534930992/AnsiballZ_command.py
Feb 20 09:14:29 np0005625204.localdomain sudo[152863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:29 np0005625204.localdomain python3.9[152865]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:14:29 np0005625204.localdomain sudo[152863]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55239 DF PROTO=TCP SPT=50838 DPT=9105 SEQ=3095575396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598DFDA80000000001030307) 
Feb 20 09:14:29 np0005625204.localdomain sudo[152956]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phmrhtavwscnzqtkylvvshqsunrzbdwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578869.5257967-752-28703050651106/AnsiballZ_stat.py
Feb 20 09:14:29 np0005625204.localdomain sudo[152956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:29 np0005625204.localdomain python3.9[152958]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:14:29 np0005625204.localdomain sudo[152956]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:30 np0005625204.localdomain sudo[152988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:14:30 np0005625204.localdomain sudo[152988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:14:30 np0005625204.localdomain sudo[152988]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:30 np0005625204.localdomain sudo[153022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:14:30 np0005625204.localdomain sudo[153022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:14:30 np0005625204.localdomain sudo[153080]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlurkuseuxnptkcggqdrqwkzsqvjxfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578870.1188707-777-120595054073696/AnsiballZ_command.py
Feb 20 09:14:30 np0005625204.localdomain sudo[153080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:30 np0005625204.localdomain python3.9[153082]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:14:30 np0005625204.localdomain sudo[153080]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:31 np0005625204.localdomain podman[153183]: 2026-02-20 09:14:31.077418463 +0000 UTC m=+0.098397303 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 20 09:14:31 np0005625204.localdomain sudo[153262]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjhwxutplilrxcqibhszjhypzzjpuqse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578870.9741082-800-43337106908315/AnsiballZ_file.py
Feb 20 09:14:31 np0005625204.localdomain sudo[153262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:31 np0005625204.localdomain podman[153183]: 2026-02-20 09:14:31.212168477 +0000 UTC m=+0.233147317 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.openshift.expose-services=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, version=7, io.buildah.version=1.42.2, name=rhceph)
Feb 20 09:14:31 np0005625204.localdomain python3.9[153270]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:31 np0005625204.localdomain sudo[153262]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:31 np0005625204.localdomain sudo[153022]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:31 np0005625204.localdomain sudo[153331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:14:31 np0005625204.localdomain sudo[153331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:14:31 np0005625204.localdomain sudo[153331]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:31 np0005625204.localdomain sudo[153346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:14:31 np0005625204.localdomain sudo[153346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:14:32 np0005625204.localdomain sudo[153346]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:32 np0005625204.localdomain python3.9[153465]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:14:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2161 DF PROTO=TCP SPT=50532 DPT=9102 SEQ=761812327 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E09680000000001030307) 
Feb 20 09:14:32 np0005625204.localdomain sudo[153483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:14:32 np0005625204.localdomain sudo[153483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:14:32 np0005625204.localdomain sudo[153483]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:33 np0005625204.localdomain sudo[153573]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onemqozthdvxomisngmunscfovnwzsxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578873.3944643-923-69344317810273/AnsiballZ_command.py
Feb 20 09:14:33 np0005625204.localdomain sudo[153573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:33 np0005625204.localdomain python3.9[153575]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005625204.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:e8:77:41:0b" external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:14:33 np0005625204.localdomain ovs-vsctl[153576]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005625204.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:e8:77:41:0b external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 20 09:14:33 np0005625204.localdomain sudo[153573]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:34 np0005625204.localdomain sudo[153666]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zslhrgbjorpdltdotixpsuwhgwogmbfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578874.1824386-950-20236925605162/AnsiballZ_command.py
Feb 20 09:14:34 np0005625204.localdomain sudo[153666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:34 np0005625204.localdomain python3.9[153668]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:14:34 np0005625204.localdomain sudo[153666]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:35 np0005625204.localdomain python3.9[153761]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:14:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64736 DF PROTO=TCP SPT=44892 DPT=9101 SEQ=3976226729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E17950000000001030307) 
Feb 20 09:14:36 np0005625204.localdomain sudo[153853]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfsfdkpdsjkofaaaodxvlgquttxqteks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578876.2113862-1005-210263958391231/AnsiballZ_file.py
Feb 20 09:14:36 np0005625204.localdomain sudo[153853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:36 np0005625204.localdomain python3.9[153855]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:36 np0005625204.localdomain sudo[153853]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:37 np0005625204.localdomain sudo[153945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioitsttzjyftrwcnaneiisujfmgwcsbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578876.9361732-1028-72443357285508/AnsiballZ_stat.py
Feb 20 09:14:37 np0005625204.localdomain sudo[153945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:37 np0005625204.localdomain sshd[153948]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:14:37 np0005625204.localdomain python3.9[153947]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:37 np0005625204.localdomain sudo[153945]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:37 np0005625204.localdomain sshd[153948]: Invalid user n8n from 96.78.175.36 port 51938
Feb 20 09:14:37 np0005625204.localdomain sshd[153948]: Received disconnect from 96.78.175.36 port 51938:11: Bye Bye [preauth]
Feb 20 09:14:37 np0005625204.localdomain sshd[153948]: Disconnected from invalid user n8n 96.78.175.36 port 51938 [preauth]
Feb 20 09:14:37 np0005625204.localdomain sudo[153995]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttviwmqkduoshvqfmxzcpuuttsmqijfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578876.9361732-1028-72443357285508/AnsiballZ_file.py
Feb 20 09:14:37 np0005625204.localdomain sudo[153995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:38 np0005625204.localdomain python3.9[153997]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:38 np0005625204.localdomain sudo[153995]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:38 np0005625204.localdomain sudo[154087]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdgmvylwkexamzztdsicmviouupcoeks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578878.274873-1028-198410760802748/AnsiballZ_stat.py
Feb 20 09:14:38 np0005625204.localdomain sudo[154087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:38 np0005625204.localdomain python3.9[154089]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:38 np0005625204.localdomain sudo[154087]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:38 np0005625204.localdomain sudo[154135]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thrvqmutseggfuqvhntsoisbmzdypkcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578878.274873-1028-198410760802748/AnsiballZ_file.py
Feb 20 09:14:38 np0005625204.localdomain sudo[154135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:39 np0005625204.localdomain python3.9[154137]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:39 np0005625204.localdomain sudo[154135]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64738 DF PROTO=TCP SPT=44892 DPT=9101 SEQ=3976226729 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E23A80000000001030307) 
Feb 20 09:14:39 np0005625204.localdomain sudo[154227]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzjbutuybhiueiedvypteosaszupokik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578879.359738-1098-146217107630042/AnsiballZ_file.py
Feb 20 09:14:39 np0005625204.localdomain sudo[154227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:39 np0005625204.localdomain python3.9[154229]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:39 np0005625204.localdomain sudo[154227]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:40 np0005625204.localdomain sudo[154319]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfpkxwatpebwkgepzhulhdavlgdmypre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578879.9915028-1121-37436879113249/AnsiballZ_stat.py
Feb 20 09:14:40 np0005625204.localdomain sudo[154319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:40 np0005625204.localdomain python3.9[154321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:40 np0005625204.localdomain sudo[154319]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:40 np0005625204.localdomain sudo[154367]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izxoddzfqagyshkrbgpvnmpkkycvdtsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578879.9915028-1121-37436879113249/AnsiballZ_file.py
Feb 20 09:14:40 np0005625204.localdomain sudo[154367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:40 np0005625204.localdomain python3.9[154369]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:40 np0005625204.localdomain sudo[154367]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:41 np0005625204.localdomain sudo[154459]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iluvzsuufrdxjsqatjdagvdxkpqfzdlb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578881.08631-1158-5388749819356/AnsiballZ_stat.py
Feb 20 09:14:41 np0005625204.localdomain sudo[154459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:41 np0005625204.localdomain python3.9[154461]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:41 np0005625204.localdomain sudo[154459]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:41 np0005625204.localdomain sudo[154507]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpsijgznvbahqkjpjzrtdrqjbpmtwnlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578881.08631-1158-5388749819356/AnsiballZ_file.py
Feb 20 09:14:41 np0005625204.localdomain sudo[154507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55241 DF PROTO=TCP SPT=50838 DPT=9105 SEQ=3095575396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E2D680000000001030307) 
Feb 20 09:14:41 np0005625204.localdomain python3.9[154509]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:42 np0005625204.localdomain sudo[154507]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:42 np0005625204.localdomain sudo[154599]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oiqowioolzzeqkxgrmuumtivntwawriq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578882.177332-1193-54958002446255/AnsiballZ_systemd.py
Feb 20 09:14:42 np0005625204.localdomain sudo[154599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:42 np0005625204.localdomain python3.9[154601]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:14:42 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:14:42 np0005625204.localdomain systemd-rc-local-generator[154626]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:14:42 np0005625204.localdomain systemd-sysv-generator[154629]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:14:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:14:43 np0005625204.localdomain sudo[154599]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:43 np0005625204.localdomain sudo[154728]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czlprmzfmfraytlgnhatelqrupxausdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578883.6319892-1217-269336470363418/AnsiballZ_stat.py
Feb 20 09:14:43 np0005625204.localdomain sudo[154728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:44 np0005625204.localdomain python3.9[154730]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:44 np0005625204.localdomain sudo[154728]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:44 np0005625204.localdomain sudo[154776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alghpssjqtxlwbzjfrtbkqifsnzhyzsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578883.6319892-1217-269336470363418/AnsiballZ_file.py
Feb 20 09:14:44 np0005625204.localdomain sudo[154776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:44 np0005625204.localdomain python3.9[154778]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:44 np0005625204.localdomain sudo[154776]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:45 np0005625204.localdomain sudo[154868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbuxxkoapkwswiavxznszkgbtqmthjmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578884.7437398-1254-202925256105150/AnsiballZ_stat.py
Feb 20 09:14:45 np0005625204.localdomain sudo[154868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:45 np0005625204.localdomain python3.9[154870]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20070 DF PROTO=TCP SPT=49028 DPT=9100 SEQ=3165136517 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E3D680000000001030307) 
Feb 20 09:14:46 np0005625204.localdomain sudo[154868]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:46 np0005625204.localdomain sudo[154916]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnahutijqxbncmgarxdlhnrqoeuuxfwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578884.7437398-1254-202925256105150/AnsiballZ_file.py
Feb 20 09:14:46 np0005625204.localdomain sudo[154916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:46 np0005625204.localdomain python3.9[154918]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:46 np0005625204.localdomain sudo[154916]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:47 np0005625204.localdomain sudo[155008]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmmqnllykjihzerqruquvwruvdnbjjyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578886.8875659-1289-215275798348019/AnsiballZ_systemd.py
Feb 20 09:14:47 np0005625204.localdomain sudo[155008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:47 np0005625204.localdomain python3.9[155010]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:14:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:14:47 np0005625204.localdomain systemd-rc-local-generator[155032]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:14:47 np0005625204.localdomain systemd-sysv-generator[155036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:14:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:14:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29220 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E441B0000000001030307) 
Feb 20 09:14:47 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 09:14:47 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 09:14:47 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 09:14:47 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 09:14:47 np0005625204.localdomain sudo[155008]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:49 np0005625204.localdomain sudo[155141]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kvgmqgkwtwrwhafeunhnwbpowntrxdkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578888.8459153-1320-261258226858040/AnsiballZ_file.py
Feb 20 09:14:49 np0005625204.localdomain sudo[155141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:49 np0005625204.localdomain python3.9[155143]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:49 np0005625204.localdomain sudo[155141]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:49 np0005625204.localdomain sudo[155233]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhpgioqhtcebadtdknqkhnavgshvvyvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578889.4770167-1343-209736468978954/AnsiballZ_stat.py
Feb 20 09:14:49 np0005625204.localdomain sudo[155233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:49 np0005625204.localdomain python3.9[155235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:49 np0005625204.localdomain sudo[155233]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:50 np0005625204.localdomain sudo[155306]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwedgxzamhqayzjjjodnfxbkmscntrle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578889.4770167-1343-209736468978954/AnsiballZ_copy.py
Feb 20 09:14:50 np0005625204.localdomain sudo[155306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:50 np0005625204.localdomain python3.9[155308]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578889.4770167-1343-209736468978954/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:50 np0005625204.localdomain sudo[155306]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:50 np0005625204.localdomain sshd[155309]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:14:50 np0005625204.localdomain sshd[155309]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:14:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29222 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E50280000000001030307) 
Feb 20 09:14:51 np0005625204.localdomain sudo[155400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyagxivyrclwwegniyjtvfupmozpgfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578890.9913363-1395-53094800063675/AnsiballZ_file.py
Feb 20 09:14:51 np0005625204.localdomain sudo[155400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:51 np0005625204.localdomain python3.9[155402]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:51 np0005625204.localdomain sudo[155400]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:51 np0005625204.localdomain sudo[155492]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezfxydbenhbsylesicodymxcrnribprg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578891.6532512-1418-115595731568040/AnsiballZ_file.py
Feb 20 09:14:51 np0005625204.localdomain sudo[155492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:52 np0005625204.localdomain python3.9[155494]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:14:52 np0005625204.localdomain sudo[155492]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:52 np0005625204.localdomain sudo[155584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxbddsdmfosavymgghkbkimsyuvdggcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578892.4211583-1442-247753634620878/AnsiballZ_stat.py
Feb 20 09:14:52 np0005625204.localdomain sudo[155584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:52 np0005625204.localdomain python3.9[155586]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:14:52 np0005625204.localdomain sudo[155584]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:53 np0005625204.localdomain sudo[155659]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skunxlzxeqeomyozkjeujsengozklwll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578892.4211583-1442-247753634620878/AnsiballZ_copy.py
Feb 20 09:14:53 np0005625204.localdomain sudo[155659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:53 np0005625204.localdomain python3.9[155661]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578892.4211583-1442-247753634620878/.source.json _original_basename=.2g6eflyg follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:53 np0005625204.localdomain sudo[155659]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:54 np0005625204.localdomain python3.9[155751]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:14:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29223 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E5FE80000000001030307) 
Feb 20 09:14:56 np0005625204.localdomain sudo[156002]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhsrybqtzcfeoqzxfvhormdiibbsrogy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578895.69318-1563-69016667050940/AnsiballZ_container_config_data.py
Feb 20 09:14:56 np0005625204.localdomain sudo[156002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:56 np0005625204.localdomain python3.9[156004]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 20 09:14:56 np0005625204.localdomain sudo[156002]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:57 np0005625204.localdomain sudo[156094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnsdavghjotmbllfvlglphuemndtjdsh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578896.7105927-1595-127958928140248/AnsiballZ_container_config_hash.py
Feb 20 09:14:57 np0005625204.localdomain sudo[156094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:57 np0005625204.localdomain python3.9[156096]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:14:57 np0005625204.localdomain sudo[156094]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28620 DF PROTO=TCP SPT=34900 DPT=9105 SEQ=68265424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E6AE80000000001030307) 
Feb 20 09:14:58 np0005625204.localdomain sudo[156186]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmewwhanbeieyfhcfijybafnnauxydaw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771578897.9633684-1625-13452856917606/AnsiballZ_edpm_container_manage.py
Feb 20 09:14:58 np0005625204.localdomain sudo[156186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:58 np0005625204.localdomain python3[156188]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:14:59 np0005625204.localdomain python3[156188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e",
                                                                    "Digest": "sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:38:56.623500445Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 346422728,
                                                                    "VirtualSize": 346422728,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:033e0289d512b27a678c3feb7195acb9c5f2fbb27c9b2d8c8b5b5f6156f0d11f",
                                                                              "sha256:f848a534c5dfe59c31c3da34c3d2466bdea7e8da7def4225acdd3ffef1544d2f"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:00.623406883Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:55.918991169Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:57.814850041Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:38:21.443386852Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:38:56.622512308Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:38:57.466949121Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 20 09:14:59 np0005625204.localdomain podman[156239]: 2026-02-20 09:14:59.152231438 +0000 UTC m=+0.079357820 container remove 0e44212bd1127302658fb9dccc5fbbc354b54cd5d0d5fb52bab7f99a38620850 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 20 09:14:59 np0005625204.localdomain python3[156188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Feb 20 09:14:59 np0005625204.localdomain podman[156253]: 
Feb 20 09:14:59 np0005625204.localdomain podman[156253]: 2026-02-20 09:14:59.251059241 +0000 UTC m=+0.081389956 container create 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Feb 20 09:14:59 np0005625204.localdomain podman[156253]: 2026-02-20 09:14:59.212906384 +0000 UTC m=+0.043237129 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 20 09:14:59 np0005625204.localdomain python3[156188]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 20 09:14:59 np0005625204.localdomain sudo[156186]: pam_unix(sudo:session): session closed for user root
Feb 20 09:14:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28621 DF PROTO=TCP SPT=34900 DPT=9105 SEQ=68265424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E72E80000000001030307) 
Feb 20 09:14:59 np0005625204.localdomain sudo[156382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpblzczzoyucbdvetuwzmsgzwkeqcczz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578899.5788527-1649-178586729204583/AnsiballZ_stat.py
Feb 20 09:14:59 np0005625204.localdomain sudo[156382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:14:59 np0005625204.localdomain python3.9[156384]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:14:59 np0005625204.localdomain sudo[156382]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:00 np0005625204.localdomain sudo[156476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbpjwvzitvrmwxyyxkqchibkhgoljevx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578900.2884188-1676-84015199127190/AnsiballZ_file.py
Feb 20 09:15:00 np0005625204.localdomain sudo[156476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:00 np0005625204.localdomain python3.9[156479]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:00 np0005625204.localdomain sudo[156476]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:00 np0005625204.localdomain sudo[156523]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhcjdresjyhuibssdkxgblugwnbojxct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578900.2884188-1676-84015199127190/AnsiballZ_stat.py
Feb 20 09:15:00 np0005625204.localdomain sudo[156523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:01 np0005625204.localdomain python3.9[156525]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:15:01 np0005625204.localdomain sudo[156523]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:01 np0005625204.localdomain sudo[156614]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuzykkiekaojrssevprnsebdjedvviwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578901.1729815-1676-16375319791171/AnsiballZ_copy.py
Feb 20 09:15:01 np0005625204.localdomain sudo[156614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:01 np0005625204.localdomain python3.9[156616]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771578901.1729815-1676-16375319791171/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:01 np0005625204.localdomain sudo[156614]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:02 np0005625204.localdomain sudo[156660]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdiokmdblttvdnjaddjjmvhwcfrkikcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578901.1729815-1676-16375319791171/AnsiballZ_systemd.py
Feb 20 09:15:02 np0005625204.localdomain sudo[156660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:02 np0005625204.localdomain python3.9[156662]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:15:02 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:15:02 np0005625204.localdomain systemd-rc-local-generator[156684]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:15:02 np0005625204.localdomain systemd-sysv-generator[156688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:15:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:15:02 np0005625204.localdomain sudo[156660]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29224 DF PROTO=TCP SPT=52666 DPT=9882 SEQ=753057179 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E7F680000000001030307) 
Feb 20 09:15:02 np0005625204.localdomain sudo[156742]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjbjlxtflqgbupeywnuhfkamkesdnici ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578901.1729815-1676-16375319791171/AnsiballZ_systemd.py
Feb 20 09:15:02 np0005625204.localdomain sudo[156742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:03 np0005625204.localdomain python3.9[156744]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:15:03 np0005625204.localdomain systemd-rc-local-generator[156768]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:15:03 np0005625204.localdomain systemd-sysv-generator[156772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Starting ovn_controller container...
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:15:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8083c7e517bedac32994ed107decec2298ae4369d578f2e10ab7db3bec0cf06e/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:15:03 np0005625204.localdomain podman[156786]: 2026-02-20 09:15:03.792025944 +0000 UTC m=+0.159277873 container init 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:15:03 np0005625204.localdomain ovn_controller[156798]: + sudo -E kolla_set_configs
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:15:03 np0005625204.localdomain podman[156786]: 2026-02-20 09:15:03.837209836 +0000 UTC m=+0.204461765 container start 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:15:03 np0005625204.localdomain edpm-start-podman-container[156786]: ovn_controller
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 20 09:15:03 np0005625204.localdomain systemd[156827]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 20 09:15:03 np0005625204.localdomain edpm-start-podman-container[156785]: Creating additional drop-in dependency for "ovn_controller" (67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f)
Feb 20 09:15:03 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:15:04 np0005625204.localdomain podman[156806]: 2026-02-20 09:15:04.013601249 +0000 UTC m=+0.168734225 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Queued start job for default target Main User Target.
Feb 20 09:15:04 np0005625204.localdomain podman[156806]: 2026-02-20 09:15:04.053193256 +0000 UTC m=+0.208326252 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Created slice User Application Slice.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 20 09:15:04 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 20 09:15:04 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:15:04 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Reached target Paths.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Reached target Timers.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Starting D-Bus User Message Bus Socket...
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Starting Create User's Volatile Files and Directories...
Feb 20 09:15:04 np0005625204.localdomain podman[156806]: unhealthy
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Listening on D-Bus User Message Bus Socket.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Finished Create User's Volatile Files and Directories.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Reached target Sockets.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Reached target Basic System.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Reached target Main User Target.
Feb 20 09:15:04 np0005625204.localdomain systemd[156827]: Startup finished in 125ms.
Feb 20 09:15:04 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:15:04 np0005625204.localdomain systemd-rc-local-generator[156889]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:15:04 np0005625204.localdomain systemd-sysv-generator[156893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: Started User Manager for UID 0.
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: Started ovn_controller container.
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Failed with result 'exit-code'.
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: Started Session c12 of User root.
Feb 20 09:15:04 np0005625204.localdomain sudo[156742]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: INFO:__main__:Validating config file
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: INFO:__main__:Writing out command to execute
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: ++ cat /run_command
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + ARGS=
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + sudo kolla_copy_cacerts
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: Started Session c13 of User root.
Feb 20 09:15:04 np0005625204.localdomain systemd[1]: session-c13.scope: Deactivated successfully.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + [[ ! -n '' ]]
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + . kolla_extend_start
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + umask 0022
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00013|main|INFO|OVS feature set changed, force recompute.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00021|main|INFO|OVS feature set changed, force recompute.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00022|ovn_bfd|INFO|Disabled BFD on interface ovn-2df8cc-0
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00023|ovn_bfd|INFO|Disabled BFD on interface ovn-0c414b-0
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00024|ovn_bfd|INFO|Disabled BFD on interface ovn-2275c3-0
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00025|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00026|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00027|binding|INFO|Claiming lport e7aa8e2a-27a6-452b-906c-21cea166b882 for this chassis.
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00028|binding|INFO|e7aa8e2a-27a6-452b-906c-21cea166b882: Claiming fa:16:3e:b0:ed:d2 192.168.0.140
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00029|binding|INFO|Removing lport e7aa8e2a-27a6-452b-906c-21cea166b882 ovn-installed in OVS
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00030|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00031|ovn_bfd|INFO|Enabled BFD on interface ovn-2df8cc-0
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00032|ovn_bfd|INFO|Enabled BFD on interface ovn-0c414b-0
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00033|ovn_bfd|INFO|Enabled BFD on interface ovn-2275c3-0
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00034|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00035|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00036|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00037|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:04Z|00038|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:05 np0005625204.localdomain python3.9[157000]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:15:05 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:05Z|00039|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:05 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:05Z|00040|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:06 np0005625204.localdomain sudo[157090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiwxfpxknsptpyedwlpcefiyjgmfbagr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578905.7964308-1811-10320456623650/AnsiballZ_stat.py
Feb 20 09:15:06 np0005625204.localdomain sudo[157090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:06 np0005625204.localdomain python3.9[157092]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:06 np0005625204.localdomain sudo[157090]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:06Z|00041|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:15:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27168 DF PROTO=TCP SPT=52160 DPT=9101 SEQ=2015171662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E8CC50000000001030307) 
Feb 20 09:15:06 np0005625204.localdomain sudo[157164]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fratmdzbmpwcgzgxeazhjlxnohcvozfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578905.7964308-1811-10320456623650/AnsiballZ_copy.py
Feb 20 09:15:06 np0005625204.localdomain sudo[157164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:06 np0005625204.localdomain python3.9[157166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578905.7964308-1811-10320456623650/.source.yaml _original_basename=.tyebha5s follow=False checksum=035aea7be6ab20b22f84818c544954f904d1fea8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:06 np0005625204.localdomain sudo[157164]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:07 np0005625204.localdomain sudo[157256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgoobpduhfcxawhzsklyorljvryxghwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578906.997843-1856-240615650418018/AnsiballZ_command.py
Feb 20 09:15:07 np0005625204.localdomain sudo[157256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:07 np0005625204.localdomain python3.9[157258]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:15:07 np0005625204.localdomain ovs-vsctl[157259]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 20 09:15:07 np0005625204.localdomain sudo[157256]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:07 np0005625204.localdomain sudo[157349]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpqwsrjwxflbtisfkyboaazmzoiujgfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578907.7016292-1880-250997742730047/AnsiballZ_command.py
Feb 20 09:15:07 np0005625204.localdomain sudo[157349]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:08 np0005625204.localdomain python3.9[157351]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:15:08 np0005625204.localdomain ovs-vsctl[157353]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 20 09:15:08 np0005625204.localdomain sudo[157349]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:09 np0005625204.localdomain sudo[157444]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htoikuyapdflnhqpoxxramnjlxcswrgf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578908.8236284-1923-168846783668744/AnsiballZ_command.py
Feb 20 09:15:09 np0005625204.localdomain sudo[157444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:09 np0005625204.localdomain python3.9[157446]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:15:09 np0005625204.localdomain ovs-vsctl[157447]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 20 09:15:09 np0005625204.localdomain sudo[157444]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27170 DF PROTO=TCP SPT=52160 DPT=9101 SEQ=2015171662 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598E98E80000000001030307) 
Feb 20 09:15:09 np0005625204.localdomain sshd[150161]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:15:09 np0005625204.localdomain systemd[1]: session-50.scope: Deactivated successfully.
Feb 20 09:15:09 np0005625204.localdomain systemd[1]: session-50.scope: Consumed 42.127s CPU time.
Feb 20 09:15:09 np0005625204.localdomain systemd-logind[759]: Session 50 logged out. Waiting for processes to exit.
Feb 20 09:15:09 np0005625204.localdomain systemd-logind[759]: Removed session 50.
Feb 20 09:15:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28623 DF PROTO=TCP SPT=34900 DPT=9105 SEQ=68265424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EA3680000000001030307) 
Feb 20 09:15:12 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:12Z|00042|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 ovn-installed in OVS
Feb 20 09:15:12 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:12Z|00043|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 up in Southbound
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Activating special unit Exit the Session...
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped target Main User Target.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped target Basic System.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped target Paths.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped target Sockets.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped target Timers.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Closed D-Bus User Message Bus Socket.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Stopped Create User's Volatile Files and Directories.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Removed slice User Application Slice.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Reached target Shutdown.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Finished Exit the Session.
Feb 20 09:15:14 np0005625204.localdomain systemd[156827]: Reached target Exit the Session.
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 20 09:15:14 np0005625204.localdomain sshd[157464]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 20 09:15:14 np0005625204.localdomain sshd[157464]: Accepted publickey for zuul from 192.168.122.30 port 53702 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:15:14 np0005625204.localdomain systemd-logind[759]: New session 52 of user zuul.
Feb 20 09:15:14 np0005625204.localdomain systemd[1]: Started Session 52 of User zuul.
Feb 20 09:15:14 np0005625204.localdomain sshd[157464]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:15:15 np0005625204.localdomain python3.9[157557]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:15:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61125 DF PROTO=TCP SPT=42648 DPT=9100 SEQ=1731224291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EB3690000000001030307) 
Feb 20 09:15:16 np0005625204.localdomain sudo[157651]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvggylbfrqkeuazggxcrfatfcmfkshqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578916.4454005-59-73799380858116/AnsiballZ_file.py
Feb 20 09:15:16 np0005625204.localdomain sudo[157651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:17 np0005625204.localdomain python3.9[157653]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:17 np0005625204.localdomain sudo[157651]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:17 np0005625204.localdomain sudo[157743]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxchvndbfssbrclpjvvaymemhulcxskn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578917.190038-59-129989367373157/AnsiballZ_file.py
Feb 20 09:15:17 np0005625204.localdomain sudo[157743]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:17 np0005625204.localdomain python3.9[157745]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:17 np0005625204.localdomain sudo[157743]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61090 DF PROTO=TCP SPT=53842 DPT=9882 SEQ=3464857739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EB94A0000000001030307) 
Feb 20 09:15:17 np0005625204.localdomain sudo[157835]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yehitmutolcmmatjcpywywwihwralvaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578917.7324698-59-103506784434264/AnsiballZ_file.py
Feb 20 09:15:17 np0005625204.localdomain sudo[157835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:18 np0005625204.localdomain python3.9[157837]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:18 np0005625204.localdomain sudo[157835]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:18 np0005625204.localdomain sudo[157927]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfkghajtzmdzizqiridgtjbrzeddxpji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578918.270941-59-192119935711121/AnsiballZ_file.py
Feb 20 09:15:18 np0005625204.localdomain sudo[157927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:18 np0005625204.localdomain python3.9[157929]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:18 np0005625204.localdomain sudo[157927]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:19 np0005625204.localdomain sudo[158019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byooxflibpgzzmiqtowflzfhponqvjkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578918.8695452-59-44493668639399/AnsiballZ_file.py
Feb 20 09:15:19 np0005625204.localdomain sudo[158019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:19 np0005625204.localdomain python3.9[158021]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:19 np0005625204.localdomain sudo[158019]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:20 np0005625204.localdomain python3.9[158111]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:15:20 np0005625204.localdomain sudo[158201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwoucezeaazeepaociwnwtseizgmdbvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578920.3615215-191-43221515567599/AnsiballZ_seboolean.py
Feb 20 09:15:20 np0005625204.localdomain sudo[158201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61092 DF PROTO=TCP SPT=53842 DPT=9882 SEQ=3464857739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EC5680000000001030307) 
Feb 20 09:15:20 np0005625204.localdomain python3.9[158203]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 20 09:15:21 np0005625204.localdomain sudo[158201]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:21 np0005625204.localdomain python3.9[158293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:22 np0005625204.localdomain python3.9[158367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578921.304057-215-173553106590839/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:23 np0005625204.localdomain python3.9[158457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:24 np0005625204.localdomain python3.9[158530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578923.0118837-261-28122298995605/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:24 np0005625204.localdomain sudo[158620]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdltfdqiqmkiyzdrdmbdeuiabadpbslb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578924.4089162-311-135073841668871/AnsiballZ_setup.py
Feb 20 09:15:24 np0005625204.localdomain sudo[158620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61093 DF PROTO=TCP SPT=53842 DPT=9882 SEQ=3464857739 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598ED5290000000001030307) 
Feb 20 09:15:24 np0005625204.localdomain python3.9[158622]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:15:25 np0005625204.localdomain sudo[158620]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:25 np0005625204.localdomain sudo[158674]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrizzletjenyhzuiksavdipxcwbihwlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578924.4089162-311-135073841668871/AnsiballZ_dnf.py
Feb 20 09:15:25 np0005625204.localdomain sudo[158674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:25 np0005625204.localdomain python3.9[158676]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:15:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2765 DF PROTO=TCP SPT=60428 DPT=9105 SEQ=3726470640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EDFE90000000001030307) 
Feb 20 09:15:29 np0005625204.localdomain sudo[158674]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2766 DF PROTO=TCP SPT=60428 DPT=9105 SEQ=3726470640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EE7E80000000001030307) 
Feb 20 09:15:30 np0005625204.localdomain sudo[158768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyhwczdjffxqonctqxhofakmcbxedvqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578929.5022187-347-126941690193392/AnsiballZ_systemd.py
Feb 20 09:15:30 np0005625204.localdomain sudo[158768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:30 np0005625204.localdomain python3.9[158770]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:15:30 np0005625204.localdomain sudo[158768]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:32 np0005625204.localdomain python3.9[158863]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:32 np0005625204.localdomain python3.9[158934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578931.6544049-371-5781413151881/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35328 DF PROTO=TCP SPT=53118 DPT=9102 SEQ=3089569175 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598EF3680000000001030307) 
Feb 20 09:15:32 np0005625204.localdomain sudo[159024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:15:33 np0005625204.localdomain sudo[159024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:15:33 np0005625204.localdomain sudo[159024]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:33 np0005625204.localdomain sudo[159040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:15:33 np0005625204.localdomain sudo[159040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:15:33 np0005625204.localdomain python3.9[159025]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:33 np0005625204.localdomain python3.9[159140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578932.7195594-371-194159434887944/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:33 np0005625204.localdomain sudo[159040]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:34 np0005625204.localdomain sudo[159195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:15:34 np0005625204.localdomain sudo[159195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:15:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:15:34 np0005625204.localdomain sudo[159195]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:34 np0005625204.localdomain podman[159231]: 2026-02-20 09:15:34.658609342 +0000 UTC m=+0.085493349 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 20 09:15:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:34Z|00044|memory|INFO|19064 kB peak resident set size after 30.2 seconds
Feb 20 09:15:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:34Z|00045|memory|INFO|idl-cells-OVN_Southbound:4072 idl-cells-Open_vSwitch:1045 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:80 lflow-cache-entries-cache-matches:195 lflow-cache-size-KB:348 local_datapath_usage-KB:1 ofctrl_desired_flow_usage-KB:157 ofctrl_installed_flow_usage-KB:114 ofctrl_sb_flow_ref_usage-KB:68
Feb 20 09:15:34 np0005625204.localdomain podman[159231]: 2026-02-20 09:15:34.722252349 +0000 UTC m=+0.149136336 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:15:34 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:15:34 np0005625204.localdomain sshd[159289]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:15:34 np0005625204.localdomain python3.9[159282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:34 np0005625204.localdomain sshd[159289]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:15:35 np0005625204.localdomain python3.9[159361]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578934.4460106-504-83461560113569/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:35 np0005625204.localdomain python3.9[159451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50310 DF PROTO=TCP SPT=36576 DPT=9101 SEQ=1287031837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F01F50000000001030307) 
Feb 20 09:15:36 np0005625204.localdomain python3.9[159522]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578935.503693-504-29905404860244/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:36 np0005625204.localdomain python3.9[159612]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:15:37 np0005625204.localdomain sudo[159704]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urpbfdkynoooovbwkysyhpbcuvsleank ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578937.2627897-618-29896405385778/AnsiballZ_file.py
Feb 20 09:15:37 np0005625204.localdomain sudo[159704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:37 np0005625204.localdomain python3.9[159706]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:37 np0005625204.localdomain sudo[159704]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:38 np0005625204.localdomain sudo[159796]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jiosmryjlunksqrzpvtwinbzhjvlnhdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578938.0660527-642-221164945932965/AnsiballZ_stat.py
Feb 20 09:15:38 np0005625204.localdomain sudo[159796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:38 np0005625204.localdomain python3.9[159798]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:38 np0005625204.localdomain sudo[159796]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:38 np0005625204.localdomain sudo[159844]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhtzwritxgjwwheovubvnjlphpgiizwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578938.0660527-642-221164945932965/AnsiballZ_file.py
Feb 20 09:15:38 np0005625204.localdomain sudo[159844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:38 np0005625204.localdomain python3.9[159846]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:38 np0005625204.localdomain sudo[159844]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50312 DF PROTO=TCP SPT=36576 DPT=9101 SEQ=1287031837 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F0DE80000000001030307) 
Feb 20 09:15:39 np0005625204.localdomain sudo[159936]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-samhuhhdrcsxacqqwwmfgaibyvxlwqcc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578939.0976272-642-103870625972459/AnsiballZ_stat.py
Feb 20 09:15:39 np0005625204.localdomain sudo[159936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:39 np0005625204.localdomain python3.9[159938]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:39 np0005625204.localdomain sudo[159936]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:39 np0005625204.localdomain sudo[159984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjaalgmqjnududxuoptgumgdkcsckhys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578939.0976272-642-103870625972459/AnsiballZ_file.py
Feb 20 09:15:39 np0005625204.localdomain sudo[159984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:40 np0005625204.localdomain python3.9[159986]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:40 np0005625204.localdomain sudo[159984]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:40 np0005625204.localdomain sudo[160076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxmoadoilslxjdaoijrmufmicyzhynuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578940.231018-710-188553121467311/AnsiballZ_file.py
Feb 20 09:15:40 np0005625204.localdomain sudo[160076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:40 np0005625204.localdomain python3.9[160078]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:40 np0005625204.localdomain sudo[160076]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:41 np0005625204.localdomain sudo[160168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txjtuocavbsgnmxkayxasdcdukmgcaxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578940.9363952-735-128693149841486/AnsiballZ_stat.py
Feb 20 09:15:41 np0005625204.localdomain sudo[160168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:41 np0005625204.localdomain python3.9[160170]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:41 np0005625204.localdomain sudo[160168]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:41 np0005625204.localdomain sudo[160216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpemdnbehmjiuhggrsctvjcbujpimlpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578940.9363952-735-128693149841486/AnsiballZ_file.py
Feb 20 09:15:41 np0005625204.localdomain sudo[160216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2768 DF PROTO=TCP SPT=60428 DPT=9105 SEQ=3726470640 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F17680000000001030307) 
Feb 20 09:15:41 np0005625204.localdomain python3.9[160218]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:41 np0005625204.localdomain sudo[160216]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:42 np0005625204.localdomain sudo[160308]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpiudozxxbwkiysgteremsigkehlwsjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578941.9912794-770-100386270013352/AnsiballZ_stat.py
Feb 20 09:15:42 np0005625204.localdomain sudo[160308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:42 np0005625204.localdomain python3.9[160310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:42 np0005625204.localdomain sudo[160308]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:15:42Z|00046|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 20 09:15:42 np0005625204.localdomain sudo[160356]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzpaublbrkkycgulrvsohnlibicrojgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578941.9912794-770-100386270013352/AnsiballZ_file.py
Feb 20 09:15:42 np0005625204.localdomain sudo[160356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:43 np0005625204.localdomain python3.9[160358]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:43 np0005625204.localdomain sudo[160356]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:43 np0005625204.localdomain sudo[160448]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upytqubigbgdksoblwscpmzcitxxjndl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578943.3299382-806-146534802204524/AnsiballZ_systemd.py
Feb 20 09:15:43 np0005625204.localdomain sudo[160448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:43 np0005625204.localdomain python3.9[160450]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:15:43 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:15:44 np0005625204.localdomain systemd-sysv-generator[160478]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:15:44 np0005625204.localdomain systemd-rc-local-generator[160473]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:15:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:15:44 np0005625204.localdomain sudo[160448]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:45 np0005625204.localdomain sudo[160578]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejuwcafqolxemlmwmutdwyslvusjtntq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578944.984367-831-274170590707456/AnsiballZ_stat.py
Feb 20 09:15:45 np0005625204.localdomain sudo[160578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:45 np0005625204.localdomain python3.9[160580]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:45 np0005625204.localdomain sudo[160578]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:45 np0005625204.localdomain sudo[160626]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cisxfbxphiuimddjxqatglyiscmfgmfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578944.984367-831-274170590707456/AnsiballZ_file.py
Feb 20 09:15:45 np0005625204.localdomain sudo[160626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34196 DF PROTO=TCP SPT=35094 DPT=9102 SEQ=887650660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F27680000000001030307) 
Feb 20 09:15:45 np0005625204.localdomain python3.9[160628]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:45 np0005625204.localdomain sudo[160626]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:46 np0005625204.localdomain sudo[160718]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tburcxjkpceqlyewemkmxtrplxuqpyte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578946.0897882-866-91479867778557/AnsiballZ_stat.py
Feb 20 09:15:46 np0005625204.localdomain sudo[160718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:46 np0005625204.localdomain python3.9[160720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:46 np0005625204.localdomain sudo[160718]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:46 np0005625204.localdomain sudo[160766]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrehrbjvueqtnyyewsbgrohlrdvhlmwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578946.0897882-866-91479867778557/AnsiballZ_file.py
Feb 20 09:15:46 np0005625204.localdomain sudo[160766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:46 np0005625204.localdomain python3.9[160768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:47 np0005625204.localdomain sudo[160766]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:47 np0005625204.localdomain sudo[160858]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgxdrfzzcwinzjybjvmawneulnvrciot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578947.1800349-902-254241304298528/AnsiballZ_systemd.py
Feb 20 09:15:47 np0005625204.localdomain sudo[160858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57035 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F2E7A0000000001030307) 
Feb 20 09:15:47 np0005625204.localdomain python3.9[160860]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:15:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:15:47 np0005625204.localdomain systemd-rc-local-generator[160883]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:15:47 np0005625204.localdomain systemd-sysv-generator[160887]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:15:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:15:48 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 09:15:48 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 09:15:48 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 09:15:48 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 09:15:48 np0005625204.localdomain sudo[160858]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:48 np0005625204.localdomain sudo[160993]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsjqdhfpwxsujewfktrtdtmurvmqrxrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578948.5053422-932-272763377949484/AnsiballZ_file.py
Feb 20 09:15:48 np0005625204.localdomain sudo[160993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:48 np0005625204.localdomain python3.9[160995]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:48 np0005625204.localdomain sudo[160993]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:49 np0005625204.localdomain sudo[161085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejfvmxnmaamumhjhluitxucigmrwrhkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578949.110569-956-145697128815745/AnsiballZ_stat.py
Feb 20 09:15:49 np0005625204.localdomain sudo[161085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:49 np0005625204.localdomain python3.9[161087]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:49 np0005625204.localdomain sudo[161085]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:49 np0005625204.localdomain sudo[161158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhhsvmaridsiwufqbrlpyfrzmhkqlaxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578949.110569-956-145697128815745/AnsiballZ_copy.py
Feb 20 09:15:49 np0005625204.localdomain sudo[161158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:50 np0005625204.localdomain python3.9[161160]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771578949.110569-956-145697128815745/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:50 np0005625204.localdomain sudo[161158]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57037 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F3A690000000001030307) 
Feb 20 09:15:50 np0005625204.localdomain sudo[161250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eeukbcbxlqrdsdvjykkewpewziwcmqvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578950.437737-1007-11794743277123/AnsiballZ_file.py
Feb 20 09:15:50 np0005625204.localdomain sudo[161250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:50 np0005625204.localdomain python3.9[161252]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:51 np0005625204.localdomain sudo[161250]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:51 np0005625204.localdomain sudo[161342]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxmxrmsyrrhxhaqfppshsdndmpjlgotd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578951.2088156-1032-92469867937346/AnsiballZ_file.py
Feb 20 09:15:51 np0005625204.localdomain sudo[161342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:51 np0005625204.localdomain python3.9[161344]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:15:51 np0005625204.localdomain sudo[161342]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:52 np0005625204.localdomain sudo[161434]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfugiqqsvsnjgslndhqpwmwgpkqxuyxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578951.8998153-1055-39258560950211/AnsiballZ_stat.py
Feb 20 09:15:52 np0005625204.localdomain sudo[161434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:52 np0005625204.localdomain python3.9[161436]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:15:52 np0005625204.localdomain sudo[161434]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:52 np0005625204.localdomain sudo[161509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djcldeidlsckeepvoimfwnxnbgtcquck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578951.8998153-1055-39258560950211/AnsiballZ_copy.py
Feb 20 09:15:52 np0005625204.localdomain sudo[161509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:52 np0005625204.localdomain python3.9[161511]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578951.8998153-1055-39258560950211/.source.json _original_basename=.qk7skhyr follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:52 np0005625204.localdomain sudo[161509]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:53 np0005625204.localdomain python3.9[161601]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:15:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57038 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F4A280000000001030307) 
Feb 20 09:15:55 np0005625204.localdomain sudo[161852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytgnerdfhluglfsqwatwxfbhwmmdjvrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578955.1970456-1175-100643714636947/AnsiballZ_container_config_data.py
Feb 20 09:15:55 np0005625204.localdomain sudo[161852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:55 np0005625204.localdomain python3.9[161854]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 20 09:15:55 np0005625204.localdomain sudo[161852]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:56 np0005625204.localdomain sudo[161944]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgxvxgvqhornbasemtdyhbtcvjxsexhg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578956.4270976-1208-192080387681744/AnsiballZ_container_config_hash.py
Feb 20 09:15:56 np0005625204.localdomain sudo[161944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:57 np0005625204.localdomain python3.9[161946]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:15:57 np0005625204.localdomain sudo[161944]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11200 DF PROTO=TCP SPT=45326 DPT=9105 SEQ=1923906977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F55280000000001030307) 
Feb 20 09:15:57 np0005625204.localdomain sudo[162036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anizhbenxgghlisyfpdtlrocaqcitxzi ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771578957.4343004-1238-207179576225717/AnsiballZ_edpm_container_manage.py
Feb 20 09:15:57 np0005625204.localdomain sudo[162036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:58 np0005625204.localdomain python3[162038]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:15:58 np0005625204.localdomain python3[162038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8",
                                                                    "Digest": "sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:29:34.446261637Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 785500417,
                                                                    "VirtualSize": 785500417,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc/diff:/var/lib/containers/storage/overlay/33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:d3cc9cdab7e3e7c1a0a6c80e61bbd8cc5eeeba7069bab1cc064ed2e6cc28ed58",
                                                                              "sha256:d5cbf3016eca6267717119e8ebab3c6c083cae6c589c6961ae23bfa93ef3afa4",
                                                                              "sha256:0096ee5d07436ac5b94d9d58b8b2407cc5e6854d70de5e7f89b9a7a1ad4912ad"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:16:21.310836362Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:16:46.153105676Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:23.560707988Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:41.849131913Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.744796961Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.044382348Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:27:49.126765909Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:28:47.079155224Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:28:49.983056567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:28:56.370338178Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:34.44483218Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:34.444891241Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:36.920021505Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:15:58 np0005625204.localdomain podman[162090]: 2026-02-20 09:15:58.412687628 +0000 UTC m=+0.089762231 container remove 8a5c9afc3ad45b38c32e485e6c3ea56e43506a66ceb8ddba96c788cb17701b59 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '684ebb6e94768a0a31a4d8592f0686b3'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 20 09:15:58 np0005625204.localdomain python3[162038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Feb 20 09:15:58 np0005625204.localdomain podman[162103]: 
Feb 20 09:15:58 np0005625204.localdomain podman[162103]: 2026-02-20 09:15:58.523267929 +0000 UTC m=+0.093679764 container create ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:15:58 np0005625204.localdomain podman[162103]: 2026-02-20 09:15:58.47757017 +0000 UTC m=+0.047982105 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:15:58 np0005625204.localdomain python3[162038]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z 
--volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:15:58 np0005625204.localdomain sudo[162036]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:59 np0005625204.localdomain sudo[162232]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuabhjimyuvaezdoxffufcvsbgpmjxll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578958.9859262-1263-6799590397827/AnsiballZ_stat.py
Feb 20 09:15:59 np0005625204.localdomain sudo[162232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:15:59 np0005625204.localdomain python3.9[162234]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:15:59 np0005625204.localdomain sudo[162232]: pam_unix(sudo:session): session closed for user root
Feb 20 09:15:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11201 DF PROTO=TCP SPT=45326 DPT=9105 SEQ=1923906977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F5D280000000001030307) 
Feb 20 09:16:01 np0005625204.localdomain sudo[162326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbcjxdefacinrmecvelxvrryfktfnzdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578960.7470806-1289-278727827448748/AnsiballZ_file.py
Feb 20 09:16:01 np0005625204.localdomain sudo[162326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:01 np0005625204.localdomain python3.9[162328]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:01 np0005625204.localdomain sudo[162326]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:01 np0005625204.localdomain sudo[162372]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-javkzhsmlfdkgpzjzlhcjfenmekeyzpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578960.7470806-1289-278727827448748/AnsiballZ_stat.py
Feb 20 09:16:01 np0005625204.localdomain sudo[162372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:01 np0005625204.localdomain python3.9[162374]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:16:01 np0005625204.localdomain sudo[162372]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:02 np0005625204.localdomain sudo[162463]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wytqhbwqbzqadjkfnexpvyyofvvgywfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578961.7971914-1289-214797618670992/AnsiballZ_copy.py
Feb 20 09:16:02 np0005625204.localdomain sudo[162463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:02 np0005625204.localdomain python3.9[162465]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771578961.7971914-1289-214797618670992/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:02 np0005625204.localdomain sudo[162463]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:02 np0005625204.localdomain sudo[162509]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tibodfripzwqcernplzdiezqwcbvvzoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578961.7971914-1289-214797618670992/AnsiballZ_systemd.py
Feb 20 09:16:02 np0005625204.localdomain sudo[162509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:02 np0005625204.localdomain python3.9[162511]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:16:02 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:16:03 np0005625204.localdomain systemd-sysv-generator[162542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:16:03 np0005625204.localdomain systemd-rc-local-generator[162539]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:16:03 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:03 np0005625204.localdomain sudo[162509]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57039 DF PROTO=TCP SPT=46410 DPT=9882 SEQ=1134785639 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F6B680000000001030307) 
Feb 20 09:16:03 np0005625204.localdomain sudo[162591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axutdxxhtliebrwtytvkfgqhqdqgxbmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578961.7971914-1289-214797618670992/AnsiballZ_systemd.py
Feb 20 09:16:03 np0005625204.localdomain sudo[162591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:03 np0005625204.localdomain python3.9[162593]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:03 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:16:03 np0005625204.localdomain systemd-sysv-generator[162625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:16:03 np0005625204.localdomain systemd-rc-local-generator[162620]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:16:03 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Starting ovn_metadata_agent container...
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: tmp-crun.7PUSCp.mount: Deactivated successfully.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:16:04 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a2e6d660eb1e53ecd61332ea1a4a8a42043dad2af13c4e9eca5a18e5c0fd3f/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 20 09:16:04 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a2e6d660eb1e53ecd61332ea1a4a8a42043dad2af13c4e9eca5a18e5c0fd3f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:16:04 np0005625204.localdomain podman[162635]: 2026-02-20 09:16:04.309887588 +0000 UTC m=+0.150666408 container init ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + sudo -E kolla_set_configs
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:16:04 np0005625204.localdomain podman[162635]: 2026-02-20 09:16:04.356301588 +0000 UTC m=+0.197080348 container start ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:16:04 np0005625204.localdomain edpm-start-podman-container[162635]: ovn_metadata_agent
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Validating config file
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Copying service configuration files
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Writing out command to execute
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: ++ cat /run_command
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + CMD=neutron-ovn-metadata-agent
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + ARGS=
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + sudo kolla_copy_cacerts
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: Running command: 'neutron-ovn-metadata-agent'
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + [[ ! -n '' ]]
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + . kolla_extend_start
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + umask 0022
Feb 20 09:16:04 np0005625204.localdomain ovn_metadata_agent[162647]: + exec neutron-ovn-metadata-agent
Feb 20 09:16:04 np0005625204.localdomain podman[162655]: 2026-02-20 09:16:04.449669042 +0000 UTC m=+0.088737812 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:16:04 np0005625204.localdomain edpm-start-podman-container[162634]: Creating additional drop-in dependency for "ovn_metadata_agent" (ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916)
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:16:04 np0005625204.localdomain podman[162655]: 2026-02-20 09:16:04.531476552 +0000 UTC m=+0.170545312 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:16:04 np0005625204.localdomain systemd-rc-local-generator[162722]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:16:04 np0005625204.localdomain systemd-sysv-generator[162726]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: tmp-crun.uPmUMk.mount: Deactivated successfully.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Started ovn_metadata_agent container.
Feb 20 09:16:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:16:04 np0005625204.localdomain sudo[162591]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:04 np0005625204.localdomain podman[162737]: 2026-02-20 09:16:04.943226062 +0000 UTC m=+0.084458098 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller)
Feb 20 09:16:05 np0005625204.localdomain podman[162737]: 2026-02-20 09:16:05.008224548 +0000 UTC m=+0.149456624 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:16:05 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.927 162652 INFO neutron.common.config [-] Logging enabled!
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.927 162652 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.927 162652 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.928 162652 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.929 162652 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.930 162652 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.931 162652 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.932 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.933 162652 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.934 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.935 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.936 162652 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.937 162652 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.938 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.939 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.940 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.941 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.942 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.943 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.944 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.945 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.946 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.947 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.948 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.949 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.950 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.951 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.952 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.953 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.954 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.955 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.956 162652 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.957 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.958 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.959 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.960 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.961 162652 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.970 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.971 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 20 09:16:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:05.986 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name e6b84e4d-7dff-4c2c-96db-c41e3ef520c6 (UUID: e6b84e4d-7dff-4c2c-96db-c41e3ef520c6) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.000 162652 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.002 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.003 162652 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.011 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: PortBindingCreateWithChassis(events=('create',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:ed:d2 192.168.0.140'], port_security=['fa:16:3e:b0:ed:d2 192.168.0.140'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.140/24', 'neutron:device_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005625204.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de929a91-c460-4398-96e0-15a80685a485', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '91bce661d685472eb3e7cacab17bf52a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '571bc6f6-22b1-4aad-9b70-3481475089c6 dd806cfc-5243-4295-bd9f-cfd9f58a9f1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee1d7cd7-5f4f-4b75-a06c-f37c0ef97c77, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e7aa8e2a-27a6-452b-906c-21cea166b882) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.012 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'e6b84e4d-7dff-4c2c-96db-c41e3ef520c6'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], external_ids={'neutron:ovn-metadata-id': '5583b60b-563a-5b85-8f5f-a322cc499504', 'neutron:ovn-metadata-sb-cfg': '1'}, name=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, nb_cfg_timestamp=1771578913263, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.012 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e7aa8e2a-27a6-452b-906c-21cea166b882 in datapath de929a91-c460-4398-96e0-15a80685a485 bound to our chassis on insert
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.013 162652 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7fbba61c5b50>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.014 162652 INFO oslo_service.service [-] Starting 1 workers
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.016 162652 DEBUG oslo_service.service [-] Started child 162777 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.019 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de929a91-c460-4398-96e0-15a80685a485
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.020 162652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpkw_qn71r/privsep.sock']
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.020 162777 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-383389'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.045 162777 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.045 162777 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.046 162777 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.049 162777 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.052 162777 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.064 162777 INFO eventlet.wsgi.server [-] (162777) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 20 09:16:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2048 DF PROTO=TCP SPT=49734 DPT=9101 SEQ=3173854424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F77250000000001030307) 
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.670 162652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.671 162652 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpkw_qn71r/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.551 162782 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.557 162782 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.561 162782 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.561 162782 INFO oslo.privsep.daemon [-] privsep daemon running as pid 162782
Feb 20 09:16:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:06.674 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fa6b76-f103-4179-83dd-e50cb02b89de]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:07.103 162782 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:16:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:07.103 162782 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:16:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:07.103 162782 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:16:07 np0005625204.localdomain python3.9[162862]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:16:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:07.574 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[497afd09-98b7-4178-a4d5-03a8b33b2c77]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:07.576 162652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp4kurx1fa/privsep.sock']
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.167 162652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.168 162652 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp4kurx1fa/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.048 162915 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.052 162915 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.055 162915 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.055 162915 INFO oslo.privsep.daemon [-] privsep daemon running as pid 162915
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.171 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef5062e-311b-4182-8cb0-cd8f575de127]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:08 np0005625204.localdomain sudo[162962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uonjklnuzejhipsarpghpiqzvbffkakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578967.9937506-1424-172256819390940/AnsiballZ_stat.py
Feb 20 09:16:08 np0005625204.localdomain sudo[162962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:08 np0005625204.localdomain python3.9[162964]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:16:08 np0005625204.localdomain sudo[162962]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.610 162915 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.610 162915 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:16:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:08.611 162915 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:16:08 np0005625204.localdomain sudo[163038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbluuyawltvlghxkeqdhmtnkxnygwcds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578967.9937506-1424-172256819390940/AnsiballZ_copy.py
Feb 20 09:16:08 np0005625204.localdomain sudo[163038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:09 np0005625204.localdomain python3.9[163040]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771578967.9937506-1424-172256819390940/.source.yaml _original_basename=.7diqfqtd follow=False checksum=00f5f1349c1b2f1d82b680e3efe9b7b384555dee backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:09 np0005625204.localdomain sudo[163038]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.079 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[4036496c-845b-4350-a15c-c529a8957301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.082 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[462ca290-4442-49fc-84b4-8dcbe7c61ea7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.102 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[794e88bb-b6e6-44ca-bd44-5575295d349d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.115 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[72ad446d-c238-483c-af44-f9b58f550c1b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde929a91-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:09:c2:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 104, 'tx_packets': 68, 'rx_bytes': 8926, 'tx_bytes': 7143, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], 
['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483664], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637504, 'reachable_time': 41962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 17, 'outoctets': 1164, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 17, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 1164, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 17, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 163060, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.129 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[31fd05b8-49b2-4987-809e-984a2b0b1bf6]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '192.168.0.2'], ['IFA_LOCAL', '192.168.0.2'], ['IFA_BROADCAST', '192.168.0.255'], ['IFA_LABEL', 'tapde929a91-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637514, 'tstamp': 637514}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapde929a91-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637516, 'tstamp': 637516}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::a9fe:a9fe'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637512, 'tstamp': 637512}], ['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 10, 'prefixlen': 64, 'flags': 128, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:c288'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 637504, 'tstamp': 637504}], 
['IFA_FLAGS', 128]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 163061, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.181 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[2b61d1eb-9fa9-4b12-86c4-03df96583e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.183 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde929a91-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.188 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde929a91-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.188 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.189 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde929a91-c0, col_values=(('external_ids', {'iface-id': '3323e11d-576a-42f3-bcca-e10425268e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.190 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.194 162652 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp6c2bt20r/privsep.sock']
Feb 20 09:16:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2050 DF PROTO=TCP SPT=49734 DPT=9101 SEQ=3173854424 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F83280000000001030307) 
Feb 20 09:16:09 np0005625204.localdomain sshd[157464]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:16:09 np0005625204.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Feb 20 09:16:09 np0005625204.localdomain systemd[1]: session-52.scope: Consumed 32.822s CPU time.
Feb 20 09:16:09 np0005625204.localdomain systemd-logind[759]: Session 52 logged out. Waiting for processes to exit.
Feb 20 09:16:09 np0005625204.localdomain systemd-logind[759]: Removed session 52.
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.784 162652 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.786 162652 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp6c2bt20r/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.669 163070 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.675 163070 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.679 163070 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.679 163070 INFO oslo.privsep.daemon [-] privsep daemon running as pid 163070
Feb 20 09:16:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:09.789 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[74590516-1d7b-4131-b7f3-e14068df2bfb]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.237 163070 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.237 163070 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.238 163070 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.697 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[58177eb7-4569-40df-b1b8-d83039047cd2]: (4, ['ovnmeta-de929a91-c460-4398-96e0-15a80685a485']) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.701 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, column=external_ids, values=({'neutron:ovn-metadata-id': '5583b60b-563a-5b85-8f5f-a322cc499504'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.702 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.703 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.712 162652 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.712 162652 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.712 162652 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.713 162652 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.714 162652 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.715 162652 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.716 162652 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.717 162652 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.718 162652 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.719 162652 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.720 162652 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.720 162652 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.720 162652 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.721 162652 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.722 162652 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.723 162652 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.724 162652 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.725 162652 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.726 162652 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.727 162652 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.728 162652 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.729 162652 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.730 162652 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.731 162652 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.732 162652 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.733 162652 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.734 162652 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.735 162652 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.736 162652 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.737 162652 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.738 162652 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.739 162652 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.740 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.741 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.742 162652 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.743 162652 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.744 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.745 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.746 162652 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.747 162652 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.748 162652 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.749 162652 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.750 162652 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.751 162652 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.752 162652 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.753 162652 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.754 162652 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.755 162652 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.756 162652 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.757 162652 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.758 162652 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.759 162652 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.760 162652 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.761 162652 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.762 162652 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.763 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.764 162652 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.765 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.766 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.767 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.768 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.769 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:16:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:16:10.770 162652 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:16:11 np0005625204.localdomain sshd[163075]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:16:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11203 DF PROTO=TCP SPT=45326 DPT=9105 SEQ=1923906977 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F8D680000000001030307) 
Feb 20 09:16:12 np0005625204.localdomain sshd[163075]: Invalid user sol from 45.148.10.240 port 45476
Feb 20 09:16:12 np0005625204.localdomain sshd[163075]: Connection closed by invalid user sol 45.148.10.240 port 45476 [preauth]
Feb 20 09:16:14 np0005625204.localdomain sshd[163077]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:16:14 np0005625204.localdomain sshd[163077]: Accepted publickey for zuul from 192.168.122.30 port 50370 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:16:14 np0005625204.localdomain systemd-logind[759]: New session 53 of user zuul.
Feb 20 09:16:14 np0005625204.localdomain systemd[1]: Started Session 53 of User zuul.
Feb 20 09:16:14 np0005625204.localdomain sshd[163077]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:16:15 np0005625204.localdomain python3.9[163170]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:16:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21837 DF PROTO=TCP SPT=52104 DPT=9100 SEQ=504754750 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598F9D690000000001030307) 
Feb 20 09:16:16 np0005625204.localdomain sudo[163264]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wncmvsxysazrzrzbvljrnostpqprpcfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578976.117203-60-89333197182421/AnsiballZ_command.py
Feb 20 09:16:16 np0005625204.localdomain sudo[163264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:16 np0005625204.localdomain python3.9[163266]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:16 np0005625204.localdomain sudo[163264]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:17 np0005625204.localdomain sudo[163369]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccfzqsggeieyhdkprxoaoxzswmsuiozl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578976.9801588-83-103959451356119/AnsiballZ_command.py
Feb 20 09:16:17 np0005625204.localdomain sudo[163369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:17 np0005625204.localdomain python3.9[163371]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12391 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FA3AA0000000001030307) 
Feb 20 09:16:17 np0005625204.localdomain systemd[1]: libpod-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def.scope: Deactivated successfully.
Feb 20 09:16:17 np0005625204.localdomain podman[163372]: 2026-02-20 09:16:17.758031796 +0000 UTC m=+0.073570094 container died 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Feb 20 09:16:17 np0005625204.localdomain podman[163372]: 2026-02-20 09:16:17.78556363 +0000 UTC m=+0.101101918 container cleanup 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.)
Feb 20 09:16:17 np0005625204.localdomain sudo[163369]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:17 np0005625204.localdomain podman[163385]: 2026-02-20 09:16:17.841121032 +0000 UTC m=+0.077420324 container remove 5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.openshift.expose-services=, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13)
Feb 20 09:16:17 np0005625204.localdomain systemd[1]: libpod-conmon-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def.scope: Deactivated successfully.
Feb 20 09:16:18 np0005625204.localdomain sudo[163491]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxyfbzyzimdqnqektyssizgwllmezqxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578978.0807304-114-24960738990035/AnsiballZ_systemd_service.py
Feb 20 09:16:18 np0005625204.localdomain sudo[163491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0c3d83e6f4e20364b7353f7344121b41304d9d03338ccbc401cf17207dc116b9-merged.mount: Deactivated successfully.
Feb 20 09:16:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5ccae9a2c8a04d03796cbbdf89d19cad856e3ab703b2a028f9abcc07d91c9def-userdata-shm.mount: Deactivated successfully.
Feb 20 09:16:18 np0005625204.localdomain python3.9[163493]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:16:18 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:16:19 np0005625204.localdomain systemd-sysv-generator[163519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:16:19 np0005625204.localdomain systemd-rc-local-generator[163515]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:16:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:19 np0005625204.localdomain sudo[163491]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:20 np0005625204.localdomain python3.9[163619]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:16:20 np0005625204.localdomain network[163636]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:16:20 np0005625204.localdomain network[163637]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:16:20 np0005625204.localdomain network[163638]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:16:21 np0005625204.localdomain sshd[163649]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:16:21 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12393 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FAFA80000000001030307) 
Feb 20 09:16:21 np0005625204.localdomain sshd[163649]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:16:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:24 np0005625204.localdomain sudo[163839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwoastxkviovrzmsrtzryxgvoifsbyao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578984.16693-171-268352366421896/AnsiballZ_systemd_service.py
Feb 20 09:16:24 np0005625204.localdomain sudo[163839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:24 np0005625204.localdomain python3.9[163841]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:24 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:16:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12394 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FBF680000000001030307) 
Feb 20 09:16:24 np0005625204.localdomain systemd-rc-local-generator[163868]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:16:24 np0005625204.localdomain systemd-sysv-generator[163873]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:16:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:25 np0005625204.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Feb 20 09:16:25 np0005625204.localdomain sudo[163839]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:25 np0005625204.localdomain sudo[163971]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzazqehphqvqimmfmytkmtvnbrjhokoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578985.1984081-171-34577339115179/AnsiballZ_systemd_service.py
Feb 20 09:16:25 np0005625204.localdomain sudo[163971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:25 np0005625204.localdomain python3.9[163973]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:25 np0005625204.localdomain sudo[163971]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:26 np0005625204.localdomain sudo[164064]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szrzypwnqqastftddcjugbptjlwpmumy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578985.909492-171-28651335185322/AnsiballZ_systemd_service.py
Feb 20 09:16:26 np0005625204.localdomain sudo[164064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:26 np0005625204.localdomain python3.9[164066]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:26 np0005625204.localdomain sudo[164064]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:27 np0005625204.localdomain sudo[164157]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trpbnunixurjcwafbzgaiyyalxlwzikp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578986.629555-171-95570996282268/AnsiballZ_systemd_service.py
Feb 20 09:16:27 np0005625204.localdomain sudo[164157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:27 np0005625204.localdomain python3.9[164159]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:27 np0005625204.localdomain sudo[164157]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17314 DF PROTO=TCP SPT=39374 DPT=9105 SEQ=4190471795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FCA680000000001030307) 
Feb 20 09:16:27 np0005625204.localdomain sudo[164250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jznyzscsdamundafpzigawbsgsbbebej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578987.5190272-171-257713903436497/AnsiballZ_systemd_service.py
Feb 20 09:16:27 np0005625204.localdomain sudo[164250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:28 np0005625204.localdomain python3.9[164252]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:28 np0005625204.localdomain sudo[164250]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:28 np0005625204.localdomain sudo[164343]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbvilnfikfytblmbncldswqhwrurysog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578988.2339694-171-241717311339756/AnsiballZ_systemd_service.py
Feb 20 09:16:28 np0005625204.localdomain sudo[164343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:28 np0005625204.localdomain python3.9[164345]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17315 DF PROTO=TCP SPT=39374 DPT=9105 SEQ=4190471795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FD2680000000001030307) 
Feb 20 09:16:29 np0005625204.localdomain sudo[164343]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:30 np0005625204.localdomain sudo[164436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqyazdbxhtfknfdifrnmhqcqeychswov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578989.935831-171-73420483917785/AnsiballZ_systemd_service.py
Feb 20 09:16:30 np0005625204.localdomain sudo[164436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:30 np0005625204.localdomain sshd[164439]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:16:30 np0005625204.localdomain sshd[164439]: Invalid user iksi from 18.221.252.160 port 55136
Feb 20 09:16:30 np0005625204.localdomain sshd[164439]: Received disconnect from 18.221.252.160 port 55136:11: Bye Bye [preauth]
Feb 20 09:16:30 np0005625204.localdomain sshd[164439]: Disconnected from invalid user iksi 18.221.252.160 port 55136 [preauth]
Feb 20 09:16:30 np0005625204.localdomain python3.9[164438]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:16:31 np0005625204.localdomain sudo[164436]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:32 np0005625204.localdomain sudo[164531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvcoepmvtccffntfxcqjyhicalvcwhkt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578991.9588168-327-162700709460669/AnsiballZ_file.py
Feb 20 09:16:32 np0005625204.localdomain sudo[164531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:32 np0005625204.localdomain python3.9[164533]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:32 np0005625204.localdomain sudo[164531]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:32 np0005625204.localdomain sudo[164623]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hucipknszocscamqdfyhhtmxbhrtrufj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578992.6997967-327-203407558349757/AnsiballZ_file.py
Feb 20 09:16:32 np0005625204.localdomain sudo[164623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12395 DF PROTO=TCP SPT=55174 DPT=9882 SEQ=2870933854 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FDF680000000001030307) 
Feb 20 09:16:33 np0005625204.localdomain python3.9[164625]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:33 np0005625204.localdomain sudo[164623]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:33 np0005625204.localdomain sshd[164672]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:16:33 np0005625204.localdomain sudo[164717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phudoronppjtdnmhyturnpxtqxokumhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578993.2532449-327-88397914705102/AnsiballZ_file.py
Feb 20 09:16:33 np0005625204.localdomain sudo[164717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:33 np0005625204.localdomain python3.9[164719]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:33 np0005625204.localdomain sudo[164717]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:33 np0005625204.localdomain sshd[164672]: Invalid user n8n from 54.36.99.29 port 59108
Feb 20 09:16:33 np0005625204.localdomain sshd[164672]: Received disconnect from 54.36.99.29 port 59108:11: Bye Bye [preauth]
Feb 20 09:16:33 np0005625204.localdomain sshd[164672]: Disconnected from invalid user n8n 54.36.99.29 port 59108 [preauth]
Feb 20 09:16:34 np0005625204.localdomain sudo[164809]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsfakajgditoyqkmcinupdeenudnffcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578993.8612237-327-224646070764926/AnsiballZ_file.py
Feb 20 09:16:34 np0005625204.localdomain sudo[164809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:34 np0005625204.localdomain python3.9[164811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:34 np0005625204.localdomain sudo[164809]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:34 np0005625204.localdomain sudo[164908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luyqtmlmfpvrtreekmdbuthyxxubgniw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578994.409361-327-229476659715205/AnsiballZ_file.py
Feb 20 09:16:34 np0005625204.localdomain sudo[164908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:34 np0005625204.localdomain sudo[164896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:16:34 np0005625204.localdomain sudo[164896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:16:34 np0005625204.localdomain sudo[164896]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:34 np0005625204.localdomain sudo[164919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:16:34 np0005625204.localdomain sudo[164919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:16:34 np0005625204.localdomain python3.9[164917]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:34 np0005625204.localdomain sudo[164908]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:16:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:16:35 np0005625204.localdomain podman[164993]: 2026-02-20 09:16:35.150103725 +0000 UTC m=+0.081245385 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:16:35 np0005625204.localdomain podman[164993]: 2026-02-20 09:16:35.158516167 +0000 UTC m=+0.089657827 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:16:35 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:16:35 np0005625204.localdomain podman[164992]: 2026-02-20 09:16:35.204975998 +0000 UTC m=+0.136723276 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 20 09:16:35 np0005625204.localdomain podman[164992]: 2026-02-20 09:16:35.242050098 +0000 UTC m=+0.173797376 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:16:35 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:16:35 np0005625204.localdomain sudo[165081]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rivoyhsblfvauqpmrqykvyapbchypumr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578994.9966054-327-133863031945432/AnsiballZ_file.py
Feb 20 09:16:35 np0005625204.localdomain sudo[165081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:35 np0005625204.localdomain sudo[164919]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:35 np0005625204.localdomain python3.9[165087]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:35 np0005625204.localdomain sudo[165081]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:35 np0005625204.localdomain sudo[165189]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkgnjxqzhwztxqjbpsxavykyntrpysbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578995.5804448-327-245206794055229/AnsiballZ_file.py
Feb 20 09:16:35 np0005625204.localdomain sudo[165189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:36 np0005625204.localdomain python3.9[165191]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:36 np0005625204.localdomain sudo[165189]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:36 np0005625204.localdomain sudo[165192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:16:36 np0005625204.localdomain sudo[165192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:16:36 np0005625204.localdomain sudo[165192]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28772 DF PROTO=TCP SPT=58014 DPT=9101 SEQ=813715877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FEC550000000001030307) 
Feb 20 09:16:36 np0005625204.localdomain sudo[165296]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsfbyqxqlazipsybykpuhuphcridcmzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578996.3022397-477-43448543895593/AnsiballZ_file.py
Feb 20 09:16:36 np0005625204.localdomain sudo[165296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:36 np0005625204.localdomain python3.9[165298]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:36 np0005625204.localdomain sudo[165296]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:37 np0005625204.localdomain sudo[165388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfzizrougagidmoksqtfhsatwlroazck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578996.877729-477-158630007360891/AnsiballZ_file.py
Feb 20 09:16:37 np0005625204.localdomain sudo[165388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:37 np0005625204.localdomain python3.9[165390]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:37 np0005625204.localdomain sudo[165388]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:37 np0005625204.localdomain sudo[165480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktnpxwdpwwvbilxdqzgngdqunnwvifki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578997.5069869-477-29961381103196/AnsiballZ_file.py
Feb 20 09:16:37 np0005625204.localdomain sudo[165480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:37 np0005625204.localdomain python3.9[165482]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:37 np0005625204.localdomain sudo[165480]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:38 np0005625204.localdomain sudo[165572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goshgzcydbaqzickaubvpmnfmtsvezmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578998.1251447-477-44370947609896/AnsiballZ_file.py
Feb 20 09:16:38 np0005625204.localdomain sudo[165572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:38 np0005625204.localdomain python3.9[165574]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:38 np0005625204.localdomain sudo[165572]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:39 np0005625204.localdomain sudo[165664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axctmzwwmumymyoqqrievjrazvwnwssc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578998.7130082-477-57010313536014/AnsiballZ_file.py
Feb 20 09:16:39 np0005625204.localdomain sudo[165664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:39 np0005625204.localdomain python3.9[165666]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:39 np0005625204.localdomain sudo[165664]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28774 DF PROTO=TCP SPT=58014 DPT=9101 SEQ=813715877 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A598FF8690000000001030307) 
Feb 20 09:16:39 np0005625204.localdomain sudo[165756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncegbpkyzkmkwpycmshmdwmjzvsettnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578999.3392565-477-82622784861887/AnsiballZ_file.py
Feb 20 09:16:39 np0005625204.localdomain sudo[165756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:39 np0005625204.localdomain python3.9[165758]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:39 np0005625204.localdomain sudo[165756]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:40 np0005625204.localdomain sudo[165848]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxrmnvoyjsytcekliyguibudillamhyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771578999.9042099-477-5119630257600/AnsiballZ_file.py
Feb 20 09:16:40 np0005625204.localdomain sudo[165848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:40 np0005625204.localdomain python3.9[165850]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:16:40 np0005625204.localdomain sudo[165848]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:40 np0005625204.localdomain sudo[165940]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iylssrdcszvzjhrnjlhsatmknyjhgqxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579000.6627574-630-82115433884216/AnsiballZ_command.py
Feb 20 09:16:40 np0005625204.localdomain sudo[165940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:41 np0005625204.localdomain python3.9[165942]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:41 np0005625204.localdomain sudo[165940]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:41 np0005625204.localdomain python3.9[166034]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:16:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17317 DF PROTO=TCP SPT=39374 DPT=9105 SEQ=4190471795 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599003680000000001030307) 
Feb 20 09:16:42 np0005625204.localdomain sudo[166124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbtqaeccygezkvkxceontswvelrhqtmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579002.2126863-684-134206553040034/AnsiballZ_systemd_service.py
Feb 20 09:16:42 np0005625204.localdomain sudo[166124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:42 np0005625204.localdomain python3.9[166126]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:16:42 np0005625204.localdomain sshd[166127]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:16:42 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:16:42 np0005625204.localdomain systemd-rc-local-generator[166152]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:16:42 np0005625204.localdomain systemd-sysv-generator[166159]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:16:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:16:43 np0005625204.localdomain sudo[166124]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:43 np0005625204.localdomain sudo[166254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyvrwqxnjxeroukpehzvdewpjddyoknn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579003.504834-708-71396359817177/AnsiballZ_command.py
Feb 20 09:16:43 np0005625204.localdomain sudo[166254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:43 np0005625204.localdomain python3.9[166256]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:43 np0005625204.localdomain sudo[166254]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:44 np0005625204.localdomain sshd[166127]: Invalid user systemd from 27.112.79.3 port 42334
Feb 20 09:16:44 np0005625204.localdomain sudo[166347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keqtkgnatxuphxzsrxveqwjgpbpxrqdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579004.1027217-708-8264433851566/AnsiballZ_command.py
Feb 20 09:16:44 np0005625204.localdomain sudo[166347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:44 np0005625204.localdomain sshd[166127]: Received disconnect from 27.112.79.3 port 42334:11: Bye Bye [preauth]
Feb 20 09:16:44 np0005625204.localdomain sshd[166127]: Disconnected from invalid user systemd 27.112.79.3 port 42334 [preauth]
Feb 20 09:16:44 np0005625204.localdomain python3.9[166349]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:44 np0005625204.localdomain sudo[166347]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:44 np0005625204.localdomain sudo[166440]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-veenltaebjwirovhhmzangnlcxqopmcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579004.689132-708-221540968356450/AnsiballZ_command.py
Feb 20 09:16:44 np0005625204.localdomain sudo[166440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:45 np0005625204.localdomain python3.9[166442]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:45 np0005625204.localdomain sudo[166440]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:45 np0005625204.localdomain sudo[166533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsvmxlqfypghutiajwrlmmiyiroywkcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579005.2333064-708-279647833697845/AnsiballZ_command.py
Feb 20 09:16:45 np0005625204.localdomain sudo[166533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:45 np0005625204.localdomain python3.9[166535]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:45 np0005625204.localdomain sudo[166533]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8053 DF PROTO=TCP SPT=33848 DPT=9102 SEQ=2895410563 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599011680000000001030307) 
Feb 20 09:16:46 np0005625204.localdomain sudo[166626]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hivdyznzzhcdiugslapeporpknqklywi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579005.8115697-708-102106223454809/AnsiballZ_command.py
Feb 20 09:16:46 np0005625204.localdomain sudo[166626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:46 np0005625204.localdomain python3.9[166628]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:46 np0005625204.localdomain sudo[166626]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:46 np0005625204.localdomain sudo[166719]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvvfiittyhtwvoflsxqargkmebeggovp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579006.4034648-708-175736996060096/AnsiballZ_command.py
Feb 20 09:16:46 np0005625204.localdomain sudo[166719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:46 np0005625204.localdomain python3.9[166721]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:46 np0005625204.localdomain sudo[166719]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:47 np0005625204.localdomain sudo[166812]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfplaaoqigwjxjstsjwqtnlfvprnciek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579006.970949-708-166218845124973/AnsiballZ_command.py
Feb 20 09:16:47 np0005625204.localdomain sudo[166812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:47 np0005625204.localdomain python3.9[166814]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:16:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25090 DF PROTO=TCP SPT=47008 DPT=9882 SEQ=3729068539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599018DA0000000001030307) 
Feb 20 09:16:48 np0005625204.localdomain sudo[166812]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:49 np0005625204.localdomain sudo[166905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmetjxkilplxwtmqoywwexrkqrrybnkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579009.0962956-870-97565872908537/AnsiballZ_getent.py
Feb 20 09:16:49 np0005625204.localdomain sudo[166905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:49 np0005625204.localdomain python3.9[166907]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 20 09:16:49 np0005625204.localdomain sudo[166905]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:50 np0005625204.localdomain sudo[166998]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbrhnhxtxjhxlwiqrphggaagzroubyhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579009.9028745-893-146918840165144/AnsiballZ_group.py
Feb 20 09:16:50 np0005625204.localdomain sudo[166998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:50 np0005625204.localdomain python3.9[167000]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 20 09:16:50 np0005625204.localdomain groupadd[167001]: group added to /etc/group: name=libvirt, GID=42473
Feb 20 09:16:50 np0005625204.localdomain groupadd[167001]: group added to /etc/gshadow: name=libvirt
Feb 20 09:16:50 np0005625204.localdomain groupadd[167001]: new group: name=libvirt, GID=42473
Feb 20 09:16:50 np0005625204.localdomain sudo[166998]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25092 DF PROTO=TCP SPT=47008 DPT=9882 SEQ=3729068539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599024E80000000001030307) 
Feb 20 09:16:51 np0005625204.localdomain sudo[167096]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxgfhmxcgzmfkjhfjsuiqrylitsxvixv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579010.793864-917-270935542979223/AnsiballZ_user.py
Feb 20 09:16:51 np0005625204.localdomain sudo[167096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:51 np0005625204.localdomain python3.9[167098]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 20 09:16:51 np0005625204.localdomain useradd[167100]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/1
Feb 20 09:16:51 np0005625204.localdomain sudo[167096]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:52 np0005625204.localdomain sudo[167196]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txtswqkdlbapwbibatiwdygliycfhxza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579012.0082452-950-18321020353031/AnsiballZ_setup.py
Feb 20 09:16:52 np0005625204.localdomain sudo[167196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:52 np0005625204.localdomain python3.9[167198]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:16:52 np0005625204.localdomain sudo[167196]: pam_unix(sudo:session): session closed for user root
Feb 20 09:16:53 np0005625204.localdomain sudo[167250]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqhprpigzqjpqlmrwqdfgapbgrihgtud ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579012.0082452-950-18321020353031/AnsiballZ_dnf.py
Feb 20 09:16:53 np0005625204.localdomain sudo[167250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:16:53 np0005625204.localdomain python3.9[167252]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:16:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25093 DF PROTO=TCP SPT=47008 DPT=9882 SEQ=3729068539 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599034A80000000001030307) 
Feb 20 09:16:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53974 DF PROTO=TCP SPT=42128 DPT=9105 SEQ=3538631678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59903FA80000000001030307) 
Feb 20 09:16:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53975 DF PROTO=TCP SPT=42128 DPT=9105 SEQ=3538631678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599047A80000000001030307) 
Feb 20 09:17:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28627 DF PROTO=TCP SPT=54526 DPT=9100 SEQ=1245713767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599053680000000001030307) 
Feb 20 09:17:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:17:05.973 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:17:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:17:05.975 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:17:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:17:05.977 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:17:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:17:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:17:06 np0005625204.localdomain podman[167324]: 2026-02-20 09:17:06.14360864 +0000 UTC m=+0.069570630 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:17:06 np0005625204.localdomain podman[167324]: 2026-02-20 09:17:06.147342134 +0000 UTC m=+0.073304084 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:17:06 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:17:06 np0005625204.localdomain podman[167323]: 2026-02-20 09:17:06.213038664 +0000 UTC m=+0.137042283 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:17:06 np0005625204.localdomain podman[167323]: 2026-02-20 09:17:06.247207749 +0000 UTC m=+0.171211368 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:17:06 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:17:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41854 DF PROTO=TCP SPT=45660 DPT=9101 SEQ=3022912987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599061850000000001030307) 
Feb 20 09:17:07 np0005625204.localdomain sshd[167364]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:17:07 np0005625204.localdomain sshd[167364]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:17:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41856 DF PROTO=TCP SPT=45660 DPT=9101 SEQ=3022912987 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59906DA90000000001030307) 
Feb 20 09:17:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53977 DF PROTO=TCP SPT=42128 DPT=9105 SEQ=3538631678 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599077680000000001030307) 
Feb 20 09:17:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22578 DF PROTO=TCP SPT=42162 DPT=9100 SEQ=3643579773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599087680000000001030307) 
Feb 20 09:17:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19322 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59908E0B0000000001030307) 
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  Converting 2759 SID table entries...
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:17:18 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:17:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19324 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59909A280000000001030307) 
Feb 20 09:17:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19325 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990A9E80000000001030307) 
Feb 20 09:17:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30192 DF PROTO=TCP SPT=41982 DPT=9105 SEQ=1281184047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990B4A80000000001030307) 
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  Converting 2762 SID table entries...
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:17:28 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:17:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30193 DF PROTO=TCP SPT=41982 DPT=9105 SEQ=1281184047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990BCA80000000001030307) 
Feb 20 09:17:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19326 DF PROTO=TCP SPT=53742 DPT=9882 SEQ=3103143237 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990C9680000000001030307) 
Feb 20 09:17:36 np0005625204.localdomain sudo[168449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:17:36 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Feb 20 09:17:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:17:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:17:36 np0005625204.localdomain sudo[168449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:17:36 np0005625204.localdomain sudo[168449]: pam_unix(sudo:session): session closed for user root
Feb 20 09:17:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27263 DF PROTO=TCP SPT=35370 DPT=9101 SEQ=3360427931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990D6B50000000001030307) 
Feb 20 09:17:36 np0005625204.localdomain sudo[168479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:17:36 np0005625204.localdomain sudo[168479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:17:36 np0005625204.localdomain systemd[1]: tmp-crun.rA68cv.mount: Deactivated successfully.
Feb 20 09:17:36 np0005625204.localdomain systemd[1]: tmp-crun.8zysPZ.mount: Deactivated successfully.
Feb 20 09:17:36 np0005625204.localdomain podman[168467]: 2026-02-20 09:17:36.425199329 +0000 UTC m=+0.150949712 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:17:36 np0005625204.localdomain podman[168466]: 2026-02-20 09:17:36.389159556 +0000 UTC m=+0.115247230 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:17:36 np0005625204.localdomain podman[168466]: 2026-02-20 09:17:36.472077537 +0000 UTC m=+0.198165181 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:17:36 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:17:36 np0005625204.localdomain podman[168467]: 2026-02-20 09:17:36.501943169 +0000 UTC m=+0.227693542 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:17:36 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:17:36 np0005625204.localdomain sudo[168479]: pam_unix(sudo:session): session closed for user root
Feb 20 09:17:37 np0005625204.localdomain sudo[168563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:17:37 np0005625204.localdomain sudo[168563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:17:37 np0005625204.localdomain sudo[168563]: pam_unix(sudo:session): session closed for user root
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  Converting 2765 SID table entries...
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:17:38 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:17:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27265 DF PROTO=TCP SPT=35370 DPT=9101 SEQ=3360427931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990E2A80000000001030307) 
Feb 20 09:17:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30195 DF PROTO=TCP SPT=41982 DPT=9105 SEQ=1281184047 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990ED680000000001030307) 
Feb 20 09:17:44 np0005625204.localdomain sshd[168588]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:17:45 np0005625204.localdomain sshd[168588]: Invalid user systemd from 96.78.175.36 port 55344
Feb 20 09:17:45 np0005625204.localdomain sshd[168588]: Received disconnect from 96.78.175.36 port 55344:11: Bye Bye [preauth]
Feb 20 09:17:45 np0005625204.localdomain sshd[168588]: Disconnected from invalid user systemd 96.78.175.36 port 55344 [preauth]
Feb 20 09:17:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22069 DF PROTO=TCP SPT=35078 DPT=9100 SEQ=282675773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5990FD680000000001030307) 
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  Converting 2765 SID table entries...
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:17:46 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:17:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:17:47 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=22 res=1
Feb 20 09:17:47 np0005625204.localdomain systemd-rc-local-generator[168619]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:17:47 np0005625204.localdomain systemd-sysv-generator[168627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:17:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:17:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29435 DF PROTO=TCP SPT=34378 DPT=9882 SEQ=3238976354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991033A0000000001030307) 
Feb 20 09:17:47 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:17:47 np0005625204.localdomain systemd-rc-local-generator[168659]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:17:47 np0005625204.localdomain systemd-sysv-generator[168664]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:17:47 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:17:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29437 DF PROTO=TCP SPT=34378 DPT=9882 SEQ=3238976354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59910F290000000001030307) 
Feb 20 09:17:53 np0005625204.localdomain sshd[168679]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:17:53 np0005625204.localdomain sshd[168679]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:17:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29438 DF PROTO=TCP SPT=34378 DPT=9882 SEQ=3238976354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59911EE90000000001030307) 
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  Converting 2766 SID table entries...
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 20 09:17:56 np0005625204.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 20 09:17:57 np0005625204.localdomain groupadd[168691]: group added to /etc/group: name=clevis, GID=985
Feb 20 09:17:57 np0005625204.localdomain groupadd[168691]: group added to /etc/gshadow: name=clevis
Feb 20 09:17:57 np0005625204.localdomain groupadd[168691]: new group: name=clevis, GID=985
Feb 20 09:17:57 np0005625204.localdomain useradd[168698]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 20 09:17:57 np0005625204.localdomain usermod[168708]: add 'clevis' to group 'tss'
Feb 20 09:17:57 np0005625204.localdomain usermod[168708]: add 'clevis' to shadow group 'tss'
Feb 20 09:17:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39582 DF PROTO=TCP SPT=59270 DPT=9105 SEQ=1688412850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599129E80000000001030307) 
Feb 20 09:17:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39583 DF PROTO=TCP SPT=59270 DPT=9105 SEQ=1688412850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599131E80000000001030307) 
Feb 20 09:18:00 np0005625204.localdomain groupadd[168733]: group added to /etc/group: name=dnsmasq, GID=984
Feb 20 09:18:00 np0005625204.localdomain groupadd[168733]: group added to /etc/gshadow: name=dnsmasq
Feb 20 09:18:00 np0005625204.localdomain groupadd[168733]: new group: name=dnsmasq, GID=984
Feb 20 09:18:00 np0005625204.localdomain useradd[168740]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 20 09:18:00 np0005625204.localdomain dbus-broker-launch[750]: Noticed file-system modification, trigger reload.
Feb 20 09:18:00 np0005625204.localdomain dbus-broker-launch[754]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Reloading rules
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Collecting garbage unconditionally...
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Loading rules from directory /etc/polkit-1/rules.d
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Finished loading, compiling and executing 5 rules
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Reloading rules
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Collecting garbage unconditionally...
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Loading rules from directory /etc/polkit-1/rules.d
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 20 09:18:01 np0005625204.localdomain polkitd[1035]: Finished loading, compiling and executing 5 rules
Feb 20 09:18:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28096 DF PROTO=TCP SPT=39276 DPT=9102 SEQ=3486674986 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59913D680000000001030307) 
Feb 20 09:18:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:18:05.973 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:18:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:18:05.974 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:18:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:18:05.976 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:18:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27872 DF PROTO=TCP SPT=50366 DPT=9101 SEQ=3142144691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59914BE50000000001030307) 
Feb 20 09:18:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:18:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:18:07 np0005625204.localdomain systemd[1]: tmp-crun.bkPpJq.mount: Deactivated successfully.
Feb 20 09:18:07 np0005625204.localdomain podman[168923]: 2026-02-20 09:18:07.171523127 +0000 UTC m=+0.094366433 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 20 09:18:07 np0005625204.localdomain podman[168922]: 2026-02-20 09:18:07.221410884 +0000 UTC m=+0.145938312 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Feb 20 09:18:07 np0005625204.localdomain podman[168923]: 2026-02-20 09:18:07.252695737 +0000 UTC m=+0.175539053 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:18:07 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:18:07 np0005625204.localdomain podman[168922]: 2026-02-20 09:18:07.269047924 +0000 UTC m=+0.193575392 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:18:07 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:18:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27874 DF PROTO=TCP SPT=50366 DPT=9101 SEQ=3142144691 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599157E80000000001030307) 
Feb 20 09:18:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39585 DF PROTO=TCP SPT=59270 DPT=9105 SEQ=1688412850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599161680000000001030307) 
Feb 20 09:18:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45680 DF PROTO=TCP SPT=36640 DPT=9102 SEQ=1733988649 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599171680000000001030307) 
Feb 20 09:18:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58737 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599178800000000001030307) 
Feb 20 09:18:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58739 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599184A90000000001030307) 
Feb 20 09:18:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58740 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599194680000000001030307) 
Feb 20 09:18:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48292 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=2261576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59919F280000000001030307) 
Feb 20 09:18:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48293 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=2261576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991A7280000000001030307) 
Feb 20 09:18:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58741 DF PROTO=TCP SPT=39882 DPT=9882 SEQ=2583904414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991B5680000000001030307) 
Feb 20 09:18:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25492 DF PROTO=TCP SPT=41030 DPT=9101 SEQ=1941935264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991C1150000000001030307) 
Feb 20 09:18:37 np0005625204.localdomain sudo[186020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:18:37 np0005625204.localdomain sudo[186020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:18:37 np0005625204.localdomain sudo[186020]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:18:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:18:37 np0005625204.localdomain sudo[186050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:18:37 np0005625204.localdomain sudo[186050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:18:37 np0005625204.localdomain podman[186038]: 2026-02-20 09:18:37.97060908 +0000 UTC m=+0.090496935 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:18:38 np0005625204.localdomain podman[186039]: 2026-02-20 09:18:38.018671332 +0000 UTC m=+0.134977078 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:18:38 np0005625204.localdomain podman[186039]: 2026-02-20 09:18:38.028954495 +0000 UTC m=+0.145260201 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true)
Feb 20 09:18:38 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:18:38 np0005625204.localdomain podman[186038]: 2026-02-20 09:18:38.084085483 +0000 UTC m=+0.203973378 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:18:38 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:18:38 np0005625204.localdomain sshd[186115]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:18:38 np0005625204.localdomain sshd[186115]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:18:38 np0005625204.localdomain sudo[186050]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:38 np0005625204.localdomain groupadd[186135]: group added to /etc/group: name=ceph, GID=167
Feb 20 09:18:38 np0005625204.localdomain groupadd[186135]: group added to /etc/gshadow: name=ceph
Feb 20 09:18:38 np0005625204.localdomain groupadd[186135]: new group: name=ceph, GID=167
Feb 20 09:18:38 np0005625204.localdomain useradd[186141]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 20 09:18:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25494 DF PROTO=TCP SPT=41030 DPT=9101 SEQ=1941935264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991CD280000000001030307) 
Feb 20 09:18:39 np0005625204.localdomain sudo[186148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:18:39 np0005625204.localdomain sudo[186148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:18:39 np0005625204.localdomain sudo[186148]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48295 DF PROTO=TCP SPT=60390 DPT=9105 SEQ=2261576420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991D7680000000001030307) 
Feb 20 09:18:42 np0005625204.localdomain sshd[121278]: Received signal 15; terminating.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: sshd.service: Consumed 2.379s CPU time, read 32.0K from disk, written 0B to disk.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 20 09:18:42 np0005625204.localdomain sshd[186832]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:18:42 np0005625204.localdomain sshd[186832]: Server listening on 0.0.0.0 port 22.
Feb 20 09:18:42 np0005625204.localdomain sshd[186832]: Server listening on :: port 22.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:43 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:44 np0005625204.localdomain systemd-sysv-generator[187084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:44 np0005625204.localdomain systemd-rc-local-generator[187078]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 09:18:44 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:18:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34022 DF PROTO=TCP SPT=49436 DPT=9102 SEQ=3458103529 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991E7680000000001030307) 
Feb 20 09:18:47 np0005625204.localdomain sudo[167250]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37974 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991ED9A0000000001030307) 
Feb 20 09:18:48 np0005625204.localdomain sudo[191881]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aakxrzrgcnjxvmwrwynksjrfykwnbyeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579127.677107-986-6421438309472/AnsiballZ_systemd.py
Feb 20 09:18:48 np0005625204.localdomain sudo[191881]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:48 np0005625204.localdomain python3.9[191926]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:48 np0005625204.localdomain systemd-sysv-generator[192344]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:48 np0005625204.localdomain systemd-rc-local-generator[192338]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:48 np0005625204.localdomain sudo[191881]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:49 np0005625204.localdomain sudo[193010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvhbmfmxgudfqmmyojtoyjqjtcenhuhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579129.3197727-986-221075801477148/AnsiballZ_systemd.py
Feb 20 09:18:49 np0005625204.localdomain sudo[193010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:49 np0005625204.localdomain sshd[193064]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:18:49 np0005625204.localdomain python3.9[193012]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:18:49 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:50 np0005625204.localdomain systemd-sysv-generator[193158]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:50 np0005625204.localdomain systemd-rc-local-generator[193154]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:50 np0005625204.localdomain sshd[193064]: Invalid user sol from 45.148.10.240 port 42520
Feb 20 09:18:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37976 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5991F9A80000000001030307) 
Feb 20 09:18:50 np0005625204.localdomain sshd[193064]: Connection closed by invalid user sol 45.148.10.240 port 42520 [preauth]
Feb 20 09:18:51 np0005625204.localdomain sudo[193010]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:51 np0005625204.localdomain sudo[194069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbybwnwtakanjpdiicaayvdssfuzfniq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579131.3685062-986-176000074701300/AnsiballZ_systemd.py
Feb 20 09:18:51 np0005625204.localdomain sudo[194069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:51 np0005625204.localdomain python3.9[194090]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:52 np0005625204.localdomain systemd-sysv-generator[194279]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:52 np0005625204.localdomain systemd-rc-local-generator[194276]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:52 np0005625204.localdomain sudo[194069]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:52 np0005625204.localdomain sudo[194761]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqpeystajgeffprbrnguksqrvivbjgxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579132.528586-986-160018299068348/AnsiballZ_systemd.py
Feb 20 09:18:52 np0005625204.localdomain sudo[194761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:53 np0005625204.localdomain python3.9[194776]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:53 np0005625204.localdomain systemd-sysv-generator[195002]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:53 np0005625204.localdomain systemd-rc-local-generator[194997]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:53 np0005625204.localdomain sudo[194761]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:54 np0005625204.localdomain sudo[195819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdckstripnfdcuqhnibesewrtvfilstd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579134.4324396-1073-209004743789156/AnsiballZ_systemd.py
Feb 20 09:18:54 np0005625204.localdomain sudo[195819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37977 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599209690000000001030307) 
Feb 20 09:18:55 np0005625204.localdomain python3.9[195845]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:55 np0005625204.localdomain systemd-sysv-generator[196085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:55 np0005625204.localdomain systemd-rc-local-generator[196080]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:55 np0005625204.localdomain sudo[195819]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:55 np0005625204.localdomain sudo[196400]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wanykeqamrsekolibthmclknkbqjnwhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579135.5402877-1073-239067811151012/AnsiballZ_systemd.py
Feb 20 09:18:55 np0005625204.localdomain sudo[196400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Consumed 13.952s CPU time.
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: run-r2a2a39551e3b4b5b9c3115cb5b5ff1b8.service: Deactivated successfully.
Feb 20 09:18:55 np0005625204.localdomain systemd[1]: run-rbcb2b767c39941cf8bc81e61264cc9b0.service: Deactivated successfully.
Feb 20 09:18:56 np0005625204.localdomain python3.9[196417]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:56 np0005625204.localdomain systemd-sysv-generator[196457]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:56 np0005625204.localdomain systemd-rc-local-generator[196451]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:56 np0005625204.localdomain sudo[196400]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:56 np0005625204.localdomain sudo[196572]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrnvhkqhfkgblrrdfopwvwrdplrtyksn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579136.691451-1073-236619858094230/AnsiballZ_systemd.py
Feb 20 09:18:56 np0005625204.localdomain sudo[196572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:57 np0005625204.localdomain python3.9[196574]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:18:57 np0005625204.localdomain systemd-sysv-generator[196608]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:18:57 np0005625204.localdomain systemd-rc-local-generator[196603]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:18:57 np0005625204.localdomain sudo[196572]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39099 DF PROTO=TCP SPT=44092 DPT=9105 SEQ=2269906091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599214690000000001030307) 
Feb 20 09:18:57 np0005625204.localdomain sudo[196721]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdrinaorwaaewwdpaejwsmdhbtuahlfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579137.68894-1073-230008127566639/AnsiballZ_systemd.py
Feb 20 09:18:57 np0005625204.localdomain sudo[196721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:58 np0005625204.localdomain python3.9[196723]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:18:58 np0005625204.localdomain sudo[196721]: pam_unix(sudo:session): session closed for user root
Feb 20 09:18:58 np0005625204.localdomain sudo[196834]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgqaquvtbnzylkqxyowxochmcjimdnef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579138.4295828-1073-116189758598759/AnsiballZ_systemd.py
Feb 20 09:18:58 np0005625204.localdomain sudo[196834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:18:59 np0005625204.localdomain python3.9[196836]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:18:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39100 DF PROTO=TCP SPT=44092 DPT=9105 SEQ=2269906091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59921C690000000001030307) 
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:19:00 np0005625204.localdomain systemd-sysv-generator[196870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:19:00 np0005625204.localdomain systemd-rc-local-generator[196863]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:00 np0005625204.localdomain sudo[196834]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37978 DF PROTO=TCP SPT=41034 DPT=9882 SEQ=18082864 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599229680000000001030307) 
Feb 20 09:19:03 np0005625204.localdomain sudo[196983]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghlcjfzzwhfqhbjppcyqqqlxmvksjsik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579143.247141-1181-115308325902254/AnsiballZ_systemd.py
Feb 20 09:19:03 np0005625204.localdomain sudo[196983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:03 np0005625204.localdomain python3.9[196985]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:19:03 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:19:03 np0005625204.localdomain systemd-rc-local-generator[197010]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:19:04 np0005625204.localdomain systemd-sysv-generator[197015]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:19:04 np0005625204.localdomain sudo[196983]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:04 np0005625204.localdomain sudo[197131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivbitwxdgybabbaiwngivrgpoyunlnwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579144.4691687-1206-34368672482003/AnsiballZ_systemd.py
Feb 20 09:19:04 np0005625204.localdomain sudo[197131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:05 np0005625204.localdomain python3.9[197133]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:05 np0005625204.localdomain sudo[197131]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:05 np0005625204.localdomain sudo[197244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjydkzfmtowlhteqkarwpmdlxhsehfpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579145.274467-1206-178786985323222/AnsiballZ_systemd.py
Feb 20 09:19:05 np0005625204.localdomain sudo[197244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:05 np0005625204.localdomain python3.9[197246]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:05 np0005625204.localdomain sudo[197244]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:19:05.974 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:19:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:19:05.975 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:19:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:19:05.976 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:19:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63702 DF PROTO=TCP SPT=39996 DPT=9101 SEQ=1291160012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599236450000000001030307) 
Feb 20 09:19:06 np0005625204.localdomain sudo[197357]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yosmdruabgmawvgaswvzglxnmpzzjvly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579146.0377927-1206-166301406610697/AnsiballZ_systemd.py
Feb 20 09:19:06 np0005625204.localdomain sudo[197357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:06 np0005625204.localdomain python3.9[197359]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:07 np0005625204.localdomain sudo[197357]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:08 np0005625204.localdomain sudo[197470]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgmhalmbvlndpjzfwdsqihmnfagpqsdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579147.83776-1206-977455904157/AnsiballZ_systemd.py
Feb 20 09:19:08 np0005625204.localdomain sudo[197470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:19:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:19:08 np0005625204.localdomain podman[197474]: 2026-02-20 09:19:08.281664599 +0000 UTC m=+0.093900819 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:19:08 np0005625204.localdomain podman[197474]: 2026-02-20 09:19:08.291097836 +0000 UTC m=+0.103334086 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:19:08 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:19:08 np0005625204.localdomain podman[197473]: 2026-02-20 09:19:08.385901421 +0000 UTC m=+0.198274185 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:19:08 np0005625204.localdomain podman[197473]: 2026-02-20 09:19:08.424935899 +0000 UTC m=+0.237308663 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:19:08 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:19:08 np0005625204.localdomain python3.9[197472]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:08 np0005625204.localdomain sudo[197470]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:09 np0005625204.localdomain sudo[197627]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jelxixfxaurbgasrreljzltfiobyyjcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579148.680195-1206-142697774384283/AnsiballZ_systemd.py
Feb 20 09:19:09 np0005625204.localdomain sudo[197627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:09 np0005625204.localdomain python3.9[197629]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:09 np0005625204.localdomain sudo[197627]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63704 DF PROTO=TCP SPT=39996 DPT=9101 SEQ=1291160012 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599242680000000001030307) 
Feb 20 09:19:09 np0005625204.localdomain sudo[197740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdohodcqrjhjggetcdigonjldwrkintt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579149.4862084-1206-147412771727350/AnsiballZ_systemd.py
Feb 20 09:19:09 np0005625204.localdomain sudo[197740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:10 np0005625204.localdomain python3.9[197742]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:10 np0005625204.localdomain sudo[197740]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:10 np0005625204.localdomain sudo[197853]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swqvhlatzpxmkvqbyyzyrmemeltijufm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579150.2457466-1206-242476402692234/AnsiballZ_systemd.py
Feb 20 09:19:10 np0005625204.localdomain sudo[197853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:10 np0005625204.localdomain python3.9[197855]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:10 np0005625204.localdomain sudo[197853]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:11 np0005625204.localdomain sudo[197966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tidfcvsxriikqaiuvsodbcwohhdgmvhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579151.0043075-1206-169334164022746/AnsiballZ_systemd.py
Feb 20 09:19:11 np0005625204.localdomain sudo[197966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:11 np0005625204.localdomain python3.9[197968]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:11 np0005625204.localdomain sudo[197966]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:12 np0005625204.localdomain sudo[198079]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttytgevczknfgoynkmlnqiegerhouwfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579151.7568264-1206-101947350879129/AnsiballZ_systemd.py
Feb 20 09:19:12 np0005625204.localdomain sudo[198079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39102 DF PROTO=TCP SPT=44092 DPT=9105 SEQ=2269906091 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59924D690000000001030307) 
Feb 20 09:19:12 np0005625204.localdomain python3.9[198081]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:12 np0005625204.localdomain sudo[198079]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:13 np0005625204.localdomain sudo[198192]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lawapgnthrnudzwjqanheoyhwcaeqwxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579152.5211642-1206-171397754482609/AnsiballZ_systemd.py
Feb 20 09:19:13 np0005625204.localdomain sudo[198192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:13 np0005625204.localdomain python3.9[198194]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:13 np0005625204.localdomain sudo[198192]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:14 np0005625204.localdomain sudo[198305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjutyectoltsrfkxedkkppgiwzogjzat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579153.9720404-1206-195485857959101/AnsiballZ_systemd.py
Feb 20 09:19:14 np0005625204.localdomain sudo[198305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:14 np0005625204.localdomain python3.9[198307]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:14 np0005625204.localdomain sudo[198305]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:15 np0005625204.localdomain sudo[198418]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqhgtykvgqbvafthwrvrgennqwyiekzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579154.7856026-1206-179255471709552/AnsiballZ_systemd.py
Feb 20 09:19:15 np0005625204.localdomain sudo[198418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:15 np0005625204.localdomain python3.9[198420]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:15 np0005625204.localdomain sudo[198418]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6504 DF PROTO=TCP SPT=46248 DPT=9102 SEQ=1483972815 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59925B690000000001030307) 
Feb 20 09:19:16 np0005625204.localdomain sudo[198531]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqbzbtgehoupkfrudkzrgusfsieilbjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579156.1709948-1206-128522190968314/AnsiballZ_systemd.py
Feb 20 09:19:16 np0005625204.localdomain sudo[198531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:16 np0005625204.localdomain python3.9[198533]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:16 np0005625204.localdomain sudo[198531]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:17 np0005625204.localdomain sudo[198644]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bevawzgwlqhmikzxpbctuuoavbtzthcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579156.989189-1206-97990597439077/AnsiballZ_systemd.py
Feb 20 09:19:17 np0005625204.localdomain sudo[198644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:17 np0005625204.localdomain python3.9[198646]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 20 09:19:17 np0005625204.localdomain sudo[198644]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21087 DF PROTO=TCP SPT=38838 DPT=9882 SEQ=64419992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599262CB0000000001030307) 
Feb 20 09:19:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21089 DF PROTO=TCP SPT=38838 DPT=9882 SEQ=64419992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59926EE90000000001030307) 
Feb 20 09:19:22 np0005625204.localdomain sudo[198757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xgfkponslksdmbyveufiuqkfguyiyonh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579162.2679436-1511-56817723748827/AnsiballZ_file.py
Feb 20 09:19:22 np0005625204.localdomain sudo[198757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:22 np0005625204.localdomain python3.9[198759]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:19:22 np0005625204.localdomain sudo[198757]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:23 np0005625204.localdomain sudo[198867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oulcbkrvnfhdwezseuuokpybhpghcuvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579162.8870401-1511-203238187252173/AnsiballZ_file.py
Feb 20 09:19:23 np0005625204.localdomain sudo[198867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:23 np0005625204.localdomain python3.9[198869]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:19:23 np0005625204.localdomain sudo[198867]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:23 np0005625204.localdomain sshd[198870]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:19:23 np0005625204.localdomain sshd[198870]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:19:23 np0005625204.localdomain sudo[198979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmlzyujbznkjwlafxhzqanmqornahmig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579163.4896295-1511-226372445538024/AnsiballZ_file.py
Feb 20 09:19:23 np0005625204.localdomain sudo[198979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:23 np0005625204.localdomain python3.9[198981]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:19:23 np0005625204.localdomain sudo[198979]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21090 DF PROTO=TCP SPT=38838 DPT=9882 SEQ=64419992 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59927EA80000000001030307) 
Feb 20 09:19:24 np0005625204.localdomain sudo[199089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtecjqhdtrlrlijubhwhjzxqcqltjbzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579164.063049-1511-202241012512152/AnsiballZ_file.py
Feb 20 09:19:24 np0005625204.localdomain sudo[199089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:25 np0005625204.localdomain python3.9[199091]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:19:25 np0005625204.localdomain sudo[199089]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:25 np0005625204.localdomain sudo[199199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjtwayhigssseycfgftdahbhqpgipoys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579165.2921338-1511-192095216984927/AnsiballZ_file.py
Feb 20 09:19:25 np0005625204.localdomain sudo[199199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:25 np0005625204.localdomain python3.9[199201]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:19:25 np0005625204.localdomain sudo[199199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:26 np0005625204.localdomain sudo[199309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwaoxxrnmjkrbsziqemceikozmzffrmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579165.892832-1511-130445829124917/AnsiballZ_file.py
Feb 20 09:19:26 np0005625204.localdomain sudo[199309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:26 np0005625204.localdomain python3.9[199311]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:19:26 np0005625204.localdomain sudo[199309]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:27 np0005625204.localdomain python3.9[199419]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:19:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52280 DF PROTO=TCP SPT=58718 DPT=9105 SEQ=2224049347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599289680000000001030307) 
Feb 20 09:19:28 np0005625204.localdomain sudo[199527]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmwmxhltlbtmmmxmlbqxgtvklwzcylju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579167.768335-1665-237756172481434/AnsiballZ_stat.py
Feb 20 09:19:28 np0005625204.localdomain sudo[199527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:28 np0005625204.localdomain python3.9[199529]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:28 np0005625204.localdomain sudo[199527]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:29 np0005625204.localdomain sudo[199617]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sarjdbtyyfngfgvjxxfmwcdcydwmqqay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579167.768335-1665-237756172481434/AnsiballZ_copy.py
Feb 20 09:19:29 np0005625204.localdomain sudo[199617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:29 np0005625204.localdomain python3.9[199619]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579167.768335-1665-237756172481434/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:29 np0005625204.localdomain sudo[199617]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52281 DF PROTO=TCP SPT=58718 DPT=9105 SEQ=2224049347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599291680000000001030307) 
Feb 20 09:19:29 np0005625204.localdomain sudo[199727]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdurcgmuasshpeuymrdvaabbeivbljox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579169.3523057-1665-91867724010230/AnsiballZ_stat.py
Feb 20 09:19:29 np0005625204.localdomain sudo[199727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:29 np0005625204.localdomain python3.9[199729]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:29 np0005625204.localdomain sshd[199731]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:19:29 np0005625204.localdomain sudo[199727]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:30 np0005625204.localdomain sudo[199819]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xphevdryphepaveobbxjagxkqcuuuegi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579169.3523057-1665-91867724010230/AnsiballZ_copy.py
Feb 20 09:19:30 np0005625204.localdomain sudo[199819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:30 np0005625204.localdomain sshd[199731]: Invalid user claude from 54.36.99.29 port 42490
Feb 20 09:19:30 np0005625204.localdomain python3.9[199821]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579169.3523057-1665-91867724010230/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:30 np0005625204.localdomain sudo[199819]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:30 np0005625204.localdomain sshd[199731]: Received disconnect from 54.36.99.29 port 42490:11: Bye Bye [preauth]
Feb 20 09:19:30 np0005625204.localdomain sshd[199731]: Disconnected from invalid user claude 54.36.99.29 port 42490 [preauth]
Feb 20 09:19:30 np0005625204.localdomain sudo[199929]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apxnygoiifvbiybaadguglyjwovamypg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579170.5336275-1665-53117978826558/AnsiballZ_stat.py
Feb 20 09:19:30 np0005625204.localdomain sudo[199929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:30 np0005625204.localdomain python3.9[199931]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:31 np0005625204.localdomain sudo[199929]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:31 np0005625204.localdomain sudo[200019]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkowqhzhkywlxpgsjkmfjnsnsreoiyit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579170.5336275-1665-53117978826558/AnsiballZ_copy.py
Feb 20 09:19:31 np0005625204.localdomain sudo[200019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:31 np0005625204.localdomain python3.9[200021]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579170.5336275-1665-53117978826558/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:31 np0005625204.localdomain sudo[200019]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:31 np0005625204.localdomain sudo[200129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mizkyrgukpkwzptidtkdqpzrgyrgyfjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579171.7030823-1665-151640048376763/AnsiballZ_stat.py
Feb 20 09:19:31 np0005625204.localdomain sudo[200129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:32 np0005625204.localdomain python3.9[200131]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:32 np0005625204.localdomain sudo[200129]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:32 np0005625204.localdomain sudo[200219]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkgkqgygnphrevvfsreredebrwkqzzkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579171.7030823-1665-151640048376763/AnsiballZ_copy.py
Feb 20 09:19:32 np0005625204.localdomain sudo[200219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31659 DF PROTO=TCP SPT=34970 DPT=9100 SEQ=4001655195 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59929D680000000001030307) 
Feb 20 09:19:32 np0005625204.localdomain python3.9[200221]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579171.7030823-1665-151640048376763/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:32 np0005625204.localdomain sudo[200219]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:33 np0005625204.localdomain sudo[200329]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shupjvzrntmicarpcjmntvqcjyhnvxuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579172.8879738-1665-4352235216342/AnsiballZ_stat.py
Feb 20 09:19:33 np0005625204.localdomain sudo[200329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:33 np0005625204.localdomain python3.9[200331]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:33 np0005625204.localdomain sudo[200329]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:33 np0005625204.localdomain sudo[200419]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvofjdyrhhzwcrwrzicvasbcxbhipver ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579172.8879738-1665-4352235216342/AnsiballZ_copy.py
Feb 20 09:19:33 np0005625204.localdomain sudo[200419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:33 np0005625204.localdomain python3.9[200421]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579172.8879738-1665-4352235216342/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:33 np0005625204.localdomain sudo[200419]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:34 np0005625204.localdomain sudo[200529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsigdleimcbivgtwhsfkwgdfomwmvlot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579174.0234854-1665-64128306467384/AnsiballZ_stat.py
Feb 20 09:19:34 np0005625204.localdomain sudo[200529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:34 np0005625204.localdomain python3.9[200531]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:34 np0005625204.localdomain sudo[200529]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:35 np0005625204.localdomain sudo[200619]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfglrjxfzvxclqpzdsqovvkhbqzcoeci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579174.0234854-1665-64128306467384/AnsiballZ_copy.py
Feb 20 09:19:35 np0005625204.localdomain sudo[200619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:35 np0005625204.localdomain python3.9[200621]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579174.0234854-1665-64128306467384/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:35 np0005625204.localdomain sudo[200619]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:35 np0005625204.localdomain sudo[200729]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bodhxbeipfspgharvrycdgxvqzmztqsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579175.626153-1665-129836363519062/AnsiballZ_stat.py
Feb 20 09:19:35 np0005625204.localdomain sudo[200729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:36 np0005625204.localdomain python3.9[200731]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:36 np0005625204.localdomain sudo[200729]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37788 DF PROTO=TCP SPT=60826 DPT=9101 SEQ=241455690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992AB750000000001030307) 
Feb 20 09:19:36 np0005625204.localdomain sudo[200817]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeckeprhdfdkhyvqjwccbsivibtvyqtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579175.626153-1665-129836363519062/AnsiballZ_copy.py
Feb 20 09:19:36 np0005625204.localdomain sudo[200817]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:36 np0005625204.localdomain python3.9[200819]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579175.626153-1665-129836363519062/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:36 np0005625204.localdomain sudo[200817]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:37 np0005625204.localdomain sudo[200927]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyqsepmikukrrhpngunowczaljjiulcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579177.4699535-1665-202993156818799/AnsiballZ_stat.py
Feb 20 09:19:37 np0005625204.localdomain sudo[200927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:37 np0005625204.localdomain python3.9[200929]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:37 np0005625204.localdomain sudo[200927]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:38 np0005625204.localdomain sudo[201017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhtzjjbrqsfednmhgucunzdkgepzymra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579177.4699535-1665-202993156818799/AnsiballZ_copy.py
Feb 20 09:19:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:19:38 np0005625204.localdomain sudo[201017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:38 np0005625204.localdomain podman[201019]: 2026-02-20 09:19:38.445572479 +0000 UTC m=+0.082943072 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:19:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:19:38 np0005625204.localdomain podman[201019]: 2026-02-20 09:19:38.48001336 +0000 UTC m=+0.117383923 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:19:38 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:19:38 np0005625204.localdomain systemd[1]: tmp-crun.lvvK0A.mount: Deactivated successfully.
Feb 20 09:19:38 np0005625204.localdomain python3.9[201020]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1771579177.4699535-1665-202993156818799/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:38 np0005625204.localdomain podman[201038]: 2026-02-20 09:19:38.561698391 +0000 UTC m=+0.088104104 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:19:38 np0005625204.localdomain sudo[201017]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:38 np0005625204.localdomain podman[201038]: 2026-02-20 09:19:38.622200218 +0000 UTC m=+0.148605941 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:19:38 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:19:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37790 DF PROTO=TCP SPT=60826 DPT=9101 SEQ=241455690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992B7680000000001030307) 
Feb 20 09:19:39 np0005625204.localdomain sudo[201110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:19:39 np0005625204.localdomain sudo[201110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:19:39 np0005625204.localdomain sudo[201110]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:39 np0005625204.localdomain sudo[201150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:19:39 np0005625204.localdomain sudo[201150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:19:39 np0005625204.localdomain sudo[201204]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfncbosfjfvanwwhtnfijfowuyxmznll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579179.4156613-2006-152743689244988/AnsiballZ_file.py
Feb 20 09:19:39 np0005625204.localdomain sudo[201204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:39 np0005625204.localdomain python3.9[201206]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:39 np0005625204.localdomain sudo[201204]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:40 np0005625204.localdomain sudo[201150]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:40 np0005625204.localdomain sudo[201346]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgaxnrvtdjdvdngibymwxwxcplipcevq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579180.0711331-2030-192730320680452/AnsiballZ_file.py
Feb 20 09:19:40 np0005625204.localdomain sudo[201346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:40 np0005625204.localdomain python3.9[201348]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:40 np0005625204.localdomain sudo[201346]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:40 np0005625204.localdomain sudo[201456]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdegsobmhbdvajkkkfyyeqquyskdlgof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579180.657984-2030-32300654032133/AnsiballZ_file.py
Feb 20 09:19:40 np0005625204.localdomain sudo[201456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:41 np0005625204.localdomain sudo[201459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:19:41 np0005625204.localdomain sudo[201459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:19:41 np0005625204.localdomain sudo[201459]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:41 np0005625204.localdomain python3.9[201458]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:41 np0005625204.localdomain sudo[201456]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:41 np0005625204.localdomain sudo[201584]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcqnylqkxxruwtkaahehjqebgygdhaos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579181.2463539-2030-135078055104910/AnsiballZ_file.py
Feb 20 09:19:41 np0005625204.localdomain sudo[201584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:41 np0005625204.localdomain python3.9[201586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:41 np0005625204.localdomain sudo[201584]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52283 DF PROTO=TCP SPT=58718 DPT=9105 SEQ=2224049347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992C1680000000001030307) 
Feb 20 09:19:42 np0005625204.localdomain sudo[201694]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvcyqcfybiezmyjuvqyhjgjnytmcpxnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579181.861495-2030-85958345569755/AnsiballZ_file.py
Feb 20 09:19:42 np0005625204.localdomain sudo[201694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:42 np0005625204.localdomain python3.9[201696]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:42 np0005625204.localdomain sudo[201694]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:42 np0005625204.localdomain sudo[201804]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnuderkmymqqugfjqeqvumchuyszeras ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579182.467242-2030-128677253985619/AnsiballZ_file.py
Feb 20 09:19:42 np0005625204.localdomain sudo[201804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:43 np0005625204.localdomain python3.9[201806]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:43 np0005625204.localdomain sudo[201804]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:43 np0005625204.localdomain sudo[201914]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikipztwsaskmhmanhxcsrfeyqmuwhqnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579183.4022553-2030-66420912631023/AnsiballZ_file.py
Feb 20 09:19:43 np0005625204.localdomain sudo[201914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:43 np0005625204.localdomain python3.9[201916]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:43 np0005625204.localdomain sudo[201914]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:44 np0005625204.localdomain sudo[202024]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coczusetaxtlhktgbpvosfobsytzsajq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579183.9909902-2030-158571895532232/AnsiballZ_file.py
Feb 20 09:19:44 np0005625204.localdomain sudo[202024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:44 np0005625204.localdomain python3.9[202026]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:44 np0005625204.localdomain sudo[202024]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:44 np0005625204.localdomain sudo[202134]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuzlalhegrqbrlhhdabzwbwlxezcszmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579184.5705495-2030-187310916046907/AnsiballZ_file.py
Feb 20 09:19:44 np0005625204.localdomain sudo[202134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:45 np0005625204.localdomain python3.9[202136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:45 np0005625204.localdomain sudo[202134]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:45 np0005625204.localdomain sudo[202244]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvjfcjvcwrsutxhddieiczmrraczslsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579185.1377244-2030-100527358076188/AnsiballZ_file.py
Feb 20 09:19:45 np0005625204.localdomain sudo[202244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:45 np0005625204.localdomain python3.9[202246]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:45 np0005625204.localdomain sudo[202244]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:45 np0005625204.localdomain sudo[202354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghghfxcwpdngazxwcarhydwjegwaddte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579185.7292051-2030-118979115937973/AnsiballZ_file.py
Feb 20 09:19:45 np0005625204.localdomain sudo[202354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11267 DF PROTO=TCP SPT=44804 DPT=9102 SEQ=4043637556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992D1680000000001030307) 
Feb 20 09:19:46 np0005625204.localdomain python3.9[202356]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:46 np0005625204.localdomain sudo[202354]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:19:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd59610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55bf8dd582d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 09:19:47 np0005625204.localdomain sudo[202464]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbqpxhharjdymqnyfzvfbavvgtdlorzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579186.8681428-2030-227521145119547/AnsiballZ_file.py
Feb 20 09:19:47 np0005625204.localdomain sudo[202464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:47 np0005625204.localdomain python3.9[202466]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:47 np0005625204.localdomain sudo[202464]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40631 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992D7FA0000000001030307) 
Feb 20 09:19:47 np0005625204.localdomain sudo[202574]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixihtiwdohzrcdmrzjprucvopkybvyhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579187.5090125-2030-122213470692080/AnsiballZ_file.py
Feb 20 09:19:47 np0005625204.localdomain sudo[202574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:47 np0005625204.localdomain python3.9[202576]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:47 np0005625204.localdomain sudo[202574]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:48 np0005625204.localdomain sudo[202684]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnfufbgxylqticquptphnabdcbfnuovd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579188.6935053-2030-13042321416535/AnsiballZ_file.py
Feb 20 09:19:48 np0005625204.localdomain sudo[202684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:49 np0005625204.localdomain python3.9[202686]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:49 np0005625204.localdomain sudo[202684]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:49 np0005625204.localdomain sudo[202794]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sppxigxoexhmppegpyjdueksumzwcikn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579189.3490725-2030-20136598579487/AnsiballZ_file.py
Feb 20 09:19:49 np0005625204.localdomain sudo[202794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:49 np0005625204.localdomain python3.9[202796]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:49 np0005625204.localdomain sudo[202794]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:50 np0005625204.localdomain sudo[202904]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jknkjqcamjqmcmpjqhxbghbuxmuclpdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579190.0395901-2327-144118274078383/AnsiballZ_stat.py
Feb 20 09:19:50 np0005625204.localdomain sudo[202904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:50 np0005625204.localdomain python3.9[202906]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:50 np0005625204.localdomain sudo[202904]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40633 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992E3E90000000001030307) 
Feb 20 09:19:50 np0005625204.localdomain sudo[202992]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptwjwsezjzlxshnjbqdpyaihjfvwowje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579190.0395901-2327-144118274078383/AnsiballZ_copy.py
Feb 20 09:19:50 np0005625204.localdomain sudo[202992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:51 np0005625204.localdomain python3.9[202994]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579190.0395901-2327-144118274078383/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:51 np0005625204.localdomain sudo[202992]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:19:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.013       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694b610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.01              0.00         1    0.012       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55cba694a2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 3.7e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 20 09:19:51 np0005625204.localdomain sudo[203102]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqjluojkyqxumnqvieqtcixnfibkkggb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579191.184508-2327-104139547922568/AnsiballZ_stat.py
Feb 20 09:19:51 np0005625204.localdomain sudo[203102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:51 np0005625204.localdomain python3.9[203104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:51 np0005625204.localdomain sudo[203102]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:52 np0005625204.localdomain sudo[203190]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxwedksmeakstizdhqkqlovsvtzgcjbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579191.184508-2327-104139547922568/AnsiballZ_copy.py
Feb 20 09:19:52 np0005625204.localdomain sudo[203190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:52 np0005625204.localdomain python3.9[203192]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579191.184508-2327-104139547922568/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:52 np0005625204.localdomain sudo[203190]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:52 np0005625204.localdomain sudo[203300]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fetjhpflsnqmqewdnpycpkocdjqagfrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579192.4147668-2327-214635171512767/AnsiballZ_stat.py
Feb 20 09:19:52 np0005625204.localdomain sudo[203300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:52 np0005625204.localdomain python3.9[203302]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:52 np0005625204.localdomain sudo[203300]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:53 np0005625204.localdomain sudo[203388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmtoysnysrrypdsfdslmbopvjjdfgnrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579192.4147668-2327-214635171512767/AnsiballZ_copy.py
Feb 20 09:19:53 np0005625204.localdomain sudo[203388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:53 np0005625204.localdomain python3.9[203390]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579192.4147668-2327-214635171512767/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:53 np0005625204.localdomain sudo[203388]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:53 np0005625204.localdomain sudo[203498]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oopxjqoffmqgfrtpsjovyzdlzqzqtxbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579193.5151584-2327-124124188223815/AnsiballZ_stat.py
Feb 20 09:19:53 np0005625204.localdomain sudo[203498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:53 np0005625204.localdomain python3.9[203500]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:54 np0005625204.localdomain sudo[203498]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:54 np0005625204.localdomain sudo[203586]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pstochfqnwgznrforypnuroowgupkymh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579193.5151584-2327-124124188223815/AnsiballZ_copy.py
Feb 20 09:19:54 np0005625204.localdomain sudo[203586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:54 np0005625204.localdomain python3.9[203588]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579193.5151584-2327-124124188223815/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:54 np0005625204.localdomain sudo[203586]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40634 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992F3A90000000001030307) 
Feb 20 09:19:54 np0005625204.localdomain sudo[203696]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmmrsisqbjmfwwluphbykhhjszegkifi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579194.719194-2327-269056524107679/AnsiballZ_stat.py
Feb 20 09:19:54 np0005625204.localdomain sudo[203696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:55 np0005625204.localdomain python3.9[203698]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:55 np0005625204.localdomain sudo[203696]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:55 np0005625204.localdomain sudo[203784]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-solfmpeqtdskfzzgbcawacxnzpkzhogd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579194.719194-2327-269056524107679/AnsiballZ_copy.py
Feb 20 09:19:55 np0005625204.localdomain sudo[203784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:55 np0005625204.localdomain python3.9[203786]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579194.719194-2327-269056524107679/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:55 np0005625204.localdomain sudo[203784]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:56 np0005625204.localdomain sudo[203894]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xupmgvqcsykfbvebtxjcuhxjrortkonq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579196.0122721-2327-178922098812536/AnsiballZ_stat.py
Feb 20 09:19:56 np0005625204.localdomain sudo[203894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:56 np0005625204.localdomain python3.9[203896]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:56 np0005625204.localdomain sudo[203894]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:56 np0005625204.localdomain sudo[203982]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbgbovsxbueylcoubpgkrwyvljpuzzca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579196.0122721-2327-178922098812536/AnsiballZ_copy.py
Feb 20 09:19:56 np0005625204.localdomain sudo[203982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:57 np0005625204.localdomain python3.9[203984]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579196.0122721-2327-178922098812536/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:57 np0005625204.localdomain sudo[203982]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:57 np0005625204.localdomain sudo[204092]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rbgyompxnnrjxphgapsxdycenjssodlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579197.205364-2327-48964405654693/AnsiballZ_stat.py
Feb 20 09:19:57 np0005625204.localdomain sudo[204092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41461 DF PROTO=TCP SPT=60506 DPT=9105 SEQ=3569357637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5992FEA80000000001030307) 
Feb 20 09:19:57 np0005625204.localdomain python3.9[204094]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:57 np0005625204.localdomain sudo[204092]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:58 np0005625204.localdomain sudo[204180]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmpozhxqzpiflugvlcnetpghspolkksw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579197.205364-2327-48964405654693/AnsiballZ_copy.py
Feb 20 09:19:58 np0005625204.localdomain sudo[204180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:58 np0005625204.localdomain python3.9[204182]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579197.205364-2327-48964405654693/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:19:59 np0005625204.localdomain sudo[204180]: pam_unix(sudo:session): session closed for user root
Feb 20 09:19:59 np0005625204.localdomain sudo[204290]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyofchrchookywbzxwxminxcgkhtskvp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579199.146064-2327-196876657251657/AnsiballZ_stat.py
Feb 20 09:19:59 np0005625204.localdomain sudo[204290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:19:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41462 DF PROTO=TCP SPT=60506 DPT=9105 SEQ=3569357637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599306A80000000001030307) 
Feb 20 09:19:59 np0005625204.localdomain python3.9[204292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:19:59 np0005625204.localdomain sudo[204290]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:00 np0005625204.localdomain sudo[204378]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oorsoacrbffuuuxxjgdphidltsgdfsuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579199.146064-2327-196876657251657/AnsiballZ_copy.py
Feb 20 09:20:00 np0005625204.localdomain sudo[204378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:00 np0005625204.localdomain python3.9[204380]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579199.146064-2327-196876657251657/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:00 np0005625204.localdomain sudo[204378]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:01 np0005625204.localdomain sudo[204488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdtjgthruzxdbumzohjgwjqrekfbhnou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579200.3764126-2327-233536110770031/AnsiballZ_stat.py
Feb 20 09:20:01 np0005625204.localdomain sudo[204488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:01 np0005625204.localdomain python3.9[204490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:01 np0005625204.localdomain sudo[204488]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:01 np0005625204.localdomain sudo[204576]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldsthmxxewwwsvzplppajcoiyikukuwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579200.3764126-2327-233536110770031/AnsiballZ_copy.py
Feb 20 09:20:01 np0005625204.localdomain sudo[204576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:01 np0005625204.localdomain python3.9[204578]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579200.3764126-2327-233536110770031/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:01 np0005625204.localdomain sudo[204576]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:02 np0005625204.localdomain sudo[204686]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxzjsezhajrhxqkfiwdfniylmvohikwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579201.978815-2327-84156838051085/AnsiballZ_stat.py
Feb 20 09:20:02 np0005625204.localdomain sudo[204686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:02 np0005625204.localdomain python3.9[204688]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:02 np0005625204.localdomain sudo[204686]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:02 np0005625204.localdomain sudo[204774]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pklqthaywcdijaznvckznprvkzoqtoit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579201.978815-2327-84156838051085/AnsiballZ_copy.py
Feb 20 09:20:02 np0005625204.localdomain sudo[204774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40635 DF PROTO=TCP SPT=46782 DPT=9882 SEQ=1592775776 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599313680000000001030307) 
Feb 20 09:20:03 np0005625204.localdomain python3.9[204776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579201.978815-2327-84156838051085/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:03 np0005625204.localdomain sudo[204774]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:03 np0005625204.localdomain sudo[204884]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-masxflkjyqbbwssbkkefndgxylyldzod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579203.229483-2327-71804683373422/AnsiballZ_stat.py
Feb 20 09:20:03 np0005625204.localdomain sudo[204884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:03 np0005625204.localdomain python3.9[204886]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:03 np0005625204.localdomain sudo[204884]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:04 np0005625204.localdomain sudo[204972]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpnjeotefeyiulyxzsmwqvjpasottuhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579203.229483-2327-71804683373422/AnsiballZ_copy.py
Feb 20 09:20:04 np0005625204.localdomain sudo[204972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:04 np0005625204.localdomain python3.9[204974]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579203.229483-2327-71804683373422/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:04 np0005625204.localdomain sudo[204972]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:04 np0005625204.localdomain sudo[205082]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjcquhhqitujuokctiskigdohwkxdudp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579204.3953495-2327-219570583992080/AnsiballZ_stat.py
Feb 20 09:20:04 np0005625204.localdomain sudo[205082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:04 np0005625204.localdomain python3.9[205084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:04 np0005625204.localdomain sudo[205082]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:05 np0005625204.localdomain sudo[205170]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-intywdakrsttamernkdpwdfwhhafmxrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579204.3953495-2327-219570583992080/AnsiballZ_copy.py
Feb 20 09:20:05 np0005625204.localdomain sudo[205170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:05 np0005625204.localdomain python3.9[205172]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579204.3953495-2327-219570583992080/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:05 np0005625204.localdomain sudo[205170]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:05 np0005625204.localdomain sudo[205280]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aatctllyjilkyyggfxwtrsuapiivfamj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579205.619173-2327-280455899276421/AnsiballZ_stat.py
Feb 20 09:20:05 np0005625204.localdomain sudo[205280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:20:05.976 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:20:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:20:05.977 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:20:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:20:05.978 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:20:06 np0005625204.localdomain python3.9[205282]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:06 np0005625204.localdomain sudo[205280]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:06 np0005625204.localdomain sshd[205283]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:20:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1951 DF PROTO=TCP SPT=38332 DPT=9101 SEQ=3436120266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599320A40000000001030307) 
Feb 20 09:20:06 np0005625204.localdomain sshd[205283]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:20:06 np0005625204.localdomain sudo[205370]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kahjiolfsrcvnjjlpurslhnhdagrefwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579205.619173-2327-280455899276421/AnsiballZ_copy.py
Feb 20 09:20:06 np0005625204.localdomain sudo[205370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:06 np0005625204.localdomain python3.9[205372]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579205.619173-2327-280455899276421/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:06 np0005625204.localdomain sudo[205370]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:07 np0005625204.localdomain sudo[205480]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byygxzntcgocwiyxywuhhthcvqvoxqxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579206.859325-2327-268408291450871/AnsiballZ_stat.py
Feb 20 09:20:07 np0005625204.localdomain sudo[205480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:07 np0005625204.localdomain python3.9[205482]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:07 np0005625204.localdomain sudo[205480]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:07 np0005625204.localdomain sudo[205568]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzhlgfsgtfbmewzctopfptcvavmgbtji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579206.859325-2327-268408291450871/AnsiballZ_copy.py
Feb 20 09:20:07 np0005625204.localdomain sudo[205568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:07 np0005625204.localdomain python3.9[205570]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579206.859325-2327-268408291450871/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:07 np0005625204.localdomain sudo[205568]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:08 np0005625204.localdomain python3.9[205678]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:20:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:20:09 np0005625204.localdomain podman[205737]: 2026-02-20 09:20:09.165045995 +0000 UTC m=+0.088478977 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:20:09 np0005625204.localdomain podman[205738]: 2026-02-20 09:20:09.246683855 +0000 UTC m=+0.169797266 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 20 09:20:09 np0005625204.localdomain podman[205738]: 2026-02-20 09:20:09.254340456 +0000 UTC m=+0.177453827 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 20 09:20:09 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:20:09 np0005625204.localdomain podman[205737]: 2026-02-20 09:20:09.279196325 +0000 UTC m=+0.202629297 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 20 09:20:09 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:20:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1953 DF PROTO=TCP SPT=38332 DPT=9101 SEQ=3436120266 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59932CA80000000001030307) 
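The `DROPPING:` kernel records above are netfilter LOG output: a fixed prefix followed by space-separated `KEY=VALUE` fields. A minimal sketch of pulling those fields out for analysis (the helper name and the truncated sample line are illustrative, taken from the record above):

```python
import re

def parse_kernel_drop(line: str) -> dict:
    """Extract uppercase KEY=VALUE pairs from a netfilter LOG line.

    Flag-only tokens with no value (e.g. 'OUT=', 'DF', 'SYN') are skipped.
    """
    return dict(re.findall(r'\b([A-Z]+)=(\S+)', line))

# Sample taken (shortened) from the DROPPING record in this log.
sample = ("DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac "
          "MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 "
          "SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 "
          "PREC=0x00 TTL=62 ID=1953 DF PROTO=TCP SPT=38332 DPT=9101")

fields = parse_kernel_drop(sample)
print(fields["SRC"], "->", fields["DST"], "dport", fields["DPT"])
```

Here the repeated drops from 192.168.122.10 toward ports 9100–9105/9882 (node-exporter-style scrape ports) become easy to tally.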
Feb 20 09:20:09 np0005625204.localdomain sudo[205833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovlgvfojmagehftgsmhzcndvzhucoafd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579208.8645837-2945-266130107323135/AnsiballZ_seboolean.py
Feb 20 09:20:09 np0005625204.localdomain sudo[205833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:09 np0005625204.localdomain python3.9[205835]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 20 09:20:09 np0005625204.localdomain sudo[205833]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:10 np0005625204.localdomain sudo[205943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrgjbkwaqohomfliesafmdokliosxnvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579210.4538546-2976-76563520691412/AnsiballZ_systemd.py
Feb 20 09:20:10 np0005625204.localdomain sudo[205943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:11 np0005625204.localdomain python3.9[205945]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:11 np0005625204.localdomain systemd-rc-local-generator[205967]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:11 np0005625204.localdomain systemd-sysv-generator[205975]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Starting libvirt logging daemon socket...
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Starting libvirt logging daemon...
Feb 20 09:20:11 np0005625204.localdomain systemd[1]: Started libvirt logging daemon.
Feb 20 09:20:11 np0005625204.localdomain sudo[205943]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:11 np0005625204.localdomain sudo[206094]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyidcbhagwbkvpsrnibcsfbbsgsmvjmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579211.674451-2976-220065821546950/AnsiballZ_systemd.py
Feb 20 09:20:11 np0005625204.localdomain sudo[206094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41464 DF PROTO=TCP SPT=60506 DPT=9105 SEQ=3569357637 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599337680000000001030307) 
Feb 20 09:20:12 np0005625204.localdomain python3.9[206096]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:12 np0005625204.localdomain systemd-rc-local-generator[206119]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:12 np0005625204.localdomain systemd-sysv-generator[206127]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 20 09:20:12 np0005625204.localdomain systemd[1]: Started libvirt nodedev daemon.
Feb 20 09:20:12 np0005625204.localdomain sudo[206094]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:13 np0005625204.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 20 09:20:13 np0005625204.localdomain sudo[206270]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjcikwwhcllyhorrktsgxzntdzbrztin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579212.75303-2976-125070795160249/AnsiballZ_systemd.py
Feb 20 09:20:13 np0005625204.localdomain sudo[206270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:13 np0005625204.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 20 09:20:13 np0005625204.localdomain setroubleshoot[206261]: Deleting alert 4c153363-0b75-4da9-9673-ecc521f0261c, it is allowed in current policy
Feb 20 09:20:13 np0005625204.localdomain python3.9[206272]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:20:13 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:14 np0005625204.localdomain systemd-rc-local-generator[206299]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:14 np0005625204.localdomain systemd-sysv-generator[206303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 20 09:20:14 np0005625204.localdomain systemd[1]: Started libvirt proxy daemon.
Feb 20 09:20:14 np0005625204.localdomain sudo[206270]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:14 np0005625204.localdomain sudo[206449]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgdylxrqibdfgqfkpvxluxbzesyubzvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579214.482454-2976-227088710983548/AnsiballZ_systemd.py
Feb 20 09:20:14 np0005625204.localdomain sudo[206449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:15 np0005625204.localdomain python3.9[206451]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:15 np0005625204.localdomain systemd-rc-local-generator[206476]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:15 np0005625204.localdomain systemd-sysv-generator[206479]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:15 np0005625204.localdomain setroubleshoot[206261]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l f0e143a7-e3ff-424e-aeeb-bf5f3a6516a4
Feb 20 09:20:15 np0005625204.localdomain setroubleshoot[206261]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
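The setroubleshoot alert above spells out its own remediation workflow. Collected as one sketch, using exactly the commands the alert suggests (the `my-virtlogd` module name comes from the alert itself): these require root and a running auditd, and `semodule -i` permanently installs a local policy module, so treat this as an operator's one-shot fragment rather than a script to automate.

```shell
# Sketch of the alert's suggested workflow (root + auditd required; commands
# copied from the setroubleshoot output above, not independently verified here).

# 1. Turn on full auditing so AVC records carry PATH information.
auditctl -w /etc/shadow -p w

# 2. Recreate the denial (e.g. restart virtlogd), then inspect recent AVCs.
ausearch -m avc -ts recent

# 3. If the ownership/permissions are correct and the access is legitimate,
#    generate and install a local policy module allowing it.
ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
semodule -X 300 -i my-virtlogd.pp
```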
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 20 09:20:15 np0005625204.localdomain systemd[1]: Started libvirt QEMU daemon.
Feb 20 09:20:15 np0005625204.localdomain sudo[206449]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:15 np0005625204.localdomain sudo[206632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zchwststtlgahdmohmqzcijunsaxafgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579215.6043632-2976-126179467017398/AnsiballZ_systemd.py
Feb 20 09:20:15 np0005625204.localdomain sudo[206632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13932 DF PROTO=TCP SPT=52974 DPT=9100 SEQ=729795198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599347680000000001030307) 
Feb 20 09:20:16 np0005625204.localdomain python3.9[206634]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:16 np0005625204.localdomain systemd-rc-local-generator[206668]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:16 np0005625204.localdomain systemd-sysv-generator[206672]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Starting libvirt secret daemon socket...
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 20 09:20:16 np0005625204.localdomain systemd[1]: Started libvirt secret daemon.
Feb 20 09:20:16 np0005625204.localdomain sudo[206632]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:17 np0005625204.localdomain sudo[206815]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atyaathnrawrszkoddptqzipmvigcnkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579217.2291946-3087-253838309118582/AnsiballZ_file.py
Feb 20 09:20:17 np0005625204.localdomain sudo[206815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14391 DF PROTO=TCP SPT=35116 DPT=9882 SEQ=342328268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59934D2A0000000001030307) 
Feb 20 09:20:17 np0005625204.localdomain python3.9[206817]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:17 np0005625204.localdomain sudo[206815]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:18 np0005625204.localdomain sudo[206925]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eepukywpfzmxbaegrkwkaovetlzenplo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579217.9275382-3111-226994471262530/AnsiballZ_find.py
Feb 20 09:20:18 np0005625204.localdomain sudo[206925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:18 np0005625204.localdomain python3.9[206927]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:20:18 np0005625204.localdomain sudo[206925]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:19 np0005625204.localdomain sudo[207035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqmppjdffuoisphouymkwgebqriikxys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579218.99693-3135-83145780171815/AnsiballZ_command.py
Feb 20 09:20:19 np0005625204.localdomain sudo[207035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:19 np0005625204.localdomain python3.9[207037]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:19 np0005625204.localdomain sudo[207035]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:20 np0005625204.localdomain python3.9[207149]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:20:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14393 DF PROTO=TCP SPT=35116 DPT=9882 SEQ=342328268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599359460000000001030307) 
Feb 20 09:20:21 np0005625204.localdomain python3.9[207257]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:22 np0005625204.localdomain python3.9[207343]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579220.8484962-3191-14855100117293/.source.xml follow=False _original_basename=secret.xml.j2 checksum=e299a5f369c62c832b857708260504de70ea24e6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:22 np0005625204.localdomain sudo[207451]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksyphozkatbztlvypsxqialzoqvdxdih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579222.507695-3237-142044084598465/AnsiballZ_command.py
Feb 20 09:20:22 np0005625204.localdomain sudo[207451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:22 np0005625204.localdomain python3.9[207453]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:23 np0005625204.localdomain polkitd[1035]: Registered Authentication Agent for unix-process:207455:978222 (system bus name :1.2845 [pkttyagent --process 207455 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 20 09:20:23 np0005625204.localdomain polkitd[1035]: Unregistered Authentication Agent for unix-process:207455:978222 (system bus name :1.2845, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 20 09:20:23 np0005625204.localdomain polkitd[1035]: Registered Authentication Agent for unix-process:207454:978222 (system bus name :1.2846 [pkttyagent --process 207454 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 20 09:20:23 np0005625204.localdomain polkitd[1035]: Unregistered Authentication Agent for unix-process:207454:978222 (system bus name :1.2846, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 20 09:20:23 np0005625204.localdomain sudo[207451]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:24 np0005625204.localdomain python3.9[207573]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:24 np0005625204.localdomain sudo[207681]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygirqoczkboakzwcbuybepoxslvnldkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579224.5557842-3284-256737102055970/AnsiballZ_command.py
Feb 20 09:20:24 np0005625204.localdomain sudo[207681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14394 DF PROTO=TCP SPT=35116 DPT=9882 SEQ=342328268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599368E80000000001030307) 
Feb 20 09:20:24 np0005625204.localdomain sudo[207681]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:25 np0005625204.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Feb 20 09:20:25 np0005625204.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 20 09:20:25 np0005625204.localdomain sudo[207792]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orcyfcyuiuadmmnbmwgrrbqsybjgwjiz ; FSID=a8557ee9-b55d-5519-942c-cf8f6172f1d8 KEY=AQDtD5hpAAAAABAA3WyXm9j+KcpKUe+kDHkLgg== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579225.203982-3308-64901994996563/AnsiballZ_command.py
Feb 20 09:20:25 np0005625204.localdomain sudo[207792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:25 np0005625204.localdomain polkitd[1035]: Registered Authentication Agent for unix-process:207795:978491 (system bus name :1.2849 [pkttyagent --process 207795 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 20 09:20:25 np0005625204.localdomain polkitd[1035]: Unregistered Authentication Agent for unix-process:207795:978491 (system bus name :1.2849, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 20 09:20:25 np0005625204.localdomain sudo[207792]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:26 np0005625204.localdomain sudo[207908]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjeogpqgmilauqgbttlarsfivredjciw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579226.120869-3332-7175409729711/AnsiballZ_copy.py
Feb 20 09:20:26 np0005625204.localdomain sudo[207908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:26 np0005625204.localdomain python3.9[207910]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:26 np0005625204.localdomain sudo[207908]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:27 np0005625204.localdomain sudo[208018]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvknxuhhghfxbdaswgbjuucbotpmhgrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579226.7791867-3357-272241796040600/AnsiballZ_stat.py
Feb 20 09:20:27 np0005625204.localdomain sudo[208018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:27 np0005625204.localdomain python3.9[208020]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:27 np0005625204.localdomain sudo[208018]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:27 np0005625204.localdomain sudo[208106]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lphsvqpqqbnfnvrvgrqqbqsvuwixhprf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579226.7791867-3357-272241796040600/AnsiballZ_copy.py
Feb 20 09:20:27 np0005625204.localdomain sudo[208106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63478 DF PROTO=TCP SPT=56584 DPT=9105 SEQ=1510305160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599373E80000000001030307) 
Feb 20 09:20:27 np0005625204.localdomain python3.9[208108]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579226.7791867-3357-272241796040600/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:27 np0005625204.localdomain sudo[208106]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:28 np0005625204.localdomain sudo[208216]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lahjhbmgnngnldkmheywfhudefqvbnhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579228.1773145-3404-235315932873253/AnsiballZ_file.py
Feb 20 09:20:28 np0005625204.localdomain sudo[208216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:28 np0005625204.localdomain python3.9[208218]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:28 np0005625204.localdomain sudo[208216]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:29 np0005625204.localdomain sudo[208326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdbflegeqqbadywqjkmuypqbfworczel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579229.1432734-3429-112488591842386/AnsiballZ_stat.py
Feb 20 09:20:29 np0005625204.localdomain sudo[208326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:29 np0005625204.localdomain python3.9[208328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:29 np0005625204.localdomain sudo[208326]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63479 DF PROTO=TCP SPT=56584 DPT=9105 SEQ=1510305160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59937BE90000000001030307) 
Feb 20 09:20:29 np0005625204.localdomain sudo[208383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxewlzdgqptinhvrbgxqvjyzaeozelzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579229.1432734-3429-112488591842386/AnsiballZ_file.py
Feb 20 09:20:29 np0005625204.localdomain sudo[208383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:30 np0005625204.localdomain python3.9[208385]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:30 np0005625204.localdomain sudo[208383]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:30 np0005625204.localdomain sudo[208493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyewztvksghyuirasmwuzpftytpqdpzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579230.2674348-3465-259400443344409/AnsiballZ_stat.py
Feb 20 09:20:30 np0005625204.localdomain sudo[208493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:30 np0005625204.localdomain python3.9[208495]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:30 np0005625204.localdomain sudo[208493]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:30 np0005625204.localdomain sudo[208550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dirjddqhdauumzynhijbmsvthymfyjww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579230.2674348-3465-259400443344409/AnsiballZ_file.py
Feb 20 09:20:30 np0005625204.localdomain sudo[208550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:31 np0005625204.localdomain python3.9[208552]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.rs0pdz88 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:31 np0005625204.localdomain sudo[208550]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:31 np0005625204.localdomain sudo[208660]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igsyyzqofszaxrklbwiwrzmjmohwpubj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579231.3926725-3501-65632202361575/AnsiballZ_stat.py
Feb 20 09:20:31 np0005625204.localdomain sudo[208660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:31 np0005625204.localdomain python3.9[208662]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:31 np0005625204.localdomain sudo[208660]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13933 DF PROTO=TCP SPT=52974 DPT=9100 SEQ=729795198 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599387680000000001030307) 
Feb 20 09:20:32 np0005625204.localdomain sudo[208717]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jovgqypusottzopseffcapdyiljkxwuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579231.3926725-3501-65632202361575/AnsiballZ_file.py
Feb 20 09:20:32 np0005625204.localdomain sudo[208717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:33 np0005625204.localdomain python3.9[208719]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:33 np0005625204.localdomain sudo[208717]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:33 np0005625204.localdomain sudo[208827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdawgeocryovlrhivcrtaggdfwszjfna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579233.2912004-3540-187425242834400/AnsiballZ_command.py
Feb 20 09:20:33 np0005625204.localdomain sudo[208827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:33 np0005625204.localdomain python3.9[208829]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:33 np0005625204.localdomain sudo[208827]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:35 np0005625204.localdomain sudo[208938]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsqfpedzchunyqzkfvjwfjbuhvviggeu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579234.6176245-3564-77356100881918/AnsiballZ_edpm_nftables_from_files.py
Feb 20 09:20:35 np0005625204.localdomain sudo[208938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:35 np0005625204.localdomain python3[208940]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 20 09:20:35 np0005625204.localdomain sudo[208938]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20846 DF PROTO=TCP SPT=46028 DPT=9101 SEQ=1052269825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599395D70000000001030307) 
Feb 20 09:20:37 np0005625204.localdomain sudo[209048]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjmhssydbimszgoicsglmbsodywzbqtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579237.4218612-3588-156157169492723/AnsiballZ_stat.py
Feb 20 09:20:37 np0005625204.localdomain sudo[209048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:37 np0005625204.localdomain python3.9[209050]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:37 np0005625204.localdomain sudo[209048]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:38 np0005625204.localdomain sudo[209105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdfsvbiinnbctzqlhqfnwwrykpxpekgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579237.4218612-3588-156157169492723/AnsiballZ_file.py
Feb 20 09:20:38 np0005625204.localdomain sudo[209105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:38 np0005625204.localdomain python3.9[209107]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:38 np0005625204.localdomain sudo[209105]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20848 DF PROTO=TCP SPT=46028 DPT=9101 SEQ=1052269825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993A1E80000000001030307) 
Feb 20 09:20:39 np0005625204.localdomain sudo[209215]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdsacsnnnblczdkfjcbhvomgahpsnzsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579238.9304204-3624-174519557802430/AnsiballZ_stat.py
Feb 20 09:20:39 np0005625204.localdomain sudo[209215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:20:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:20:39 np0005625204.localdomain systemd[1]: tmp-crun.eMPfXX.mount: Deactivated successfully.
Feb 20 09:20:39 np0005625204.localdomain podman[209219]: 2026-02-20 09:20:39.67276003 +0000 UTC m=+0.098991760 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:20:39 np0005625204.localdomain podman[209219]: 2026-02-20 09:20:39.679826189 +0000 UTC m=+0.106057919 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:20:39 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:20:39 np0005625204.localdomain python3.9[209217]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:39 np0005625204.localdomain podman[209218]: 2026-02-20 09:20:39.76616624 +0000 UTC m=+0.192421861 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:20:39 np0005625204.localdomain sudo[209215]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:39 np0005625204.localdomain podman[209218]: 2026-02-20 09:20:39.798955369 +0000 UTC m=+0.225210990 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:20:39 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:20:40 np0005625204.localdomain sudo[209347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcrwvlymlkfcglqcqxtfiuasvmxjvbum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579238.9304204-3624-174519557802430/AnsiballZ_copy.py
Feb 20 09:20:40 np0005625204.localdomain sudo[209347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:40 np0005625204.localdomain python3.9[209349]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579238.9304204-3624-174519557802430/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:40 np0005625204.localdomain sudo[209347]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:40 np0005625204.localdomain sudo[209457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzkvtsrbguuwggxcpkxircfdexfemzpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579240.5262446-3669-192165599730887/AnsiballZ_stat.py
Feb 20 09:20:40 np0005625204.localdomain sudo[209457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:41 np0005625204.localdomain python3.9[209459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:41 np0005625204.localdomain sudo[209457]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:41 np0005625204.localdomain sudo[209462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:20:41 np0005625204.localdomain sudo[209462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:20:41 np0005625204.localdomain sudo[209462]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:41 np0005625204.localdomain sudo[209480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:20:41 np0005625204.localdomain sudo[209480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:20:41 np0005625204.localdomain sudo[209550]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-givaqyealikozjopohvlayjguprftiil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579240.5262446-3669-192165599730887/AnsiballZ_file.py
Feb 20 09:20:41 np0005625204.localdomain sudo[209550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:41 np0005625204.localdomain python3.9[209552]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:41 np0005625204.localdomain sudo[209550]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63481 DF PROTO=TCP SPT=56584 DPT=9105 SEQ=1510305160 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993AB690000000001030307) 
Feb 20 09:20:41 np0005625204.localdomain sudo[209480]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:42 np0005625204.localdomain sudo[209691]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fcorrsusyxjhzlcrhmefiwedoawdiapp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579241.8453863-3704-208105831271550/AnsiballZ_stat.py
Feb 20 09:20:42 np0005625204.localdomain sudo[209691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:42 np0005625204.localdomain python3.9[209693]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:42 np0005625204.localdomain sudo[209691]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:42 np0005625204.localdomain sudo[209749]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emnwjrerinjiugewaqdswoaubqejvgos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579241.8453863-3704-208105831271550/AnsiballZ_file.py
Feb 20 09:20:42 np0005625204.localdomain sudo[209749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:42 np0005625204.localdomain sudo[209748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:20:42 np0005625204.localdomain sudo[209748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:20:42 np0005625204.localdomain sudo[209748]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:42 np0005625204.localdomain python3.9[209766]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:42 np0005625204.localdomain sudo[209749]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:44 np0005625204.localdomain sudo[209876]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abkmabptjqeyvuvqgqrtbgkaqbhujthz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579243.0479882-3741-33841473204825/AnsiballZ_stat.py
Feb 20 09:20:44 np0005625204.localdomain sudo[209876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:44 np0005625204.localdomain python3.9[209878]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:44 np0005625204.localdomain sudo[209876]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:44 np0005625204.localdomain sudo[209966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwswdqcozhvtqmckvwrebuqfcxkpxpee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579243.0479882-3741-33841473204825/AnsiballZ_copy.py
Feb 20 09:20:44 np0005625204.localdomain sudo[209966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:44 np0005625204.localdomain python3.9[209968]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579243.0479882-3741-33841473204825/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:44 np0005625204.localdomain sudo[209966]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:45 np0005625204.localdomain sudo[210076]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hiuouggtxzefhzoxruzkysazzfvsujad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579245.1789503-3786-38857663266459/AnsiballZ_file.py
Feb 20 09:20:45 np0005625204.localdomain sudo[210076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:45 np0005625204.localdomain python3.9[210078]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:45 np0005625204.localdomain sudo[210076]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:45 np0005625204.localdomain sshd[210079]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:20:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30004 DF PROTO=TCP SPT=38372 DPT=9102 SEQ=2768683098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993BB680000000001030307) 
Feb 20 09:20:46 np0005625204.localdomain sshd[210079]: Received disconnect from 96.78.175.36 port 36324:11: Bye Bye [preauth]
Feb 20 09:20:46 np0005625204.localdomain sshd[210079]: Disconnected from authenticating user root 96.78.175.36 port 36324 [preauth]
Feb 20 09:20:46 np0005625204.localdomain sudo[210188]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpdugowmuraoeycofwhxmieheyppuwhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579246.5843327-3810-261685977285368/AnsiballZ_command.py
Feb 20 09:20:46 np0005625204.localdomain sudo[210188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:47 np0005625204.localdomain python3.9[210190]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:47 np0005625204.localdomain sudo[210188]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53682 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993C25A0000000001030307) 
Feb 20 09:20:47 np0005625204.localdomain sudo[210301]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocpsqopulldrsawrvxhqxhbgnpctmnlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579247.3609595-3834-20271184835209/AnsiballZ_blockinfile.py
Feb 20 09:20:47 np0005625204.localdomain sudo[210301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:47 np0005625204.localdomain sshd[210304]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:20:47 np0005625204.localdomain python3.9[210303]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:47 np0005625204.localdomain sudo[210301]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:48 np0005625204.localdomain sshd[210304]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:20:48 np0005625204.localdomain sshd[210323]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:20:48 np0005625204.localdomain sudo[210415]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-doxatzgaogqfprwmxmysybsgasqncmdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579248.397172-3861-166782416314688/AnsiballZ_command.py
Feb 20 09:20:48 np0005625204.localdomain sudo[210415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:48 np0005625204.localdomain python3.9[210417]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:48 np0005625204.localdomain sudo[210415]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:49 np0005625204.localdomain sudo[210526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-huqtrfwbawfyppugghabfsaszqcwwtvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579249.12928-3885-54667079550471/AnsiballZ_stat.py
Feb 20 09:20:49 np0005625204.localdomain sudo[210526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:49 np0005625204.localdomain python3.9[210528]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:20:49 np0005625204.localdomain sudo[210526]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:49 np0005625204.localdomain sshd[210323]: Invalid user httpd from 27.112.79.3 port 58060
Feb 20 09:20:49 np0005625204.localdomain sshd[210323]: Received disconnect from 27.112.79.3 port 58060:11: Bye Bye [preauth]
Feb 20 09:20:49 np0005625204.localdomain sshd[210323]: Disconnected from invalid user httpd 27.112.79.3 port 58060 [preauth]
Feb 20 09:20:50 np0005625204.localdomain sudo[210638]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emsvgvwkftlmhjjdklevdhuujlitoeer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579249.855555-3908-270688176022768/AnsiballZ_command.py
Feb 20 09:20:50 np0005625204.localdomain sudo[210638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:50 np0005625204.localdomain python3.9[210640]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:20:50 np0005625204.localdomain sudo[210638]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53684 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993CE680000000001030307) 
Feb 20 09:20:50 np0005625204.localdomain sudo[210751]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiycfceocdobdcvsfskexjatwszritop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579250.5766578-3933-3711807100279/AnsiballZ_file.py
Feb 20 09:20:50 np0005625204.localdomain sudo[210751]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:51 np0005625204.localdomain python3.9[210753]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:51 np0005625204.localdomain sudo[210751]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:51 np0005625204.localdomain sudo[210861]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moitpmxgiykmetxhvzdkpftogbnohflq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579251.2760997-3957-162186007319157/AnsiballZ_stat.py
Feb 20 09:20:51 np0005625204.localdomain sudo[210861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:51 np0005625204.localdomain python3.9[210863]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:51 np0005625204.localdomain sudo[210861]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:52 np0005625204.localdomain sudo[210949]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mduwcuvkxuerazlzjpuprabaeocsrroh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579251.2760997-3957-162186007319157/AnsiballZ_copy.py
Feb 20 09:20:52 np0005625204.localdomain sudo[210949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:52 np0005625204.localdomain python3.9[210951]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579251.2760997-3957-162186007319157/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:52 np0005625204.localdomain sudo[210949]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:52 np0005625204.localdomain sudo[211059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srpdnsdjkxvxiuoyfqnsebmvudlxeyvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579252.4943192-4002-245225278398161/AnsiballZ_stat.py
Feb 20 09:20:52 np0005625204.localdomain sudo[211059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:52 np0005625204.localdomain python3.9[211061]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:52 np0005625204.localdomain sudo[211059]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:53 np0005625204.localdomain sudo[211147]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxofrqcufnnqtrudfsokmrdakkhpgkpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579252.4943192-4002-245225278398161/AnsiballZ_copy.py
Feb 20 09:20:53 np0005625204.localdomain sudo[211147]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:53 np0005625204.localdomain python3.9[211149]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579252.4943192-4002-245225278398161/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:53 np0005625204.localdomain sudo[211147]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:53 np0005625204.localdomain sudo[211257]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htnzkwlonengnbteingvfnevpjwpfxjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579253.7050261-4048-69547586338565/AnsiballZ_stat.py
Feb 20 09:20:53 np0005625204.localdomain sudo[211257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:54 np0005625204.localdomain python3.9[211259]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:20:54 np0005625204.localdomain sudo[211257]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:54 np0005625204.localdomain sudo[211345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vawankfjpyquinijcqfvnwzgzinkxigs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579253.7050261-4048-69547586338565/AnsiballZ_copy.py
Feb 20 09:20:54 np0005625204.localdomain sudo[211345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:54 np0005625204.localdomain python3.9[211347]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579253.7050261-4048-69547586338565/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:20:54 np0005625204.localdomain sudo[211345]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53685 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993DE280000000001030307) 
Feb 20 09:20:55 np0005625204.localdomain sudo[211455]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvlvnjyfdbahkuduwdwbhogutvfmliif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579255.1797602-4092-256718666822/AnsiballZ_systemd.py
Feb 20 09:20:55 np0005625204.localdomain sudo[211455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:55 np0005625204.localdomain python3.9[211457]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:55 np0005625204.localdomain systemd-rc-local-generator[211481]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:55 np0005625204.localdomain systemd-sysv-generator[211486]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:56 np0005625204.localdomain systemd[1]: Reached target edpm_libvirt.target.
Feb 20 09:20:56 np0005625204.localdomain sudo[211455]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:56 np0005625204.localdomain sudo[211605]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdcxxxthsmhudkonjczigtdzbvrhxgyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579256.3700264-4116-218059108986532/AnsiballZ_systemd.py
Feb 20 09:20:56 np0005625204.localdomain sudo[211605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:20:57 np0005625204.localdomain python3.9[211607]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:57 np0005625204.localdomain systemd-rc-local-generator[211636]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:57 np0005625204.localdomain systemd-sysv-generator[211639]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47117 DF PROTO=TCP SPT=58588 DPT=9105 SEQ=2990462913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993E9280000000001030307) 
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:20:57 np0005625204.localdomain systemd-rc-local-generator[211672]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:20:57 np0005625204.localdomain systemd-sysv-generator[211676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:20:57 np0005625204.localdomain sudo[211605]: pam_unix(sudo:session): session closed for user root
Feb 20 09:20:58 np0005625204.localdomain sshd[163077]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:20:58 np0005625204.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Feb 20 09:20:58 np0005625204.localdomain systemd[1]: session-53.scope: Consumed 3min 23.876s CPU time.
Feb 20 09:20:58 np0005625204.localdomain systemd-logind[759]: Session 53 logged out. Waiting for processes to exit.
Feb 20 09:20:58 np0005625204.localdomain systemd-logind[759]: Removed session 53.
Feb 20 09:20:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47118 DF PROTO=TCP SPT=58588 DPT=9105 SEQ=2990462913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993F1290000000001030307) 
Feb 20 09:21:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53686 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=1854450508 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5993FF680000000001030307) 
Feb 20 09:21:03 np0005625204.localdomain sshd[211700]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:21:03 np0005625204.localdomain sshd[211700]: Accepted publickey for zuul from 192.168.122.30 port 45794 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:21:03 np0005625204.localdomain systemd-logind[759]: New session 54 of user zuul.
Feb 20 09:21:03 np0005625204.localdomain systemd[1]: Started Session 54 of User zuul.
Feb 20 09:21:04 np0005625204.localdomain sshd[211700]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:21:04 np0005625204.localdomain python3.9[211811]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:21:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:21:05.977 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:21:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:21:05.978 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:21:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:21:05.980 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:21:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64752 DF PROTO=TCP SPT=50190 DPT=9101 SEQ=781888736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59940B050000000001030307) 
Feb 20 09:21:06 np0005625204.localdomain python3.9[211923]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:21:06 np0005625204.localdomain network[211940]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:21:06 np0005625204.localdomain network[211941]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:21:06 np0005625204.localdomain network[211942]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:21:08 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:21:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64754 DF PROTO=TCP SPT=50190 DPT=9101 SEQ=781888736 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599417280000000001030307) 
Feb 20 09:21:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:21:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:21:09 np0005625204.localdomain podman[212058]: 2026-02-20 09:21:09.855131074 +0000 UTC m=+0.106551355 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 20 09:21:09 np0005625204.localdomain podman[212058]: 2026-02-20 09:21:09.891089896 +0000 UTC m=+0.142510177 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 20 09:21:09 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:21:09 np0005625204.localdomain podman[212078]: 2026-02-20 09:21:09.95707135 +0000 UTC m=+0.091920143 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:21:10 np0005625204.localdomain podman[212078]: 2026-02-20 09:21:10.020669315 +0000 UTC m=+0.155518118 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:21:10 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:21:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47120 DF PROTO=TCP SPT=58588 DPT=9105 SEQ=2990462913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599421680000000001030307) 
Feb 20 09:21:13 np0005625204.localdomain sudo[212213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jydwahzkjkuyxmgyonufruivzxtckilg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579272.9800823-98-114193599398011/AnsiballZ_setup.py
Feb 20 09:21:13 np0005625204.localdomain sudo[212213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:13 np0005625204.localdomain python3.9[212215]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:21:13 np0005625204.localdomain sudo[212213]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:14 np0005625204.localdomain sudo[212276]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekepxureacyxughdvhbibgxmlnlczifg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579272.9800823-98-114193599398011/AnsiballZ_dnf.py
Feb 20 09:21:14 np0005625204.localdomain sudo[212276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:14 np0005625204.localdomain python3.9[212278]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:21:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28353 DF PROTO=TCP SPT=40082 DPT=9102 SEQ=2778869450 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599431680000000001030307) 
Feb 20 09:21:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52001 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994378A0000000001030307) 
Feb 20 09:21:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52003 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599443A80000000001030307) 
Feb 20 09:21:21 np0005625204.localdomain sudo[212276]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:22 np0005625204.localdomain sudo[212388]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgfjoamzanryxlvfkrzijoxuowhcdues ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579281.978611-135-126371382962315/AnsiballZ_stat.py
Feb 20 09:21:22 np0005625204.localdomain sudo[212388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:22 np0005625204.localdomain python3.9[212390]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:21:22 np0005625204.localdomain sudo[212388]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:23 np0005625204.localdomain sudo[212500]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnrvsuxlypceyaaabyiqjrueyhtkudpl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579282.8138487-159-167444015054211/AnsiballZ_copy.py
Feb 20 09:21:23 np0005625204.localdomain sudo[212500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:23 np0005625204.localdomain python3.9[212502]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:23 np0005625204.localdomain sudo[212500]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:24 np0005625204.localdomain sudo[212610]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjlomchooakfstnyxouopuhcvhaivycv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579283.6714442-182-279435888388476/AnsiballZ_command.py
Feb 20 09:21:24 np0005625204.localdomain sudo[212610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:24 np0005625204.localdomain python3.9[212612]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:21:24 np0005625204.localdomain sudo[212610]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:24 np0005625204.localdomain sudo[212721]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hangxuyjlbniisczxcibhmujtodtjmmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579284.425542-207-67477442365465/AnsiballZ_command.py
Feb 20 09:21:24 np0005625204.localdomain sudo[212721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52004 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599453680000000001030307) 
Feb 20 09:21:24 np0005625204.localdomain python3.9[212723]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:21:24 np0005625204.localdomain sudo[212721]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:25 np0005625204.localdomain sudo[212832]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqlzbjievpfrksneqjussyoctnqrrkeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579285.0209103-231-26189100070477/AnsiballZ_command.py
Feb 20 09:21:25 np0005625204.localdomain sudo[212832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:25 np0005625204.localdomain python3.9[212834]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:21:25 np0005625204.localdomain sudo[212832]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:26 np0005625204.localdomain sudo[212943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqbhgbnjsilwonjrrvvgrzemhjmgfgog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579285.760458-258-222320936919777/AnsiballZ_stat.py
Feb 20 09:21:26 np0005625204.localdomain sudo[212943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:26 np0005625204.localdomain python3.9[212945]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:21:26 np0005625204.localdomain sudo[212943]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:26 np0005625204.localdomain sudo[213055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkbybsuxwdefkydlsenrspnggykgexhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579286.6250665-291-206502037263015/AnsiballZ_lineinfile.py
Feb 20 09:21:26 np0005625204.localdomain sudo[213055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:27 np0005625204.localdomain python3.9[213057]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:27 np0005625204.localdomain sudo[213055]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36335 DF PROTO=TCP SPT=50972 DPT=9105 SEQ=4082688825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59945E280000000001030307) 
Feb 20 09:21:27 np0005625204.localdomain sshd[213119]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:21:28 np0005625204.localdomain sudo[213167]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cknmdtctyjlkovcshhwyskbtyjlulvck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579287.4782298-318-277643106225976/AnsiballZ_systemd_service.py
Feb 20 09:21:28 np0005625204.localdomain sudo[213167]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:28 np0005625204.localdomain python3.9[213169]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:21:28 np0005625204.localdomain sshd[213119]: Invalid user sol from 45.148.10.240 port 39808
Feb 20 09:21:28 np0005625204.localdomain sshd[213119]: Connection closed by invalid user sol 45.148.10.240 port 39808 [preauth]
Feb 20 09:21:29 np0005625204.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 20 09:21:29 np0005625204.localdomain sudo[213167]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36336 DF PROTO=TCP SPT=50972 DPT=9105 SEQ=4082688825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599466280000000001030307) 
Feb 20 09:21:31 np0005625204.localdomain sudo[213281]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeyqksybfrsdcvepzaafsaebjpvlrhhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579289.6713486-342-197053451098466/AnsiballZ_systemd_service.py
Feb 20 09:21:31 np0005625204.localdomain sudo[213281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:31 np0005625204.localdomain python3.9[213283]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:21:31 np0005625204.localdomain sshd[213287]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:21:31 np0005625204.localdomain systemd-rc-local-generator[213315]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:21:31 np0005625204.localdomain systemd-sysv-generator[213318]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:31 np0005625204.localdomain sshd[213287]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: Starting Open-iSCSI...
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: If using hardware iscsi like qla4xxx this message can be ignored.
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Feb 20 09:21:31 np0005625204.localdomain iscsid[213326]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: Started Open-iSCSI.
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 20 09:21:31 np0005625204.localdomain systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 20 09:21:32 np0005625204.localdomain sudo[213281]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52005 DF PROTO=TCP SPT=54810 DPT=9882 SEQ=1342684072 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599473690000000001030307) 
Feb 20 09:21:33 np0005625204.localdomain python3.9[213437]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:21:33 np0005625204.localdomain network[213454]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:21:33 np0005625204.localdomain network[213455]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:21:33 np0005625204.localdomain network[213456]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:21:33 np0005625204.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 20 09:21:34 np0005625204.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 20 09:21:34 np0005625204.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service.
Feb 20 09:21:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:21:35 np0005625204.localdomain setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480
Feb 20 09:21:35 np0005625204.localdomain setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Feb 20 09:21:35 np0005625204.localdomain setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480
Feb 20 09:21:35 np0005625204.localdomain setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Feb 20 09:21:35 np0005625204.localdomain setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l b9094f2d-0c51-478f-92b6-4506d012d480
Feb 20 09:21:35 np0005625204.localdomain setroubleshoot[213470]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Feb 20 09:21:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22408 DF PROTO=TCP SPT=57456 DPT=9101 SEQ=2747713421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599480340000000001030307) 
Feb 20 09:21:37 np0005625204.localdomain sudo[213703]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmvptnlgbfjjfqcwxzwvldarjrxnzsfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579297.5764332-410-94586480592157/AnsiballZ_dnf.py
Feb 20 09:21:37 np0005625204.localdomain sudo[213703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:38 np0005625204.localdomain python3.9[213705]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:21:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22410 DF PROTO=TCP SPT=57456 DPT=9101 SEQ=2747713421 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59948C290000000001030307) 
Feb 20 09:21:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:21:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:21:40 np0005625204.localdomain podman[213709]: 2026-02-20 09:21:40.157680992 +0000 UTC m=+0.090504504 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:21:40 np0005625204.localdomain podman[213709]: 2026-02-20 09:21:40.18784259 +0000 UTC m=+0.120666112 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:21:40 np0005625204.localdomain systemd[1]: tmp-crun.QaNOQz.mount: Deactivated successfully.
Feb 20 09:21:40 np0005625204.localdomain podman[213708]: 2026-02-20 09:21:40.202854846 +0000 UTC m=+0.135250995 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 20 09:21:40 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:21:40 np0005625204.localdomain podman[213708]: 2026-02-20 09:21:40.306448967 +0000 UTC m=+0.238845106 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:21:40 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:21:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36338 DF PROTO=TCP SPT=50972 DPT=9105 SEQ=4082688825 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599495680000000001030307) 
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:21:42 np0005625204.localdomain systemd-sysv-generator[213790]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:21:42 np0005625204.localdomain systemd-rc-local-generator[213786]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: run-rc5ef00701b6d49138398471c696372cc.service: Deactivated successfully.
Feb 20 09:21:42 np0005625204.localdomain systemd[1]: run-re412e902847f4578b27d977c8b3bcfb2.service: Deactivated successfully.
Feb 20 09:21:42 np0005625204.localdomain sudo[213929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:21:42 np0005625204.localdomain sudo[213929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:21:42 np0005625204.localdomain sudo[213929]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:42 np0005625204.localdomain sudo[213947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:21:42 np0005625204.localdomain sudo[213947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:21:43 np0005625204.localdomain sudo[213947]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:43 np0005625204.localdomain sudo[213703]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:44 np0005625204.localdomain sudo[214103]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrxjsvxnvrgccadxfrfdmsznqkhgjjsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579303.9771433-437-98656616364050/AnsiballZ_file.py
Feb 20 09:21:44 np0005625204.localdomain sudo[214103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:44 np0005625204.localdomain python3.9[214105]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 09:21:44 np0005625204.localdomain sudo[214103]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:45 np0005625204.localdomain sudo[214213]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btypsfcmhckogiootgdvgvwfyjattpzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579304.8945768-462-64633311182205/AnsiballZ_modprobe.py
Feb 20 09:21:45 np0005625204.localdomain sudo[214213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:45 np0005625204.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@2.service: Deactivated successfully.
Feb 20 09:21:45 np0005625204.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 20 09:21:45 np0005625204.localdomain python3.9[214215]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 20 09:21:45 np0005625204.localdomain sudo[214213]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17525 DF PROTO=TCP SPT=52864 DPT=9102 SEQ=2001544759 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994A5680000000001030307) 
Feb 20 09:21:46 np0005625204.localdomain sudo[214327]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aehwqgwrjtxkblrdpceywdkeufcqqclk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579305.7922926-486-185059126350890/AnsiballZ_stat.py
Feb 20 09:21:46 np0005625204.localdomain sudo[214327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:46 np0005625204.localdomain python3.9[214329]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:21:46 np0005625204.localdomain sudo[214327]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:46 np0005625204.localdomain sudo[214363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:21:46 np0005625204.localdomain sudo[214363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:21:46 np0005625204.localdomain sudo[214363]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:46 np0005625204.localdomain sudo[214433]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dnelpsytgvjexxnmegrolvxdmnofboeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579305.7922926-486-185059126350890/AnsiballZ_copy.py
Feb 20 09:21:46 np0005625204.localdomain sudo[214433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:46 np0005625204.localdomain python3.9[214435]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579305.7922926-486-185059126350890/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:46 np0005625204.localdomain sudo[214433]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:47 np0005625204.localdomain sudo[214543]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjvophmtqqbxidaaxoilwwmlmyzapaqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579307.2253113-534-43548639564034/AnsiballZ_lineinfile.py
Feb 20 09:21:47 np0005625204.localdomain sudo[214543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52973 DF PROTO=TCP SPT=40052 DPT=9882 SEQ=2359454866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994ACBA0000000001030307) 
Feb 20 09:21:47 np0005625204.localdomain python3.9[214545]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:47 np0005625204.localdomain sudo[214543]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:48 np0005625204.localdomain sudo[214653]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqouwhjvswpisjfyqvpwutgucvlxsmjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579308.3310714-557-82934213940696/AnsiballZ_systemd.py
Feb 20 09:21:48 np0005625204.localdomain sudo[214653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:49 np0005625204.localdomain python3.9[214655]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:21:49 np0005625204.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 09:21:49 np0005625204.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 20 09:21:49 np0005625204.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 20 09:21:49 np0005625204.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 20 09:21:49 np0005625204.localdomain systemd-modules-load[214659]: Module 'msr' is built in
Feb 20 09:21:49 np0005625204.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 20 09:21:49 np0005625204.localdomain sudo[214653]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52975 DF PROTO=TCP SPT=40052 DPT=9882 SEQ=2359454866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994B8A90000000001030307) 
Feb 20 09:21:50 np0005625204.localdomain sudo[214768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eutdhfyfmltkvcnhigrbuhlnorjestlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579310.6216137-583-47779320702751/AnsiballZ_command.py
Feb 20 09:21:50 np0005625204.localdomain sudo[214768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:51 np0005625204.localdomain python3.9[214770]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:21:51 np0005625204.localdomain sudo[214768]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:51 np0005625204.localdomain sudo[214879]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngzbjvttgfyasidkfpiqdjpgzpmtotvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579311.702373-612-170847186153363/AnsiballZ_stat.py
Feb 20 09:21:51 np0005625204.localdomain sudo[214879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:52 np0005625204.localdomain python3.9[214881]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:21:52 np0005625204.localdomain sudo[214879]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:52 np0005625204.localdomain sudo[214989]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bykcpizwxhtmzegeguobsdtoruafomac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579312.475965-638-18158503919149/AnsiballZ_stat.py
Feb 20 09:21:52 np0005625204.localdomain sudo[214989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:52 np0005625204.localdomain python3.9[214991]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:21:52 np0005625204.localdomain sudo[214989]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:53 np0005625204.localdomain sudo[215077]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkujlwyiumuvqrolebmrutvvnscninyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579312.475965-638-18158503919149/AnsiballZ_copy.py
Feb 20 09:21:53 np0005625204.localdomain sudo[215077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:53 np0005625204.localdomain python3.9[215079]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579312.475965-638-18158503919149/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:53 np0005625204.localdomain sudo[215077]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:54 np0005625204.localdomain sudo[215187]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqwckjtnrisqynxhvxwtsqzwrbtsfzso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579313.7266142-684-72219969067170/AnsiballZ_command.py
Feb 20 09:21:54 np0005625204.localdomain sudo[215187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:54 np0005625204.localdomain python3.9[215189]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:21:54 np0005625204.localdomain sudo[215187]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:54 np0005625204.localdomain sudo[215298]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgunpkzldouxyudaejrkvmshlpkonvsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579314.4324684-707-17488090641971/AnsiballZ_lineinfile.py
Feb 20 09:21:54 np0005625204.localdomain sudo[215298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52976 DF PROTO=TCP SPT=40052 DPT=9882 SEQ=2359454866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994C8680000000001030307) 
Feb 20 09:21:54 np0005625204.localdomain python3.9[215300]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:54 np0005625204.localdomain sudo[215298]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:55 np0005625204.localdomain sudo[215408]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vejhnivrxmwidrzglnnebrpsuyrrhomj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579315.1430838-732-232802327297902/AnsiballZ_replace.py
Feb 20 09:21:55 np0005625204.localdomain sudo[215408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:55 np0005625204.localdomain python3.9[215410]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:55 np0005625204.localdomain sudo[215408]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:55 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 20 09:21:55 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:21:55 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:21:55 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:21:56 np0005625204.localdomain sudo[215519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bemmhwcejjcpcgdmlsjqzkyygcccwoaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579315.9594874-756-43596395297261/AnsiballZ_replace.py
Feb 20 09:21:56 np0005625204.localdomain sudo[215519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:56 np0005625204.localdomain python3.9[215521]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:56 np0005625204.localdomain sudo[215519]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:57 np0005625204.localdomain sudo[215629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-colaztjxcbsbiknpldhpniqesqrwffsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579316.7214317-783-190919213808029/AnsiballZ_lineinfile.py
Feb 20 09:21:57 np0005625204.localdomain sudo[215629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:57 np0005625204.localdomain python3.9[215631]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:57 np0005625204.localdomain sudo[215629]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19709 DF PROTO=TCP SPT=51054 DPT=9105 SEQ=3258922464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994D3690000000001030307) 
Feb 20 09:21:58 np0005625204.localdomain sudo[215739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kslpemmawvmiqiscizveijxbsiulwoce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579317.441908-783-279347235409233/AnsiballZ_lineinfile.py
Feb 20 09:21:58 np0005625204.localdomain sudo[215739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:58 np0005625204.localdomain python3.9[215741]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:58 np0005625204.localdomain sudo[215739]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:58 np0005625204.localdomain sudo[215849]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqvjspsvzlufcbqlqrsvotmtwqihmkwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579318.5595677-783-236594548483778/AnsiballZ_lineinfile.py
Feb 20 09:21:58 np0005625204.localdomain sudo[215849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:59 np0005625204.localdomain python3.9[215851]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:59 np0005625204.localdomain sudo[215849]: pam_unix(sudo:session): session closed for user root
Feb 20 09:21:59 np0005625204.localdomain sudo[215959]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzfylpzdyfuzoznyqummglagjtlkpjgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579319.1885846-783-164467480092955/AnsiballZ_lineinfile.py
Feb 20 09:21:59 np0005625204.localdomain sudo[215959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:21:59 np0005625204.localdomain python3.9[215961]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:21:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19710 DF PROTO=TCP SPT=51054 DPT=9105 SEQ=3258922464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994DB680000000001030307) 
Feb 20 09:21:59 np0005625204.localdomain sudo[215959]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:00 np0005625204.localdomain sudo[216069]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bskqkyixlwerqejcyybhurbozdibohjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579320.2994099-870-259419351602517/AnsiballZ_stat.py
Feb 20 09:22:00 np0005625204.localdomain sudo[216069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:00 np0005625204.localdomain python3.9[216071]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:22:00 np0005625204.localdomain sudo[216069]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:01 np0005625204.localdomain sudo[216181]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpfujkdxfzbhkscvfwjgupdsgkcgablp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579320.9688213-893-44730041511537/AnsiballZ_command.py
Feb 20 09:22:01 np0005625204.localdomain sudo[216181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:01 np0005625204.localdomain python3.9[216183]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:01 np0005625204.localdomain sudo[216181]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:02 np0005625204.localdomain sudo[216292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqsbneqfuhgonosfmqvakxosacpiomgt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579321.832772-921-144915724085750/AnsiballZ_systemd_service.py
Feb 20 09:22:02 np0005625204.localdomain sudo[216292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:02 np0005625204.localdomain python3.9[216294]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:02 np0005625204.localdomain systemd[1]: Listening on multipathd control socket.
Feb 20 09:22:02 np0005625204.localdomain sudo[216292]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52604 DF PROTO=TCP SPT=60274 DPT=9100 SEQ=446721079 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994E7680000000001030307) 
Feb 20 09:22:04 np0005625204.localdomain sudo[216406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krszehyyagmkanchpvsnhxxvgcjnbgvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579323.7325916-945-159137476537637/AnsiballZ_systemd_service.py
Feb 20 09:22:04 np0005625204.localdomain sudo[216406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:04 np0005625204.localdomain python3.9[216408]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:05 np0005625204.localdomain systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 20 09:22:05 np0005625204.localdomain udevadm[216413]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 20 09:22:05 np0005625204.localdomain systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 20 09:22:05 np0005625204.localdomain systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 20 09:22:05 np0005625204.localdomain multipathd[216416]: --------start up--------
Feb 20 09:22:05 np0005625204.localdomain multipathd[216416]: read /etc/multipath.conf
Feb 20 09:22:05 np0005625204.localdomain multipathd[216416]: path checkers start up
Feb 20 09:22:05 np0005625204.localdomain systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 20 09:22:05 np0005625204.localdomain sudo[216406]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:22:05.978 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:22:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:22:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:22:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:22:05.982 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:22:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20791 DF PROTO=TCP SPT=37032 DPT=9101 SEQ=76710761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5994F5650000000001030307) 
Feb 20 09:22:06 np0005625204.localdomain sudo[216532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkwhfqjctzrziqxlguilsughhxlhkycu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579326.3869414-981-281323807963805/AnsiballZ_file.py
Feb 20 09:22:06 np0005625204.localdomain sudo[216532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:06 np0005625204.localdomain python3.9[216534]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 09:22:06 np0005625204.localdomain sudo[216532]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:07 np0005625204.localdomain sudo[216642]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcssunyybddnuhgrgxpcykanqfpmqevq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579327.0919578-1005-98036244997467/AnsiballZ_modprobe.py
Feb 20 09:22:07 np0005625204.localdomain sudo[216642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:07 np0005625204.localdomain python3.9[216644]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 20 09:22:07 np0005625204.localdomain sshd[216645]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:22:07 np0005625204.localdomain sudo[216642]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:08 np0005625204.localdomain sudo[216762]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnhzmqwjsqwlfkkdsjmjxazwxqalumem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579327.896345-1028-43114478183607/AnsiballZ_stat.py
Feb 20 09:22:08 np0005625204.localdomain sudo[216762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:08 np0005625204.localdomain python3.9[216764]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:22:08 np0005625204.localdomain sudo[216762]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:08 np0005625204.localdomain sshd[216781]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:22:08 np0005625204.localdomain sudo[216852]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsdcivhbhoceqgxlxpsbsutnjuducwzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579327.896345-1028-43114478183607/AnsiballZ_copy.py
Feb 20 09:22:08 np0005625204.localdomain sudo[216852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:08 np0005625204.localdomain sshd[216645]: Invalid user young from 188.166.218.64 port 36232
Feb 20 09:22:08 np0005625204.localdomain python3.9[216854]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579327.896345-1028-43114478183607/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:08 np0005625204.localdomain sudo[216852]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:09 np0005625204.localdomain sshd[216645]: Received disconnect from 188.166.218.64 port 36232:11: Bye Bye [preauth]
Feb 20 09:22:09 np0005625204.localdomain sshd[216645]: Disconnected from invalid user young 188.166.218.64 port 36232 [preauth]
Feb 20 09:22:09 np0005625204.localdomain sshd[216781]: Received disconnect from 54.36.99.29 port 45000:11: Bye Bye [preauth]
Feb 20 09:22:09 np0005625204.localdomain sshd[216781]: Disconnected from authenticating user root 54.36.99.29 port 45000 [preauth]
Feb 20 09:22:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20793 DF PROTO=TCP SPT=37032 DPT=9101 SEQ=76710761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599501680000000001030307) 
Feb 20 09:22:10 np0005625204.localdomain sudo[216962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwzphsdhnuohijvacdfcgoeoombtphye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579329.7491176-1076-8346211747347/AnsiballZ_lineinfile.py
Feb 20 09:22:10 np0005625204.localdomain sudo[216962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:10 np0005625204.localdomain python3.9[216964]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:10 np0005625204.localdomain sudo[216962]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:10 np0005625204.localdomain sudo[217072]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siszhfgjfnsqclkviuouilvrbjzuxcoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579330.401786-1101-235273392950604/AnsiballZ_systemd.py
Feb 20 09:22:10 np0005625204.localdomain sudo[217072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:22:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:22:10 np0005625204.localdomain sshd[217092]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:22:10 np0005625204.localdomain podman[217076]: 2026-02-20 09:22:10.840774925 +0000 UTC m=+0.085102096 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent)
Feb 20 09:22:10 np0005625204.localdomain systemd[1]: tmp-crun.FZp9jc.mount: Deactivated successfully.
Feb 20 09:22:10 np0005625204.localdomain podman[217075]: 2026-02-20 09:22:10.914111015 +0000 UTC m=+0.158971993 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:22:10 np0005625204.localdomain podman[217076]: 2026-02-20 09:22:10.923247859 +0000 UTC m=+0.167575020 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:22:10 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:22:10 np0005625204.localdomain podman[217075]: 2026-02-20 09:22:10.950510927 +0000 UTC m=+0.195371895 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Feb 20 09:22:10 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:22:11 np0005625204.localdomain sshd[217092]: Invalid user n8n from 18.221.252.160 port 38336
Feb 20 09:22:11 np0005625204.localdomain sshd[217092]: Received disconnect from 18.221.252.160 port 38336:11: Bye Bye [preauth]
Feb 20 09:22:11 np0005625204.localdomain sshd[217092]: Disconnected from invalid user n8n 18.221.252.160 port 38336 [preauth]
Feb 20 09:22:11 np0005625204.localdomain python3.9[217074]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:22:11 np0005625204.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 20 09:22:11 np0005625204.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 20 09:22:11 np0005625204.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 20 09:22:11 np0005625204.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 20 09:22:11 np0005625204.localdomain systemd-modules-load[217122]: Module 'msr' is built in
Feb 20 09:22:11 np0005625204.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 20 09:22:11 np0005625204.localdomain sudo[217072]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19712 DF PROTO=TCP SPT=51054 DPT=9105 SEQ=3258922464 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59950B680000000001030307) 
Feb 20 09:22:12 np0005625204.localdomain sudo[217230]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-woczvnznvvttqqzeeevvncprqmoincgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579331.8366046-1125-18688547629716/AnsiballZ_dnf.py
Feb 20 09:22:12 np0005625204.localdomain sudo[217230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:12 np0005625204.localdomain python3.9[217232]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:22:12 np0005625204.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 20 09:22:14 np0005625204.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 20 09:22:15 np0005625204.localdomain sshd[217237]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:22:15 np0005625204.localdomain sshd[217237]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:22:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36937 DF PROTO=TCP SPT=48260 DPT=9102 SEQ=1628961010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59951B680000000001030307) 
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:22:16 np0005625204.localdomain systemd-sysv-generator[217275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:22:16 np0005625204.localdomain systemd-rc-local-generator[217272]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:22:16 np0005625204.localdomain systemd-rc-local-generator[217309]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:22:16 np0005625204.localdomain systemd-sysv-generator[217312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:16 np0005625204.localdomain systemd-logind[759]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 20 09:22:17 np0005625204.localdomain systemd-logind[759]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 20 09:22:17 np0005625204.localdomain lvm[217359]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 20 09:22:17 np0005625204.localdomain lvm[217359]: VG ceph_vg1 finished
Feb 20 09:22:17 np0005625204.localdomain lvm[217358]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 09:22:17 np0005625204.localdomain lvm[217358]: VG ceph_vg0 finished
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:22:17 np0005625204.localdomain systemd-sysv-generator[217410]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:22:17 np0005625204.localdomain systemd-rc-local-generator[217407]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:17 np0005625204.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 20 09:22:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34958 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599521EB0000000001030307) 
Feb 20 09:22:18 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 20 09:22:18 np0005625204.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 20 09:22:18 np0005625204.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.237s CPU time.
Feb 20 09:22:18 np0005625204.localdomain systemd[1]: run-rf52af6cdffbd4b16a79eace143fb076e.service: Deactivated successfully.
Feb 20 09:22:18 np0005625204.localdomain sudo[217230]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:18 np0005625204.localdomain sudo[218664]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pimoepobnfjhpgzvhoqugljzprdgvpkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579338.6666703-1150-13711966215963/AnsiballZ_systemd_service.py
Feb 20 09:22:18 np0005625204.localdomain sudo[218664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:19 np0005625204.localdomain python3.9[218666]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:22:19 np0005625204.localdomain systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 20 09:22:19 np0005625204.localdomain multipathd[216416]: exit (signal)
Feb 20 09:22:19 np0005625204.localdomain multipathd[216416]: --------shut down-------
Feb 20 09:22:19 np0005625204.localdomain systemd[1]: multipathd.service: Deactivated successfully.
Feb 20 09:22:19 np0005625204.localdomain systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 20 09:22:19 np0005625204.localdomain systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 20 09:22:19 np0005625204.localdomain multipathd[218672]: --------start up--------
Feb 20 09:22:19 np0005625204.localdomain multipathd[218672]: read /etc/multipath.conf
Feb 20 09:22:19 np0005625204.localdomain multipathd[218672]: path checkers start up
Feb 20 09:22:19 np0005625204.localdomain systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 20 09:22:19 np0005625204.localdomain sudo[218664]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:20 np0005625204.localdomain python3.9[218788]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:22:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34960 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59952DE90000000001030307) 
Feb 20 09:22:21 np0005625204.localdomain sudo[218900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmzngpeaczfnbmqdhlxumijpjltwueul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579341.079396-1201-87706353547568/AnsiballZ_file.py
Feb 20 09:22:21 np0005625204.localdomain sudo[218900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:21 np0005625204.localdomain python3.9[218902]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:21 np0005625204.localdomain sudo[218900]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:23 np0005625204.localdomain sudo[219010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrbislfjkrnpdzfqrocnviveljpejrjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579342.0585647-1234-166866414430307/AnsiballZ_systemd_service.py
Feb 20 09:22:23 np0005625204.localdomain sudo[219010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:23 np0005625204.localdomain python3.9[219012]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:22:23 np0005625204.localdomain systemd-sysv-generator[219043]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:22:23 np0005625204.localdomain systemd-rc-local-generator[219039]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:23 np0005625204.localdomain sudo[219010]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:24 np0005625204.localdomain python3.9[219156]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:22:24 np0005625204.localdomain network[219173]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:22:24 np0005625204.localdomain network[219174]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:22:24 np0005625204.localdomain network[219175]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:22:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34961 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59953DA80000000001030307) 
Feb 20 09:22:26 np0005625204.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 20 09:22:26 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:22:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42734 DF PROTO=TCP SPT=58974 DPT=9105 SEQ=2211298961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599548A80000000001030307) 
Feb 20 09:22:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42735 DF PROTO=TCP SPT=58974 DPT=9105 SEQ=2211298961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599550A80000000001030307) 
Feb 20 09:22:29 np0005625204.localdomain sudo[219407]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxvrlvrsblzqifvcbayvsphcizyhtcnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579349.642089-1291-26263346564642/AnsiballZ_systemd_service.py
Feb 20 09:22:29 np0005625204.localdomain sudo[219407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:30 np0005625204.localdomain python3.9[219409]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:31 np0005625204.localdomain sudo[219407]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:31 np0005625204.localdomain sudo[219518]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjnjgxulidyeckadbhrdxpkwnnvdnjlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579351.374534-1291-3409939313371/AnsiballZ_systemd_service.py
Feb 20 09:22:31 np0005625204.localdomain sudo[219518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:31 np0005625204.localdomain python3.9[219520]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:31 np0005625204.localdomain sudo[219518]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34962 DF PROTO=TCP SPT=39408 DPT=9882 SEQ=1749105025 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59955D690000000001030307) 
Feb 20 09:22:33 np0005625204.localdomain sudo[219629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxrpjplrsvwxondptiaokbqzujupcjhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579352.8722696-1291-141464898989458/AnsiballZ_systemd_service.py
Feb 20 09:22:33 np0005625204.localdomain sudo[219629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:33 np0005625204.localdomain python3.9[219631]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:33 np0005625204.localdomain sudo[219629]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:33 np0005625204.localdomain sudo[219740]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xswqnxbimaraykihjirfpraimovlqxya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579353.6871023-1291-115471917366053/AnsiballZ_systemd_service.py
Feb 20 09:22:33 np0005625204.localdomain sudo[219740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:34 np0005625204.localdomain python3.9[219742]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:34 np0005625204.localdomain sudo[219740]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:35 np0005625204.localdomain sudo[219851]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sejcvqnifptigeknpqmedmhwpimhbjbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579354.7516885-1291-255756657091315/AnsiballZ_systemd_service.py
Feb 20 09:22:35 np0005625204.localdomain sudo[219851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:35 np0005625204.localdomain python3.9[219853]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5836 DF PROTO=TCP SPT=40392 DPT=9101 SEQ=4145350747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59956A950000000001030307) 
Feb 20 09:22:36 np0005625204.localdomain sudo[219851]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:36 np0005625204.localdomain sudo[219962]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqqpwyztnhsurtkktkjltgdufrzsvogc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579356.516169-1291-38288448244236/AnsiballZ_systemd_service.py
Feb 20 09:22:36 np0005625204.localdomain sudo[219962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:37 np0005625204.localdomain python3.9[219964]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:37 np0005625204.localdomain sudo[219962]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:37 np0005625204.localdomain sudo[220073]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tljsnaiqlnexstvddpmytzbjsifbkxhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579357.1814733-1291-148226159838914/AnsiballZ_systemd_service.py
Feb 20 09:22:37 np0005625204.localdomain sudo[220073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:37 np0005625204.localdomain python3.9[220075]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:38 np0005625204.localdomain sudo[220073]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:39 np0005625204.localdomain sudo[220184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aknjalqnctmsvqxatmvjsxazdoxtouwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579358.9494991-1291-48741536940197/AnsiballZ_systemd_service.py
Feb 20 09:22:39 np0005625204.localdomain sudo[220184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5838 DF PROTO=TCP SPT=40392 DPT=9101 SEQ=4145350747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599576A80000000001030307) 
Feb 20 09:22:39 np0005625204.localdomain python3.9[220186]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:22:39 np0005625204.localdomain sudo[220184]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:40 np0005625204.localdomain sudo[220295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqcptbqgibjrpmgfroqaqrarixduxeuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579360.2088225-1467-59227597324697/AnsiballZ_file.py
Feb 20 09:22:40 np0005625204.localdomain sudo[220295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:40 np0005625204.localdomain python3.9[220297]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:40 np0005625204.localdomain sudo[220295]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:22:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:22:41 np0005625204.localdomain podman[220381]: 2026-02-20 09:22:41.170912639 +0000 UTC m=+0.102825551 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:22:41 np0005625204.localdomain sudo[220428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsxacxtmukeyyjcmhycoypetimxwggec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579360.8367324-1467-174071644222774/AnsiballZ_file.py
Feb 20 09:22:41 np0005625204.localdomain sudo[220428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:41 np0005625204.localdomain podman[220381]: 2026-02-20 09:22:41.201547264 +0000 UTC m=+0.133460216 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 20 09:22:41 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:22:41 np0005625204.localdomain podman[220376]: 2026-02-20 09:22:41.258857616 +0000 UTC m=+0.190848641 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:22:41 np0005625204.localdomain podman[220376]: 2026-02-20 09:22:41.302113642 +0000 UTC m=+0.234104667 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2)
Feb 20 09:22:41 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:22:41 np0005625204.localdomain python3.9[220435]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:41 np0005625204.localdomain sudo[220428]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:41 np0005625204.localdomain sudo[220555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oozbioazgqbhkvrrpdvsigfrbkroazbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579361.5450325-1467-187512095410465/AnsiballZ_file.py
Feb 20 09:22:41 np0005625204.localdomain sudo[220555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:42 np0005625204.localdomain python3.9[220557]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:42 np0005625204.localdomain sudo[220555]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42737 DF PROTO=TCP SPT=58974 DPT=9105 SEQ=2211298961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599581680000000001030307) 
Feb 20 09:22:42 np0005625204.localdomain sudo[220665]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vexaawzektzenfnzqnsoerblwobcbyau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579362.148622-1467-248510916777075/AnsiballZ_file.py
Feb 20 09:22:42 np0005625204.localdomain sudo[220665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:42 np0005625204.localdomain python3.9[220667]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:42 np0005625204.localdomain sudo[220665]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:43 np0005625204.localdomain sudo[220775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvltbbifakolzizgurhneqhokbduumyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579362.8272734-1467-13946811274149/AnsiballZ_file.py
Feb 20 09:22:43 np0005625204.localdomain sudo[220775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:43 np0005625204.localdomain python3.9[220777]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:43 np0005625204.localdomain sudo[220775]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:43 np0005625204.localdomain sudo[220885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wffgjtmxppkcenwjdbpidchckhqlbwci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579363.44549-1467-214607935734292/AnsiballZ_file.py
Feb 20 09:22:43 np0005625204.localdomain sudo[220885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:43 np0005625204.localdomain python3.9[220887]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:43 np0005625204.localdomain sudo[220885]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:44 np0005625204.localdomain sudo[220995]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cacmqmtvzosherbjlrfhcjhyiwfgaeti ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579364.5535374-1467-110183186976769/AnsiballZ_file.py
Feb 20 09:22:44 np0005625204.localdomain sudo[220995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:45 np0005625204.localdomain python3.9[220997]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:45 np0005625204.localdomain sudo[220995]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:45 np0005625204.localdomain sudo[221105]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuikrsdylprhamynmqtmlyfgnvqdcmqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579365.166532-1467-45875868150107/AnsiballZ_file.py
Feb 20 09:22:45 np0005625204.localdomain sudo[221105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:45 np0005625204.localdomain python3.9[221107]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:45 np0005625204.localdomain sudo[221105]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39922 DF PROTO=TCP SPT=44428 DPT=9100 SEQ=3565515427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599591680000000001030307) 
Feb 20 09:22:46 np0005625204.localdomain sudo[221133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:22:46 np0005625204.localdomain sudo[221133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:22:46 np0005625204.localdomain sudo[221133]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:46 np0005625204.localdomain sudo[221181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 09:22:46 np0005625204.localdomain sudo[221181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:22:46 np0005625204.localdomain sudo[221251]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiqhasgracjcbnydhmgdaparmaxrnwan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579366.6291833-1639-90926191427248/AnsiballZ_file.py
Feb 20 09:22:46 np0005625204.localdomain sudo[221251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:47 np0005625204.localdomain python3.9[221253]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:47 np0005625204.localdomain sudo[221251]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:47 np0005625204.localdomain sudo[221181]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:47 np0005625204.localdomain sudo[221318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:22:47 np0005625204.localdomain sudo[221318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:22:47 np0005625204.localdomain sudo[221318]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:47 np0005625204.localdomain sudo[221363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:22:47 np0005625204.localdomain sudo[221363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:22:47 np0005625204.localdomain sudo[221417]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-atvabcnakmorlwzthqhbndpcqxeojzoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579367.1976914-1639-276104490406851/AnsiballZ_file.py
Feb 20 09:22:47 np0005625204.localdomain sudo[221417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:47 np0005625204.localdomain python3.9[221419]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55861 DF PROTO=TCP SPT=46612 DPT=9882 SEQ=270082323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995971A0000000001030307) 
Feb 20 09:22:47 np0005625204.localdomain sudo[221417]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:48 np0005625204.localdomain sudo[221363]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:48 np0005625204.localdomain sudo[221559]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqgigdnhdvxtivpwcjobulwxyqgbwzuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579367.8132226-1639-272127862226878/AnsiballZ_file.py
Feb 20 09:22:48 np0005625204.localdomain sudo[221559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:48 np0005625204.localdomain python3.9[221561]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:48 np0005625204.localdomain sudo[221559]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:48 np0005625204.localdomain sudo[221650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:22:48 np0005625204.localdomain sudo[221650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:22:48 np0005625204.localdomain sudo[221650]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:48 np0005625204.localdomain sudo[221686]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgmeblempdpxswevonpxclxjjnlnyobw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579368.4338706-1639-273775368044360/AnsiballZ_file.py
Feb 20 09:22:48 np0005625204.localdomain sudo[221686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:48 np0005625204.localdomain python3.9[221689]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:48 np0005625204.localdomain sudo[221686]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:49 np0005625204.localdomain sudo[221797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecuhgazocuxyhonvioggmtlczhmsmiui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579369.0510287-1639-19506144361662/AnsiballZ_file.py
Feb 20 09:22:49 np0005625204.localdomain sudo[221797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:49 np0005625204.localdomain python3.9[221799]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:49 np0005625204.localdomain sudo[221797]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:49 np0005625204.localdomain sudo[221907]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cluygtloulyikdtiayxpoybsbrpzuzvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579369.6975534-1639-71038700990807/AnsiballZ_file.py
Feb 20 09:22:49 np0005625204.localdomain sudo[221907]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:50 np0005625204.localdomain python3.9[221909]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:50 np0005625204.localdomain sudo[221907]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:50 np0005625204.localdomain sudo[222017]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyoetfdihpotfwqozrzajpftwujbhxxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579370.2258203-1639-138984836417378/AnsiballZ_file.py
Feb 20 09:22:50 np0005625204.localdomain sudo[222017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:50 np0005625204.localdomain python3.9[222019]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:50 np0005625204.localdomain sudo[222017]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55863 DF PROTO=TCP SPT=46612 DPT=9882 SEQ=270082323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995A3280000000001030307) 
Feb 20 09:22:51 np0005625204.localdomain sudo[222127]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdfjevgutggyqlxtfwencebyhhpyeotu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579370.8240063-1639-229072331080423/AnsiballZ_file.py
Feb 20 09:22:51 np0005625204.localdomain sudo[222127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:51 np0005625204.localdomain python3.9[222129]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:22:51 np0005625204.localdomain sudo[222127]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:51 np0005625204.localdomain sudo[222237]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-otjlbqhkzliktstckowborwgkihdulwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579371.633219-1813-106072885713775/AnsiballZ_command.py
Feb 20 09:22:51 np0005625204.localdomain sudo[222237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:52 np0005625204.localdomain python3.9[222239]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:52 np0005625204.localdomain sudo[222237]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:52 np0005625204.localdomain python3.9[222349]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:22:53 np0005625204.localdomain sudo[222457]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpmblyjvulymevrsrqxbpcojbnumwjba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579373.2585692-1866-247991953530780/AnsiballZ_systemd_service.py
Feb 20 09:22:53 np0005625204.localdomain sudo[222457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:53 np0005625204.localdomain python3.9[222459]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:22:53 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:22:53 np0005625204.localdomain systemd-rc-local-generator[222484]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:22:53 np0005625204.localdomain systemd-sysv-generator[222489]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:22:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:53 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:22:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:54 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:22:54 np0005625204.localdomain sudo[222457]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:54 np0005625204.localdomain sudo[222603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvpnveypfrofhauvpexjioubenrorbvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579374.4233716-1891-125967292547983/AnsiballZ_command.py
Feb 20 09:22:54 np0005625204.localdomain sudo[222603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55864 DF PROTO=TCP SPT=46612 DPT=9882 SEQ=270082323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995B2E80000000001030307) 
Feb 20 09:22:54 np0005625204.localdomain python3.9[222605]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:54 np0005625204.localdomain sudo[222603]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:55 np0005625204.localdomain sudo[222714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srjufhqwutgrkrenqehebzpetuzfrcxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579375.0531042-1891-225655447326761/AnsiballZ_command.py
Feb 20 09:22:55 np0005625204.localdomain sudo[222714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:55 np0005625204.localdomain python3.9[222716]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:55 np0005625204.localdomain sudo[222714]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:56 np0005625204.localdomain sudo[222825]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhxjcyirbqtyfcldpfjpcwiolibywpyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579375.6798565-1891-206534560995272/AnsiballZ_command.py
Feb 20 09:22:56 np0005625204.localdomain sudo[222825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:56 np0005625204.localdomain python3.9[222827]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:56 np0005625204.localdomain sudo[222825]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:57 np0005625204.localdomain sudo[222936]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptslqrpeewxdppbrguoqjrjngajsnkee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579377.095187-1891-210790602313628/AnsiballZ_command.py
Feb 20 09:22:57 np0005625204.localdomain sudo[222936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:57 np0005625204.localdomain python3.9[222938]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:57 np0005625204.localdomain sudo[222936]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16936 DF PROTO=TCP SPT=35494 DPT=9105 SEQ=768474994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995BDE80000000001030307) 
Feb 20 09:22:57 np0005625204.localdomain sudo[223047]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laiyvqcmtnsxzrldaijxvadpjtuxramr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579377.7041175-1891-134662112936411/AnsiballZ_command.py
Feb 20 09:22:57 np0005625204.localdomain sudo[223047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:58 np0005625204.localdomain python3.9[223049]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:58 np0005625204.localdomain sudo[223047]: pam_unix(sudo:session): session closed for user root
Feb 20 09:22:58 np0005625204.localdomain sudo[223158]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tityxdjtdlosbcuivcwnurwmngczdxmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579378.6439269-1891-158178375608687/AnsiballZ_command.py
Feb 20 09:22:58 np0005625204.localdomain sudo[223158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:22:59 np0005625204.localdomain python3.9[223160]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:22:59 np0005625204.localdomain sshd[223162]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:22:59 np0005625204.localdomain sshd[223162]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:22:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16937 DF PROTO=TCP SPT=35494 DPT=9105 SEQ=768474994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995C5E90000000001030307) 
Feb 20 09:23:00 np0005625204.localdomain sudo[223158]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:00 np0005625204.localdomain sudo[223271]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fuagzovtmunqmdmuyzfxlehwwpcsorqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579380.2903688-1891-245210184642000/AnsiballZ_command.py
Feb 20 09:23:00 np0005625204.localdomain sudo[223271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:00 np0005625204.localdomain python3.9[223273]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:23:00 np0005625204.localdomain sudo[223271]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:01 np0005625204.localdomain sudo[223382]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjlwrjjxwonhuufghxmqzcdcojcbmafb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579380.9022768-1891-230454993614667/AnsiballZ_command.py
Feb 20 09:23:01 np0005625204.localdomain sudo[223382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:01 np0005625204.localdomain python3.9[223384]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:23:01 np0005625204.localdomain sudo[223382]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39923 DF PROTO=TCP SPT=44428 DPT=9100 SEQ=3565515427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995D1680000000001030307) 
Feb 20 09:23:02 np0005625204.localdomain sudo[223493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwoyxrdguvciupuldqzfiwrmzcsfuiao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579382.5511255-2097-145312126494494/AnsiballZ_file.py
Feb 20 09:23:02 np0005625204.localdomain sudo[223493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:03 np0005625204.localdomain python3.9[223495]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:03 np0005625204.localdomain sudo[223493]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:03 np0005625204.localdomain sudo[223603]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxokbpacfzxcxilerlbdmvmzisjneerq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579383.219583-2097-235743998508595/AnsiballZ_file.py
Feb 20 09:23:03 np0005625204.localdomain sudo[223603]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:03 np0005625204.localdomain python3.9[223605]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:03 np0005625204.localdomain sudo[223603]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:04 np0005625204.localdomain sudo[223713]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrbhmtrcsiijxxqcwdfzgwldxsppsxwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579383.8851688-2143-180316439551417/AnsiballZ_file.py
Feb 20 09:23:04 np0005625204.localdomain sudo[223713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:04 np0005625204.localdomain python3.9[223715]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:04 np0005625204.localdomain sudo[223713]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:04 np0005625204.localdomain sudo[223823]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmpimvhsiilzmnkgpaukbsgswvnhpllj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579384.5020044-2143-109634590881885/AnsiballZ_file.py
Feb 20 09:23:04 np0005625204.localdomain sudo[223823]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:04 np0005625204.localdomain python3.9[223825]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:05 np0005625204.localdomain sudo[223823]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:05 np0005625204.localdomain sudo[223933]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkkqwaslyavsgbijfpsccbgbouonabbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579385.5652165-2143-205314580338257/AnsiballZ_file.py
Feb 20 09:23:05 np0005625204.localdomain sudo[223933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:23:05.979 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:23:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:23:05.980 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:23:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:23:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:23:06 np0005625204.localdomain python3.9[223935]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:06 np0005625204.localdomain sudo[223933]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41542 DF PROTO=TCP SPT=57246 DPT=9101 SEQ=3677019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995DFC50000000001030307) 
Feb 20 09:23:06 np0005625204.localdomain sudo[224043]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbyqblyqnmvnlpnapfsldtomeigvsyna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579386.2288322-2143-15490038909535/AnsiballZ_file.py
Feb 20 09:23:06 np0005625204.localdomain sudo[224043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:06 np0005625204.localdomain python3.9[224045]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:06 np0005625204.localdomain sudo[224043]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:07 np0005625204.localdomain sudo[224153]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aryniwdrmirjwbtnodaxrdiswwybyrum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579386.884275-2143-138191654029669/AnsiballZ_file.py
Feb 20 09:23:07 np0005625204.localdomain sudo[224153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:07 np0005625204.localdomain python3.9[224155]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:07 np0005625204.localdomain sudo[224153]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:08 np0005625204.localdomain sudo[224263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwgitowupyrqjffdiduaomldamsxevpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579387.843067-2143-180336775468337/AnsiballZ_file.py
Feb 20 09:23:08 np0005625204.localdomain sudo[224263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:08 np0005625204.localdomain python3.9[224265]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:08 np0005625204.localdomain sudo[224263]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:09 np0005625204.localdomain sudo[224373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlbukbhogzqjfwjfdsxaeuuihmyzyovv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579388.7403867-2143-175923068632428/AnsiballZ_file.py
Feb 20 09:23:09 np0005625204.localdomain sudo[224373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:09 np0005625204.localdomain python3.9[224375]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:09 np0005625204.localdomain sudo[224373]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41544 DF PROTO=TCP SPT=57246 DPT=9101 SEQ=3677019780 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995EBE80000000001030307) 
Feb 20 09:23:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16939 DF PROTO=TCP SPT=35494 DPT=9105 SEQ=768474994 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5995F5680000000001030307) 
Feb 20 09:23:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:23:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:23:12 np0005625204.localdomain podman[224393]: 2026-02-20 09:23:12.142748172 +0000 UTC m=+0.079674395 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:23:12 np0005625204.localdomain podman[224393]: 2026-02-20 09:23:12.186321767 +0000 UTC m=+0.123247930 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:23:12 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:23:12 np0005625204.localdomain podman[224394]: 2026-02-20 09:23:12.200164178 +0000 UTC m=+0.136694839 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:23:12 np0005625204.localdomain podman[224394]: 2026-02-20 09:23:12.236000248 +0000 UTC m=+0.172530929 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 20 09:23:12 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:23:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29675 DF PROTO=TCP SPT=47610 DPT=9100 SEQ=1067107032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599605680000000001030307) 
Feb 20 09:23:16 np0005625204.localdomain sudo[224526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhplhculriweghrajluwjflyiidjxznj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579396.2399786-2508-259354683979721/AnsiballZ_getent.py
Feb 20 09:23:16 np0005625204.localdomain sudo[224526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:16 np0005625204.localdomain python3.9[224528]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 20 09:23:16 np0005625204.localdomain sudo[224526]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:17 np0005625204.localdomain sudo[224637]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkiijczdpkivjrtqleighoqyrxforvjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579397.1039965-2532-150661839559066/AnsiballZ_group.py
Feb 20 09:23:17 np0005625204.localdomain sudo[224637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:17 np0005625204.localdomain python3.9[224639]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 20 09:23:17 np0005625204.localdomain groupadd[224640]: group added to /etc/group: name=nova, GID=42436
Feb 20 09:23:17 np0005625204.localdomain groupadd[224640]: group added to /etc/gshadow: name=nova
Feb 20 09:23:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31335 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59960C4B0000000001030307) 
Feb 20 09:23:17 np0005625204.localdomain groupadd[224640]: new group: name=nova, GID=42436
Feb 20 09:23:17 np0005625204.localdomain sudo[224637]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:18 np0005625204.localdomain sudo[224753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhyxpitfmbzgrxmzrpebhzqykjnkbygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579397.9788647-2556-13156994244517/AnsiballZ_user.py
Feb 20 09:23:18 np0005625204.localdomain sudo[224753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:18 np0005625204.localdomain python3.9[224755]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 20 09:23:18 np0005625204.localdomain useradd[224757]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/1
Feb 20 09:23:18 np0005625204.localdomain useradd[224757]: add 'nova' to group 'libvirt'
Feb 20 09:23:18 np0005625204.localdomain useradd[224757]: add 'nova' to shadow group 'libvirt'
Feb 20 09:23:18 np0005625204.localdomain sudo[224753]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:20 np0005625204.localdomain sshd[224781]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:23:20 np0005625204.localdomain sshd[224781]: Accepted publickey for zuul from 192.168.122.30 port 58664 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:23:20 np0005625204.localdomain systemd-logind[759]: New session 55 of user zuul.
Feb 20 09:23:20 np0005625204.localdomain systemd[1]: Started Session 55 of User zuul.
Feb 20 09:23:20 np0005625204.localdomain sshd[224781]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:23:20 np0005625204.localdomain sshd[224784]: Received disconnect from 192.168.122.30 port 58664:11: disconnected by user
Feb 20 09:23:20 np0005625204.localdomain sshd[224784]: Disconnected from user zuul 192.168.122.30 port 58664
Feb 20 09:23:20 np0005625204.localdomain sshd[224781]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:23:20 np0005625204.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Feb 20 09:23:20 np0005625204.localdomain systemd-logind[759]: Session 55 logged out. Waiting for processes to exit.
Feb 20 09:23:20 np0005625204.localdomain systemd-logind[759]: Removed session 55.
Feb 20 09:23:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31337 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599618680000000001030307) 
Feb 20 09:23:21 np0005625204.localdomain python3.9[224892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:22 np0005625204.localdomain python3.9[224947]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:22 np0005625204.localdomain python3.9[225055]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:23 np0005625204.localdomain python3.9[225141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579402.3160772-2631-240597552342396/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:23 np0005625204.localdomain python3.9[225249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:24 np0005625204.localdomain python3.9[225335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579403.4708257-2631-187110727356389/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31338 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599628280000000001030307) 
Feb 20 09:23:25 np0005625204.localdomain python3.9[225443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:25 np0005625204.localdomain python3.9[225529]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579404.6495278-2631-222227080497075/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:26 np0005625204.localdomain python3.9[225637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:26 np0005625204.localdomain python3.9[225723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579405.7649143-2793-177327728494929/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=50598ea057afd85a1f5b995974d61e2c257c9737 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:27 np0005625204.localdomain sudo[225831]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdnxjxbvwkjemitehslazgryhiwanond ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579407.2737715-2838-115057958211086/AnsiballZ_file.py
Feb 20 09:23:27 np0005625204.localdomain sudo[225831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64576 DF PROTO=TCP SPT=53066 DPT=9105 SEQ=3853631590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599632E80000000001030307) 
Feb 20 09:23:27 np0005625204.localdomain python3.9[225833]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:27 np0005625204.localdomain sudo[225831]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:28 np0005625204.localdomain sudo[225941]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdsfhwqqimshydkcujxvvzfgkeunapcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579407.9523814-2863-49317253641551/AnsiballZ_copy.py
Feb 20 09:23:28 np0005625204.localdomain sudo[225941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:28 np0005625204.localdomain python3.9[225943]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:28 np0005625204.localdomain sudo[225941]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:28 np0005625204.localdomain sudo[226051]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpeftxbhfdnygrxdfpjkxvvzwlrkggkv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579408.6292992-2886-196800825247387/AnsiballZ_stat.py
Feb 20 09:23:28 np0005625204.localdomain sudo[226051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:29 np0005625204.localdomain python3.9[226053]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:23:29 np0005625204.localdomain sudo[226051]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:29 np0005625204.localdomain auditd[725]: Audit daemon rotating log files
Feb 20 09:23:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64577 DF PROTO=TCP SPT=53066 DPT=9105 SEQ=3853631590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59963AE80000000001030307) 
Feb 20 09:23:29 np0005625204.localdomain sudo[226163]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qotbynrjadpzyiwmjphnsleetwbfgpqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579409.409725-2913-72074084481895/AnsiballZ_file.py
Feb 20 09:23:29 np0005625204.localdomain sudo[226163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:29 np0005625204.localdomain python3.9[226165]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:29 np0005625204.localdomain sudo[226163]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:30 np0005625204.localdomain python3.9[226273]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:23:31 np0005625204.localdomain sudo[226383]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfzqsuddzmdxruarwrcauwymsbkbzjkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579411.6430078-2970-119553545485685/AnsiballZ_file.py
Feb 20 09:23:31 np0005625204.localdomain sudo[226383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:32 np0005625204.localdomain python3.9[226385]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:32 np0005625204.localdomain sudo[226383]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:32 np0005625204.localdomain sudo[226493]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uakipfnwlwihxqrlknlmaippqcfghekh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579412.3187654-2994-42698016900353/AnsiballZ_file.py
Feb 20 09:23:32 np0005625204.localdomain sudo[226493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:32 np0005625204.localdomain python3.9[226495]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:32 np0005625204.localdomain sudo[226493]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31339 DF PROTO=TCP SPT=49632 DPT=9882 SEQ=3671846449 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599649680000000001030307) 
Feb 20 09:23:33 np0005625204.localdomain python3.9[226603]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:36 np0005625204.localdomain sudo[226905]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgsglmxzysqxulmyuutxbvlccicohonu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579415.627369-3095-75674583188091/AnsiballZ_container_config_data.py
Feb 20 09:23:36 np0005625204.localdomain sudo[226905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:36 np0005625204.localdomain python3.9[226907]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 20 09:23:36 np0005625204.localdomain sudo[226905]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34701 DF PROTO=TCP SPT=53984 DPT=9101 SEQ=1340782939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599654F50000000001030307) 
Feb 20 09:23:37 np0005625204.localdomain sudo[227015]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dpzeurlmisvnrhlnnoqooxejpewlmkka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579416.721274-3129-42516576035605/AnsiballZ_container_config_hash.py
Feb 20 09:23:37 np0005625204.localdomain sudo[227015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:37 np0005625204.localdomain python3.9[227017]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:23:37 np0005625204.localdomain sudo[227015]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:38 np0005625204.localdomain sudo[227125]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nripeqsnbldetphqjjoslfcqkpppgwns ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579417.732299-3158-11642826865700/AnsiballZ_edpm_container_manage.py
Feb 20 09:23:38 np0005625204.localdomain sudo[227125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:38 np0005625204.localdomain python3[227127]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:23:38 np0005625204.localdomain podman[227163]: 
Feb 20 09:23:38 np0005625204.localdomain podman[227163]: 2026-02-20 09:23:38.706682021 +0000 UTC m=+0.087103302 container create 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=nova_compute_init, org.label-schema.schema-version=1.0, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 20 09:23:38 np0005625204.localdomain podman[227163]: 2026-02-20 09:23:38.663201272 +0000 UTC m=+0.043622563 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:23:38 np0005625204.localdomain python3[227127]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 20 09:23:38 np0005625204.localdomain sudo[227125]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34703 DF PROTO=TCP SPT=53984 DPT=9101 SEQ=1340782939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599660E80000000001030307) 
Feb 20 09:23:39 np0005625204.localdomain sudo[227309]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oosqnnxzcjxmedgvpqltwromrcitzbxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579419.132341-3183-44016053221103/AnsiballZ_stat.py
Feb 20 09:23:39 np0005625204.localdomain sudo[227309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:39 np0005625204.localdomain python3.9[227311]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:23:39 np0005625204.localdomain sudo[227309]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:40 np0005625204.localdomain python3.9[227421]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:23:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64579 DF PROTO=TCP SPT=53066 DPT=9105 SEQ=3853631590 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59966B680000000001030307) 
Feb 20 09:23:42 np0005625204.localdomain sudo[227529]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jxkeobwkssjperxjkalkjybfidmvnssn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579421.378471-3263-108440294367717/AnsiballZ_stat.py
Feb 20 09:23:42 np0005625204.localdomain sudo[227529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:23:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:23:43 np0005625204.localdomain systemd[1]: tmp-crun.LMNY1C.mount: Deactivated successfully.
Feb 20 09:23:43 np0005625204.localdomain podman[227532]: 2026-02-20 09:23:43.076895644 +0000 UTC m=+0.088759196 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:23:43 np0005625204.localdomain podman[227532]: 2026-02-20 09:23:43.120057741 +0000 UTC m=+0.131921313 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 09:23:43 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:23:43 np0005625204.localdomain podman[227533]: 2026-02-20 09:23:43.139767761 +0000 UTC m=+0.149420692 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:23:43 np0005625204.localdomain podman[227533]: 2026-02-20 09:23:43.14660828 +0000 UTC m=+0.156261241 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 20 09:23:43 np0005625204.localdomain python3.9[227531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:43 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:23:43 np0005625204.localdomain sudo[227529]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:43 np0005625204.localdomain sudo[227663]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjudkxvmrszcenpholqduxdfyfgzosrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579421.378471-3263-108440294367717/AnsiballZ_copy.py
Feb 20 09:23:43 np0005625204.localdomain sudo[227663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:43 np0005625204.localdomain python3.9[227665]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579421.378471-3263-108440294367717/.source.yaml _original_basename=.nm_8zu1z follow=False checksum=201984e070e9869531933fce67c78d3ce61bb83b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:43 np0005625204.localdomain sudo[227663]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:44 np0005625204.localdomain systemd[1]: tmp-crun.Mtj8Xm.mount: Deactivated successfully.
Feb 20 09:23:44 np0005625204.localdomain sshd[227666]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:23:44 np0005625204.localdomain sshd[227666]: Invalid user nutanix from 96.78.175.36 port 42944
Feb 20 09:23:44 np0005625204.localdomain sshd[227666]: Received disconnect from 96.78.175.36 port 42944:11: Bye Bye [preauth]
Feb 20 09:23:44 np0005625204.localdomain sshd[227666]: Disconnected from invalid user nutanix 96.78.175.36 port 42944 [preauth]
Feb 20 09:23:45 np0005625204.localdomain sudo[227775]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qchjoqionclbwynykgavqhgwitmrzhkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579424.9022446-3314-170956843007591/AnsiballZ_file.py
Feb 20 09:23:45 np0005625204.localdomain sudo[227775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:45 np0005625204.localdomain python3.9[227777]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:45 np0005625204.localdomain sudo[227775]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:45 np0005625204.localdomain sudo[227885]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntzwkhzzjfgtznxqqwdjfgbvcbqvcbfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579425.5596826-3339-168969699161035/AnsiballZ_file.py
Feb 20 09:23:45 np0005625204.localdomain sudo[227885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37443 DF PROTO=TCP SPT=49524 DPT=9102 SEQ=300155209 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59967B680000000001030307) 
Feb 20 09:23:46 np0005625204.localdomain python3.9[227887]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:23:46 np0005625204.localdomain sudo[227885]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:46 np0005625204.localdomain sudo[227995]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmusyejdxqrnayuvstdpegzwcxhvlzgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579426.597802-3363-22528452210213/AnsiballZ_stat.py
Feb 20 09:23:46 np0005625204.localdomain sudo[227995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:47 np0005625204.localdomain python3.9[227997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:23:47 np0005625204.localdomain sudo[227995]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:47 np0005625204.localdomain sudo[228085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbkinxkndswuhtbsdazpznolhkwwgpla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579426.597802-3363-22528452210213/AnsiballZ_copy.py
Feb 20 09:23:47 np0005625204.localdomain sudo[228085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:47 np0005625204.localdomain python3.9[228087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/nova_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579426.597802-3363-22528452210213/.source.json _original_basename=.86e5bwca follow=False checksum=0018389a48392615f4a8869cad43008a907328ff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:47 np0005625204.localdomain sudo[228085]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9076 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996817B0000000001030307) 
Feb 20 09:23:48 np0005625204.localdomain python3.9[228195]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:48 np0005625204.localdomain sudo[228251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:23:48 np0005625204.localdomain sudo[228251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:23:48 np0005625204.localdomain sudo[228251]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:49 np0005625204.localdomain sudo[228285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:23:49 np0005625204.localdomain sudo[228285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:23:50 np0005625204.localdomain sudo[228285]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:50 np0005625204.localdomain sudo[228475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:23:50 np0005625204.localdomain sudo[228475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:23:50 np0005625204.localdomain sudo[228475]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9078 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59968D680000000001030307) 
Feb 20 09:23:51 np0005625204.localdomain sudo[228583]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxaltaoojuehixeczawhsnuissiuszkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579430.801433-3483-13885624980515/AnsiballZ_container_config_data.py
Feb 20 09:23:51 np0005625204.localdomain sudo[228583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:51 np0005625204.localdomain python3.9[228585]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 20 09:23:51 np0005625204.localdomain sudo[228583]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:52 np0005625204.localdomain sudo[228693]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvbbmultctgznkrusmzuwlturybtxxcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579431.7102377-3516-100995542408740/AnsiballZ_container_config_hash.py
Feb 20 09:23:52 np0005625204.localdomain sudo[228693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:52 np0005625204.localdomain python3.9[228695]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:23:52 np0005625204.localdomain sudo[228693]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:53 np0005625204.localdomain sudo[228803]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boodwnctjixdxecrthbiqanwwfmqbcco ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579432.6150217-3545-181816651311785/AnsiballZ_edpm_container_manage.py
Feb 20 09:23:53 np0005625204.localdomain sudo[228803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:53 np0005625204.localdomain python3[228805]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:23:53 np0005625204.localdomain python3[228805]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",
                                                                    "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:31:38.534497001Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1214548351,
                                                                    "VirtualSize": 1214548351,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",
                                                                              "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:39.234075496Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.686286019Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.133364958Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:10.283411186Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:19.407054412Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:42.656365894Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:37.451289936Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.151652427Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532191009Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532298572Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:44.609081717Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:23:53 np0005625204.localdomain podman[228856]: 2026-02-20 09:23:53.971784448 +0000 UTC m=+0.092643819 container remove a8148ad754d3202b2e3b8ea06779a2f8261d84f433fcd602f82fdf8df3463380 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '24eefedeb2e4ab8bab62979b617bbba7-6f2a8ada21c5a8beb0844e05e372be87'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step5, vcs-type=git, build-date=2026-01-12T23:32:04Z, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Feb 20 09:23:53 np0005625204.localdomain python3[228805]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Feb 20 09:23:54 np0005625204.localdomain podman[228870]: 
Feb 20 09:23:54 np0005625204.localdomain podman[228870]: 2026-02-20 09:23:54.079750746 +0000 UTC m=+0.088285951 container create 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:23:54 np0005625204.localdomain podman[228870]: 2026-02-20 09:23:54.035757611 +0000 UTC m=+0.044292816 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:23:54 np0005625204.localdomain python3[228805]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro 
--volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 20 09:23:54 np0005625204.localdomain sudo[228803]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9079 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59969D280000000001030307) 
Feb 20 09:23:55 np0005625204.localdomain sudo[229016]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctyqzifzvnbbjxtkcvhhtyylxucdeqzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579434.4640243-3570-273202128128875/AnsiballZ_stat.py
Feb 20 09:23:55 np0005625204.localdomain sudo[229016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:55 np0005625204.localdomain python3.9[229018]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:23:55 np0005625204.localdomain sudo[229016]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:55 np0005625204.localdomain sudo[229128]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyxlerxmoguqxgevopdqbgjmycqdqxmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579435.7589853-3596-201238062668257/AnsiballZ_file.py
Feb 20 09:23:55 np0005625204.localdomain sudo[229128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:56 np0005625204.localdomain python3.9[229130]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:56 np0005625204.localdomain sudo[229128]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:56 np0005625204.localdomain sudo[229183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tochdmriqhurwbxpmpbuhuxuvaiczefk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579435.7589853-3596-201238062668257/AnsiballZ_stat.py
Feb 20 09:23:56 np0005625204.localdomain sudo[229183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:56 np0005625204.localdomain python3.9[229185]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:23:56 np0005625204.localdomain sudo[229183]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:57 np0005625204.localdomain sudo[229292]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-becrwdanzwkruxafomucrozqsjfuadjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579436.6840694-3596-258715125999464/AnsiballZ_copy.py
Feb 20 09:23:57 np0005625204.localdomain sudo[229292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:57 np0005625204.localdomain python3.9[229294]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579436.6840694-3596-258715125999464/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:23:57 np0005625204.localdomain sudo[229292]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:57 np0005625204.localdomain sudo[229347]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghaaujxcanvjdqcgzimcmprmbhfcmuuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579436.6840694-3596-258715125999464/AnsiballZ_systemd.py
Feb 20 09:23:57 np0005625204.localdomain sudo[229347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20917 DF PROTO=TCP SPT=36818 DPT=9105 SEQ=3333752055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996A8290000000001030307) 
Feb 20 09:23:57 np0005625204.localdomain python3.9[229349]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:23:57 np0005625204.localdomain systemd-sysv-generator[229374]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:23:57 np0005625204.localdomain systemd-rc-local-generator[229371]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:57 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:58 np0005625204.localdomain sudo[229347]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:58 np0005625204.localdomain sudo[229438]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aavjxfwoazaqiqbpaymmclqfvytgvlao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579436.6840694-3596-258715125999464/AnsiballZ_systemd.py
Feb 20 09:23:58 np0005625204.localdomain sudo[229438]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:23:58 np0005625204.localdomain python3.9[229440]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:23:58 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:23:58 np0005625204.localdomain systemd-rc-local-generator[229465]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:23:58 np0005625204.localdomain systemd-sysv-generator[229469]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:23:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: Starting nova_compute container...
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:23:59 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 20 09:23:59 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 20 09:23:59 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 20 09:23:59 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 09:23:59 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 09:23:59 np0005625204.localdomain podman[229481]: 2026-02-20 09:23:59.244000651 +0000 UTC m=+0.107736211 container init 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: tmp-crun.3iwntk.mount: Deactivated successfully.
Feb 20 09:23:59 np0005625204.localdomain podman[229481]: 2026-02-20 09:23:59.259574928 +0000 UTC m=+0.123310508 container start 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Feb 20 09:23:59 np0005625204.localdomain podman[229481]: nova_compute
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + sudo -E kolla_set_configs
Feb 20 09:23:59 np0005625204.localdomain systemd[1]: Started nova_compute container.
Feb 20 09:23:59 np0005625204.localdomain sudo[229438]: pam_unix(sudo:session): session closed for user root
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Validating config file
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying service configuration files
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Deleting /etc/ceph
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Creating directory /etc/ceph
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Writing out command to execute
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: ++ cat /run_command
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + CMD=nova-compute
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + ARGS=
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + sudo kolla_copy_cacerts
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + [[ ! -n '' ]]
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + . kolla_extend_start
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + echo 'Running command: '\''nova-compute'\'''
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: Running command: 'nova-compute'
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + umask 0022
Feb 20 09:23:59 np0005625204.localdomain nova_compute[229496]: + exec nova-compute
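The entrypoint trace above follows a fixed pattern: delete any stale target, copy the staged file into place, set its permissions, then read `/run_command` and `exec` it. A minimal stdlib sketch of that copy-and-chmod step is below; the function name, paths, and mode are hypothetical stand-ins, not kolla's actual implementation.

```python
import os
import shutil
import tempfile

def stage_config(src, dest, mode=0o600):
    """Copy one staged config file into place and set its permissions,
    mirroring the 'Deleting / Copying / Setting permission' log steps."""
    if os.path.exists(dest):
        os.remove(dest)        # 'Deleting <dest>' for a pre-existing file
    shutil.copy(src, dest)     # 'Copying <src> to <dest>'
    os.chmod(dest, mode)       # 'Setting permission for <dest>'
    return dest

# Demonstration in a temp dir (stand-in for /var/lib/kolla/config_files/src)
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "ceph.conf")
    dest = os.path.join(tmp, "staged-ceph.conf")
    with open(src, "w") as f:
        f.write("[global]\n")
    stage_config(src, dest, mode=0o644)
```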
Feb 20 09:23:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20918 DF PROTO=TCP SPT=36818 DPT=9105 SEQ=3333752055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996B0280000000001030307) 
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.048 229500 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.048 229500 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.049 229500 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.049 229500 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.164 229500 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.185 229500 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.185 229500 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
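The nova_compute payload lines above all share the oslo.log default layout: `<timestamp> <pid> <LEVEL> <logger> [<request context>] <message>`. A regex sketch for pulling those fields apart, based only on the format visible in these lines:

```python
import re

# '<date> <time>.<ms> <pid> <LEVEL> <logger> [<context>] <message>'
OSLO_LINE = re.compile(
    r"^(?P<ts>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}) "
    r"(?P<pid>\d+) (?P<level>[A-Z]+) (?P<logger>\S+) "
    r"\[(?P<ctx>[^\]]*)\] (?P<msg>.*)$"
)

def parse_oslo(line):
    """Return the fields of an oslo.log-formatted line, or None."""
    m = OSLO_LINE.match(line)
    return m.groupdict() if m else None

sample = ("2026-02-20 09:24:01.049 229500 INFO os_vif [-] "
          "Loaded VIF plugins: linux_bridge, noop, ovs")
rec = parse_oslo(sample)
```

Grouping parsed records by `level` and `logger` is a quick way to separate the routine DEBUG chatter from the WARNING about the too-old compute service level.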
Feb 20 09:24:01 np0005625204.localdomain python3.9[229618]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.638 229500 INFO nova.virt.driver [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.760 229500 INFO nova.compute.provider_config [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.768 229500 WARNING nova.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.768 229500 DEBUG oslo_concurrency.lockutils [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.768 229500 DEBUG oslo_concurrency.lockutils [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_concurrency.lockutils [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.769 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.770 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.771 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console_host                   = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.772 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.773 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.774 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.775 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.776 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.777 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.778 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.779 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.780 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.781 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.782 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.783 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.784 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.785 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.786 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.787 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.788 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.789 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.790 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.791 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.792 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.793 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.794 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.795 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.796 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.797 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.798 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.799 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.800 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.801 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.802 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.803 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.804 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.805 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.806 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.807 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.808 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.809 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.810 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.811 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.812 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.813 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.814 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.815 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.816 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.817 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.818 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.819 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.820 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.821 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.822 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.823 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.824 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.825 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.826 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.827 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.828 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.829 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.830 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.831 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.832 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.833 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.834 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.835 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.836 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.837 229500 WARNING oslo_config.cfg [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: and ``live_migration_inbound_addr`` respectively.
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: ).  Its value may be silently ignored in the future.
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.837 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.838 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.839 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_secret_uuid        = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.840 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.841 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.842 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.843 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.844 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.845 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.846 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.847 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.848 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.849 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.850 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.851 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.852 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.853 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.854 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.855 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.856 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.857 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.858 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.859 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.860 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.861 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.862 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.863 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.864 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.865 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.866 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.867 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.868 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.869 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.870 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.871 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.872 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.873 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.874 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.875 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.876 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.877 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.878 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.879 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.880 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.881 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.882 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.883 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.884 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.885 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.886 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.887 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.888 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.889 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.890 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.891 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.892 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.893 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.894 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.895 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.896 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.897 229500 DEBUG oslo_service.service [None req-be4e8cb0-7be8-47d8-a62b-0c044c1c0aa2 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.898 229500 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.911 229500 INFO nova.virt.node [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.911 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.912 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.912 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.912 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.923 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f52eafa1910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.925 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f52eafa1910> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.926 229500 INFO nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Connection event '1' reason 'None'
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.937 229500 DEBUG nova.virt.libvirt.volume.mount [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.946 229500 INFO nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host capabilities <capabilities>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <host>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <uuid>f44a30b3-674b-4e65-a07d-fb3d71d4ae11</uuid>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <cpu>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <arch>x86_64</arch>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model>EPYC-Rome-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <vendor>AMD</vendor>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <microcode version='16777317'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <signature family='23' model='49' stepping='0'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='x2apic'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='tsc-deadline'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='osxsave'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='hypervisor'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='tsc_adjust'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='spec-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='stibp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='arch-capabilities'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='ssbd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='cmp_legacy'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='topoext'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='virt-ssbd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='lbrv'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='tsc-scale'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='vmcb-clean'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='pause-filter'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='pfthreshold'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='svme-addr-chk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='rdctl-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='skip-l1dfl-vmentry'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='mds-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature name='pschange-mc-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <pages unit='KiB' size='4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <pages unit='KiB' size='2048'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <pages unit='KiB' size='1048576'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </cpu>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <power_management>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <suspend_mem/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <suspend_disk/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <suspend_hybrid/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </power_management>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <iommu support='no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <migration_features>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <live/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <uri_transports>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <uri_transport>tcp</uri_transport>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <uri_transport>rdma</uri_transport>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </uri_transports>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </migration_features>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <topology>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <cells num='1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <cell id='0'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           <memory unit='KiB'>16116612</memory>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           <pages unit='KiB' size='2048'>0</pages>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           <distances>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <sibling id='0' value='10'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           </distances>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           <cpus num='8'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:           </cpus>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         </cell>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </cells>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </topology>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <cache>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </cache>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <secmodel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model>selinux</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <doi>0</doi>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </secmodel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <secmodel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model>dac</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <doi>0</doi>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </secmodel>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   </host>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <guest>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <os_type>hvm</os_type>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <arch name='i686'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <wordsize>32</wordsize>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <domain type='qemu'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <domain type='kvm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </arch>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <features>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <pae/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <nonpae/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <acpi default='on' toggle='yes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <apic default='on' toggle='no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <cpuselection/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <deviceboot/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <externalSnapshot/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </features>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   </guest>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <guest>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <os_type>hvm</os_type>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <arch name='x86_64'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <wordsize>64</wordsize>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <domain type='qemu'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <domain type='kvm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </arch>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <features>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <acpi default='on' toggle='yes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <apic default='on' toggle='no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <cpuselection/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <deviceboot/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <externalSnapshot/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </features>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   </guest>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: </capabilities>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.955 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.976 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]: <domainCapabilities>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <domain>kvm</domain>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <arch>i686</arch>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <vcpu max='1024'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <iothreads supported='yes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <os supported='yes'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <enum name='firmware'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <loader supported='yes'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>rom</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>pflash</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <enum name='readonly'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>yes</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <enum name='secure'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </loader>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   </os>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:   <cpu>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <enum name='maximumMigratable'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <vendor>AMD</vendor>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='succor'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:     <mode name='custom' supported='yes'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v4'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v5'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v4'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:01 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <memoryBacking supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='sourceType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>anonymous</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>memfd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </memoryBacking>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <disk supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='diskDevice'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>disk</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cdrom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>floppy</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>lun</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>fdc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>sata</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </disk>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <graphics supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vnc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egl-headless</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </graphics>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <video supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='modelType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vga</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cirrus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>none</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>bochs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ramfb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </video>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hostdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='mode'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>subsystem</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='startupPolicy'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>mandatory</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>requisite</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>optional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='subsysType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pci</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='capsType'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='pciBackend'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hostdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <rng supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>random</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </rng>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <filesystem supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='driverType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>path</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>handle</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtiofs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </filesystem>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tpm supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-tis</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-crb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emulator</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>external</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendVersion'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>2.0</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </tpm>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <redirdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </redirdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <channel supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </channel>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <crypto supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </crypto>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <interface supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>passt</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </interface>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <panic supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>isa</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>hyperv</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </panic>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <console supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>null</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dev</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pipe</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stdio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>udp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tcp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu-vdagent</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </console>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <gic supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <genid supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backup supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <async-teardown supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <s390-pv supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <ps2 supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tdx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sev supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sgx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hyperv supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='features'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>relaxed</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vapic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>spinlocks</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vpindex</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>runtime</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>synic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stimer</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reset</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vendor_id</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>frequencies</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reenlightenment</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tlbflush</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ipi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>avic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emsr_bitmap</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>xmm_input</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hyperv>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <launchSecurity supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: </domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:01.995 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: <domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <domain>kvm</domain>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <arch>i686</arch>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <vcpu max='240'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <iothreads supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <os supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='firmware'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <loader supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>rom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pflash</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='readonly'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>yes</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='secure'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </loader>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </os>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='maximumMigratable'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <vendor>AMD</vendor>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='succor'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='custom' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <memoryBacking supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='sourceType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>anonymous</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>memfd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </memoryBacking>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <disk supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='diskDevice'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>disk</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cdrom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>floppy</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>lun</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ide</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>fdc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>sata</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </disk>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <graphics supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vnc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egl-headless</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </graphics>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <video supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='modelType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vga</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cirrus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>none</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>bochs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ramfb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </video>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hostdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='mode'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>subsystem</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='startupPolicy'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>mandatory</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>requisite</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>optional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='subsysType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pci</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='capsType'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='pciBackend'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hostdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <rng supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>random</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </rng>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <filesystem supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='driverType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>path</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>handle</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtiofs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </filesystem>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tpm supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-tis</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-crb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emulator</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>external</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendVersion'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>2.0</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </tpm>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <redirdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </redirdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <channel supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </channel>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <crypto supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </crypto>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <interface supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>passt</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </interface>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <panic supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>isa</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>hyperv</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </panic>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <console supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>null</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dev</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pipe</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stdio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>udp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tcp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu-vdagent</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </console>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <gic supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <genid supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backup supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <async-teardown supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <s390-pv supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <ps2 supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tdx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sev supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sgx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hyperv supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='features'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>relaxed</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vapic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>spinlocks</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vpindex</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>runtime</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>synic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stimer</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reset</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vendor_id</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>frequencies</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reenlightenment</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tlbflush</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ipi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>avic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emsr_bitmap</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>xmm_input</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hyperv>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <launchSecurity supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: </domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.027 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.035 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: <domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <domain>kvm</domain>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <arch>x86_64</arch>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <vcpu max='1024'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <iothreads supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <os supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='firmware'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>efi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <loader supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>rom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pflash</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='readonly'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>yes</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='secure'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>yes</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </loader>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </os>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='maximumMigratable'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <vendor>AMD</vendor>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='succor'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='custom' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain sudo[229753]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwthcdidndxhpdadfjutmvkoqohahpns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579441.9562454-3732-49687863207863/AnsiballZ_stat.py
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain sudo[229753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <memoryBacking supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='sourceType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>anonymous</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>memfd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </memoryBacking>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <disk supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='diskDevice'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>disk</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cdrom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>floppy</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>lun</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>fdc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>sata</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </disk>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <graphics supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vnc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egl-headless</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </graphics>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <video supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='modelType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vga</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cirrus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>none</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>bochs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ramfb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </video>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hostdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='mode'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>subsystem</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='startupPolicy'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>mandatory</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>requisite</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>optional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='subsysType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pci</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='capsType'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='pciBackend'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hostdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <rng supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>random</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </rng>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <filesystem supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='driverType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>path</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>handle</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtiofs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </filesystem>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tpm supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-tis</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-crb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emulator</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>external</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendVersion'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>2.0</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </tpm>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <redirdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </redirdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <channel supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </channel>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <crypto supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </crypto>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <interface supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>passt</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </interface>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <panic supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>isa</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>hyperv</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </panic>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <console supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>null</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dev</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pipe</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stdio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>udp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tcp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu-vdagent</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </console>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <gic supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <genid supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backup supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <async-teardown supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <s390-pv supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <ps2 supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tdx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sev supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sgx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hyperv supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='features'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>relaxed</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vapic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>spinlocks</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vpindex</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>runtime</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>synic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stimer</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reset</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vendor_id</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>frequencies</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reenlightenment</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tlbflush</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ipi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>avic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emsr_bitmap</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>xmm_input</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hyperv>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <launchSecurity supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: </domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.094 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: <domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <domain>kvm</domain>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <arch>x86_64</arch>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <vcpu max='240'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <iothreads supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <os supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='firmware'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <loader supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>rom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pflash</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='readonly'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>yes</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='secure'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>no</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </loader>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </os>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='maximumMigratable'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>on</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>off</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <vendor>AMD</vendor>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='succor'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <mode name='custom' supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ddpd-u'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sha512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm3'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sm4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Denverton-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amd-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='auto-ibrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='perfmon-v2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbpb'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='stibp-always-on'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='EPYC-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-128'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-256'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx10-512'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='prefetchiti'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Haswell-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512er'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512pf'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fma4'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tbm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xop'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='amx-tile'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-bf16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-fp16'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bitalg'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrc'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fzrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='la57'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='taa-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ifma'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cmpccxadd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fbsdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='fsrs'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ibrs-all'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='intel-psfd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='lam'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mcdt-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pbrsb-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='psdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rfds-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='serialize'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vaes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='hle'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='rtm'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512bw'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512cd'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512dq'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512f'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='avx512vl'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='invpcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pcid'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='pku'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='mpx'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='core-capability'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='split-lock-detect'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='cldemote'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='erms'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='gfni'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdir64b'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='movdiri'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='xsaves'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='athlon-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='core2duo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='coreduo-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='n270-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='ss'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <blockers model='phenom-v1'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnow'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <feature name='3dnowext'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </blockers>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </mode>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </cpu>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <memoryBacking supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <enum name='sourceType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>anonymous</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <value>memfd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </memoryBacking>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <disk supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='diskDevice'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>disk</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cdrom</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>floppy</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>lun</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ide</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>fdc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>sata</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </disk>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <graphics supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vnc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egl-headless</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </graphics>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <video supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='modelType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vga</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>cirrus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>none</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>bochs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ramfb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </video>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hostdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='mode'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>subsystem</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='startupPolicy'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>mandatory</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>requisite</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>optional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='subsysType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pci</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>scsi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='capsType'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='pciBackend'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hostdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <rng supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtio-non-transitional</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>random</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>egd</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </rng>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <filesystem supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='driverType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>path</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>handle</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>virtiofs</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </filesystem>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tpm supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-tis</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tpm-crb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emulator</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>external</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendVersion'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>2.0</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </tpm>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <redirdev supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='bus'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>usb</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </redirdev>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <channel supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </channel>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <crypto supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendModel'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>builtin</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </crypto>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <interface supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='backendType'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>default</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>passt</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </interface>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <panic supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='model'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>isa</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>hyperv</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </panic>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <console supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='type'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>null</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vc</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pty</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dev</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>file</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>pipe</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stdio</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>udp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tcp</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>unix</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>qemu-vdagent</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>dbus</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </console>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </devices>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   <features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <gic supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <genid supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <backup supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <async-teardown supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <s390-pv supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <ps2 supported='yes'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <tdx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sev supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <sgx supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <hyperv supported='yes'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <enum name='features'>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>relaxed</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vapic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>spinlocks</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vpindex</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>runtime</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>synic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>stimer</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reset</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>vendor_id</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>frequencies</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>reenlightenment</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>tlbflush</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>ipi</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>avic</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>emsr_bitmap</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <value>xmm_input</value>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </enum>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       <defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:       </defaults>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     </hyperv>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:     <launchSecurity supported='no'/>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:   </features>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: </domainCapabilities>
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.198 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.199 229500 INFO nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Secure Boot support detected
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.203 229500 INFO nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.204 229500 INFO nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.217 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.267 229500 INFO nova.virt.node [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.290 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Verified node 41976f9f-3656-482f-8ad0-c81e454a3952 matches my host np0005625204.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.331 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.335 229500 DEBUG nova.virt.libvirt.vif [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005625204.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T08:23:36Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.336 229500 DEBUG nova.network.os_vif_util [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.337 229500 DEBUG nova.network.os_vif_util [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.338 229500 DEBUG os_vif [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 20 09:24:02 np0005625204.localdomain python3.9[229755]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.380 229500 DEBUG ovsdbapp.backend.ovs_idl [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.381 229500 DEBUG ovsdbapp.backend.ovs_idl [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.381 229500 DEBUG ovsdbapp.backend.ovs_idl [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.381 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.382 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.382 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.383 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.402 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.402 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.403 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.404 229500 INFO oslo.privsep.daemon [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpav37yiqy/privsep.sock']
Feb 20 09:24:02 np0005625204.localdomain sudo[229753]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:02 np0005625204.localdomain sudo[229847]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkhcermiynmxuqgysiwbhwywusgfgjvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579441.9562454-3732-49687863207863/AnsiballZ_copy.py
Feb 20 09:24:02 np0005625204.localdomain sudo[229847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:02 np0005625204.localdomain python3.9[229849]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579441.9562454-3732-49687863207863/.source.yaml _original_basename=.t701q_8c follow=False checksum=a8e9a640ed2d11815875c8a03dd8e15172eb268a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:24:02 np0005625204.localdomain sudo[229847]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.988 229500 INFO oslo.privsep.daemon [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.897 229850 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.900 229850 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.905 229850 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 20 09:24:02 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:02.905 229850 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229850
Feb 20 09:24:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9080 DF PROTO=TCP SPT=56050 DPT=9882 SEQ=3012736530 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996BD690000000001030307) 
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.218 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.270 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.270 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.271 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.272 229500 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.272 229500 INFO os_vif [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.273 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.277 229500 DEBUG nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.277 229500 INFO nova.compute.manager [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.686 229500 INFO nova.service [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating service version for nova-compute on np0005625204.localdomain from 57 to 66
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.755 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.756 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.756 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.756 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:24:03 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:03.757 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:24:04 np0005625204.localdomain python3.9[229963]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.226 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.293 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.294 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:24:04 np0005625204.localdomain systemd[1]: Started libvirt nodedev daemon.
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.660 229500 WARNING nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.662 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12959MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.663 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.663 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.840 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.840 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.841 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.860 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.881 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.881 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.905 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.961 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_AESNI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SVM,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_BMI,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE41,COMPUTE_DEVICE_TAGGING,HW_CPU_X86_FMA3,HW_CPU_X86_AVX,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_CLMUL,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_LAN9118,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_AVX2,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE2,HW_CPU_X86_ABM,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSSE3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_TRUSTED_CERTS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:24:04 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:04.997 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:24:05 np0005625204.localdomain sshd[230135]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:24:05 np0005625204.localdomain python3.9[230134]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.450 229500 DEBUG oslo_concurrency.processutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.455 229500 DEBUG nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.455 229500 INFO nova.virt.libvirt.host [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] kernel doesn't support AMD SEV
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.456 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.456 229500 DEBUG nova.virt.libvirt.driver [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.520 229500 DEBUG nova.scheduler.client.report [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updated inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.521 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.521 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.625 229500 DEBUG nova.compute.provider_tree [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Updating resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 generation from 4 to 5 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.656 229500 DEBUG nova.compute.resource_tracker [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.656 229500 DEBUG oslo_concurrency.lockutils [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.993s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.656 229500 DEBUG nova.service [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.697 229500 DEBUG nova.service [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 20 09:24:05 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:05.698 229500 DEBUG nova.servicegroup.drivers.db [None req-cc95d373-e21c-4569-8005-85f828afc235 - - - - - -] DB_Driver: join new ServiceGroup member np0005625204.localdomain to the compute group, service = <Service: host=np0005625204.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 20 09:24:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:24:05.980 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:24:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:24:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:24:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:24:05.982 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:24:06 np0005625204.localdomain sshd[230135]: Invalid user sol from 45.148.10.240 port 42338
Feb 20 09:24:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4659 DF PROTO=TCP SPT=40096 DPT=9101 SEQ=2473666474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996CA280000000001030307) 
Feb 20 09:24:06 np0005625204.localdomain sshd[230135]: Connection closed by invalid user sol 45.148.10.240 port 42338 [preauth]
Feb 20 09:24:06 np0005625204.localdomain python3.9[230246]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:07 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:07.387 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:07 np0005625204.localdomain sudo[230354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srtfgzkckvcnbgdifzogrttwpshnjylf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579447.111901-3881-138716297092998/AnsiballZ_podman_container.py
Feb 20 09:24:07 np0005625204.localdomain sudo[230354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:07 np0005625204.localdomain python3.9[230356]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 20 09:24:07 np0005625204.localdomain sudo[230354]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:07 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 115.0 (383 of 333 items), suggesting rotation.
Feb 20 09:24:07 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:24:07 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:24:08 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:24:08 np0005625204.localdomain nova_compute[229496]: 2026-02-20 09:24:08.254 229500 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:08 np0005625204.localdomain sudo[230488]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjdvmvksbjknifgdpeqvuascwjsbguxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579448.3622115-3908-225076129095930/AnsiballZ_systemd.py
Feb 20 09:24:08 np0005625204.localdomain sudo[230488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:08 np0005625204.localdomain python3.9[230490]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:24:08 np0005625204.localdomain systemd[1]: Stopping nova_compute container...
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: tmp-crun.AK0QJc.mount: Deactivated successfully.
Feb 20 09:24:09 np0005625204.localdomain virtqemud[206495]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 20 09:24:09 np0005625204.localdomain virtqemud[206495]: hostname: np0005625204.localdomain
Feb 20 09:24:09 np0005625204.localdomain virtqemud[206495]: End of file while reading data: Input/output error
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Deactivated successfully.
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Consumed 4.460s CPU time.
Feb 20 09:24:09 np0005625204.localdomain podman[230494]: 2026-02-20 09:24:09.092365802 +0000 UTC m=+0.092978371 container died 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true, config_id=nova_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=nova_compute)
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: tmp-crun.syYPC6.mount: Deactivated successfully.
Feb 20 09:24:09 np0005625204.localdomain podman[230494]: 2026-02-20 09:24:09.218929613 +0000 UTC m=+0.219542162 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Feb 20 09:24:09 np0005625204.localdomain podman[230494]: nova_compute
Feb 20 09:24:09 np0005625204.localdomain podman[230534]: error opening file `/run/crun/299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782/status`: No such file or directory
Feb 20 09:24:09 np0005625204.localdomain podman[230523]: 2026-02-20 09:24:09.321770276 +0000 UTC m=+0.064224941 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:24:09 np0005625204.localdomain podman[230523]: nova_compute
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: Stopped nova_compute container.
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: Starting nova_compute container...
Feb 20 09:24:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4661 DF PROTO=TCP SPT=40096 DPT=9101 SEQ=2473666474 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996D6280000000001030307) 
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:24:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:09 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:09 np0005625204.localdomain podman[230538]: 2026-02-20 09:24:09.453373078 +0000 UTC m=+0.102332648 container init 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=nova_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3)
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + sudo -E kolla_set_configs
Feb 20 09:24:09 np0005625204.localdomain podman[230538]: 2026-02-20 09:24:09.466135126 +0000 UTC m=+0.115094696 container start 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3)
Feb 20 09:24:09 np0005625204.localdomain podman[230538]: nova_compute
Feb 20 09:24:09 np0005625204.localdomain systemd[1]: Started nova_compute container.
Feb 20 09:24:09 np0005625204.localdomain sudo[230488]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Validating config file
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying service configuration files
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /etc/ceph
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Creating directory /etc/ceph
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Writing out command to execute
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: ++ cat /run_command
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + CMD=nova-compute
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + ARGS=
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + sudo kolla_copy_cacerts
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + [[ ! -n '' ]]
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + . kolla_extend_start
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: Running command: 'nova-compute'
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + echo 'Running command: '\''nova-compute'\'''
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + umask 0022
Feb 20 09:24:09 np0005625204.localdomain nova_compute[230552]: + exec nova-compute
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.155 230556 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.156 230556 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.156 230556 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.156 230556 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.270 230556 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.291 230556 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.291 230556 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 20 09:24:11 np0005625204.localdomain sshd[230586]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.681 230556 INFO nova.virt.driver [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 20 09:24:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20920 DF PROTO=TCP SPT=36818 DPT=9105 SEQ=3333752055 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996DF680000000001030307) 
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.798 230556 INFO nova.compute.provider_config [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.837 230556 WARNING nova.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.838 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.838 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.838 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.839 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.839 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.840 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.841 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.841 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.841 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.842 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.842 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.842 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.843 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.844 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.844 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.844 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console_host                   = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.845 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.846 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.846 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.846 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.847 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.847 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.847 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain sudo[230677]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofubvkapstiwibqkzhkdmizzpeowljni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579451.601313-3933-35758223924426/AnsiballZ_podman_container.py
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.848 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.849 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.849 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.849 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.850 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.851 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.851 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.851 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.852 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.852 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.852 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.853 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.853 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain sudo[230677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.853 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.854 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.854 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.854 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.855 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.856 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.856 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.856 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.857 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.858 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.859 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.859 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.859 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.860 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.861 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.861 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.861 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.862 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.863 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.863 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.863 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.864 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.865 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.865 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.865 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.866 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.867 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.867 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.867 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.868 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.869 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.869 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.869 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.870 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.871 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.871 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.871 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.872 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.873 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.874 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.874 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.874 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.875 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.876 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.876 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.876 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.877 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.878 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.879 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.880 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.881 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.882 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.883 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.884 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.885 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.886 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.887 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.888 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.889 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.890 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.891 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.892 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.893 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.894 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.895 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.896 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.897 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.898 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.899 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.900 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.901 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.902 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.903 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.904 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.905 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.906 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.907 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.908 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.909 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.910 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.911 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.912 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.913 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.914 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.915 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.916 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.917 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.918 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.919 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.920 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.921 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.922 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.923 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.924 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.925 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.926 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.927 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.928 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.929 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.930 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.930 230556 WARNING oslo_config.cfg [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: and ``live_migration_inbound_addr`` respectively.
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: ).  Its value may be silently ignored in the future.
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.931 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.932 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.933 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_secret_uuid        = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.934 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.935 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.936 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.937 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.938 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.939 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.940 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.941 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.942 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.943 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.944 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.945 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.946 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.947 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.948 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.949 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.950 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.951 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.952 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.953 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.954 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.955 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.956 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.957 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.958 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.959 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.960 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.961 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.962 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.963 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.964 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.965 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.966 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.967 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.968 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.969 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.970 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.971 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.972 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.973 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.974 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.975 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.976 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.977 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.978 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.979 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.980 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.981 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.982 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.983 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.984 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.985 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.986 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.987 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.988 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.989 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.990 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.991 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.992 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.993 230556 DEBUG oslo_service.service [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:24:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:11.995 230556 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.010 230556 INFO nova.virt.node [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.011 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.021 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f206a662610> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.024 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f206a662610> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.024 230556 INFO nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Connection event '1' reason 'None'
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.027 230556 INFO nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host capabilities <capabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <host>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <uuid>f44a30b3-674b-4e65-a07d-fb3d71d4ae11</uuid>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <arch>x86_64</arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model>EPYC-Rome-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <vendor>AMD</vendor>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <microcode version='16777317'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <signature family='23' model='49' stepping='0'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='x2apic'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='tsc-deadline'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='osxsave'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='hypervisor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='tsc_adjust'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='spec-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='stibp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='arch-capabilities'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='cmp_legacy'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='topoext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='virt-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='lbrv'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='tsc-scale'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='vmcb-clean'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='pause-filter'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='pfthreshold'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='svme-addr-chk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='rdctl-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='skip-l1dfl-vmentry'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='mds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature name='pschange-mc-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <pages unit='KiB' size='4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <pages unit='KiB' size='2048'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <pages unit='KiB' size='1048576'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <power_management>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <suspend_mem/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <suspend_disk/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <suspend_hybrid/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </power_management>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <iommu support='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <migration_features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <live/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <uri_transports>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <uri_transport>tcp</uri_transport>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <uri_transport>rdma</uri_transport>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </uri_transports>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </migration_features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <topology>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <cells num='1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <cell id='0'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           <memory unit='KiB'>16116612</memory>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           <pages unit='KiB' size='2048'>0</pages>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           <distances>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <sibling id='0' value='10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           </distances>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           <cpus num='8'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:           </cpus>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         </cell>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </cells>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </topology>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <cache>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </cache>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <secmodel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model>selinux</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <doi>0</doi>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </secmodel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <secmodel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model>dac</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <doi>0</doi>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </secmodel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </host>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <guest>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <os_type>hvm</os_type>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <arch name='i686'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <wordsize>32</wordsize>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <domain type='qemu'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <domain type='kvm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <pae/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <nonpae/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <acpi default='on' toggle='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <apic default='on' toggle='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <cpuselection/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <deviceboot/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <externalSnapshot/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </guest>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <guest>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <os_type>hvm</os_type>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <arch name='x86_64'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <wordsize>64</wordsize>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <domain type='qemu'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <domain type='kvm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <acpi default='on' toggle='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <apic default='on' toggle='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <cpuselection/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <deviceboot/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <externalSnapshot/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </guest>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: </capabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.035 230556 DEBUG nova.virt.libvirt.volume.mount [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.036 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.042 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: <domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <domain>kvm</domain>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <arch>i686</arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <vcpu max='1024'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <iothreads supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <os supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='firmware'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <loader supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>rom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pflash</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='readonly'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>yes</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='secure'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </loader>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </os>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='maximumMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <vendor>AMD</vendor>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='succor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='custom' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <memoryBacking supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='sourceType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>anonymous</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>memfd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </memoryBacking>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <disk supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='diskDevice'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>disk</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cdrom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>floppy</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>lun</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>fdc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>sata</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </disk>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <graphics supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vnc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egl-headless</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </graphics>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <video supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='modelType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vga</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cirrus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>none</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>bochs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ramfb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </video>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hostdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='mode'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>subsystem</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='startupPolicy'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>mandatory</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>requisite</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>optional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='subsysType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pci</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='capsType'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='pciBackend'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hostdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <rng supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>random</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </rng>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <filesystem supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='driverType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>path</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>handle</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtiofs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </filesystem>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tpm supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-tis</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-crb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emulator</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>external</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendVersion'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>2.0</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </tpm>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <redirdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </redirdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <channel supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </channel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <crypto supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </crypto>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <interface supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>passt</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </interface>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <panic supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>isa</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>hyperv</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </panic>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <console supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>null</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dev</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pipe</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stdio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>udp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tcp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu-vdagent</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </console>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <gic supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <genid supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backup supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <async-teardown supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <s390-pv supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <ps2 supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tdx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sev supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sgx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hyperv supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='features'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>relaxed</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vapic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>spinlocks</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vpindex</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>runtime</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>synic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stimer</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reset</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vendor_id</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>frequencies</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reenlightenment</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tlbflush</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ipi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>avic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emsr_bitmap</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>xmm_input</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hyperv>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <launchSecurity supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: </domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.049 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: <domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <domain>kvm</domain>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <arch>i686</arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <vcpu max='240'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <iothreads supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <os supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='firmware'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <loader supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>rom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pflash</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='readonly'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>yes</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='secure'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </loader>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </os>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='maximumMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <vendor>AMD</vendor>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='succor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='custom' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain python3.9[230679]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <memoryBacking supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='sourceType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>anonymous</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>memfd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </memoryBacking>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <disk supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='diskDevice'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>disk</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cdrom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>floppy</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>lun</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ide</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>fdc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>sata</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </disk>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <graphics supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vnc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egl-headless</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </graphics>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <video supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='modelType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vga</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cirrus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>none</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>bochs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ramfb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </video>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hostdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='mode'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>subsystem</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='startupPolicy'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>mandatory</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>requisite</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>optional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='subsysType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pci</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='capsType'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='pciBackend'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hostdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <rng supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>random</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </rng>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <filesystem supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='driverType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>path</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>handle</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtiofs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </filesystem>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tpm supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-tis</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-crb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emulator</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>external</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendVersion'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>2.0</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </tpm>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <redirdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </redirdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <channel supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </channel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <crypto supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </crypto>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <interface supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>passt</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </interface>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <panic supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>isa</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>hyperv</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </panic>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <console supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>null</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dev</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pipe</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stdio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>udp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tcp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu-vdagent</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </console>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <gic supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <genid supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backup supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <async-teardown supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <s390-pv supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <ps2 supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tdx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sev supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sgx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hyperv supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='features'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>relaxed</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vapic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>spinlocks</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vpindex</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>runtime</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>synic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stimer</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reset</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vendor_id</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>frequencies</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reenlightenment</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tlbflush</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ipi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>avic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emsr_bitmap</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>xmm_input</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hyperv>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <launchSecurity supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: </domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.098 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.106 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: <domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <domain>kvm</domain>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <arch>x86_64</arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <vcpu max='1024'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <iothreads supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <os supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='firmware'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>efi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <loader supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>rom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pflash</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='readonly'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>yes</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='secure'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>yes</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </loader>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </os>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='maximumMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <vendor>AMD</vendor>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='succor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='custom' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:12 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain systemd[1]: Started libpod-conmon-8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4.scope.
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <memoryBacking supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='sourceType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>anonymous</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>memfd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </memoryBacking>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <disk supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='diskDevice'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>disk</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cdrom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>floppy</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>lun</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>fdc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>sata</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </disk>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <graphics supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vnc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egl-headless</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </graphics>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <video supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='modelType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vga</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cirrus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>none</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>bochs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ramfb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </video>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hostdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='mode'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>subsystem</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='startupPolicy'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>mandatory</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>requisite</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>optional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='subsysType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pci</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='capsType'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='pciBackend'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hostdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <rng supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>random</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </rng>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <filesystem supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='driverType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>path</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>handle</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtiofs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </filesystem>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tpm supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-tis</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-crb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emulator</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>external</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendVersion'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>2.0</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </tpm>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <redirdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </redirdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <channel supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </channel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <crypto supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </crypto>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <interface supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>passt</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </interface>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <panic supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>isa</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>hyperv</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </panic>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <console supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>null</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dev</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pipe</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stdio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>udp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tcp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu-vdagent</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </console>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <gic supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <genid supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backup supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <async-teardown supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <s390-pv supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <ps2 supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tdx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sev supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sgx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hyperv supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='features'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>relaxed</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vapic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>spinlocks</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vpindex</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>runtime</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>synic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stimer</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reset</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vendor_id</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>frequencies</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reenlightenment</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tlbflush</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ipi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>avic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emsr_bitmap</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>xmm_input</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hyperv>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <launchSecurity supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: </domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.213 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: <domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <domain>kvm</domain>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <arch>x86_64</arch>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <vcpu max='240'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <iothreads supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <os supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='firmware'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <loader supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>rom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pflash</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='readonly'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>yes</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='secure'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>no</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </loader>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </os>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='maximum' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='maximumMigratable'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>on</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>off</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='host-model' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <vendor>AMD</vendor>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='x2apic'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='stibp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='succor'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lbrv'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <mode name='custom' supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Broadwell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ddpd-u'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sha512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm3'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sm4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain podman[230725]: 2026-02-20 09:24:12.37101753 +0000 UTC m=+0.133744652 container init 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Cooperlake-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain podman[230725]: 2026-02-20 09:24:12.381165524 +0000 UTC m=+0.143892646 container start 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=nova_compute_init, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Denverton-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Dhyana-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain python3.9[230679]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amd-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='auto-ibrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibpb-brtype'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='no-nested-data-bp'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='null-sel-clr-base'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='perfmon-v2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbpb'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='stibp-always-on'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='EPYC-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Applying nova statedir ownership
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3 already 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3 to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/console.log
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ccf3906461ed5c78e2a6f963756ac32b4b049bce
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ccf3906461ed5c78e2a6f963756ac32b4b049bce
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/d301d14069645d8c23fee2987984776b3e88a570e1aa96d6cf3e31fa880385fd
Feb 20 09:24:12 np0005625204.localdomain nova_compute_init[230746]: INFO:nova_statedir:Nova statedir ownership complete
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-128'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-256'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx10-512'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='prefetchiti'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Haswell-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='IvyBridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='KnightsMill-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4fmaps'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-4vnniw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512er'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512pf'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fma4'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tbm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xop'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain systemd[1]: libpod-8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4.scope: Deactivated successfully.
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='amx-tile'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-bf16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-fp16'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bitalg'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vbmi2'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrc'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fzrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='la57'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='taa-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='tsx-ldtrk'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='SierraForest-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ifma'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-ne-convert'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx-vnni-int8'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bhi-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='bus-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cmpccxadd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fbsdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='fsrs'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ibrs-all'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='intel-psfd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ipred-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='lam'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mcdt-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pbrsb-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='psdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rfds-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rrsba-ctrl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='serialize'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vaes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='vpclmulqdq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='hle'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='rtm'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512bw'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512cd'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512dq'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512f'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='avx512vl'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='invpcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pcid'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='pku'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='mpx'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v2'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v3'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='core-capability'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='split-lock-detect'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='Snowridge-v4'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='cldemote'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='erms'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='gfni'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdir64b'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='movdiri'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='xsaves'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='athlon-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='core2duo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='coreduo-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='n270-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='ss'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <blockers model='phenom-v1'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnow'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <feature name='3dnowext'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </blockers>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </mode>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </cpu>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <memoryBacking supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <enum name='sourceType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>anonymous</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <value>memfd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </memoryBacking>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <disk supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='diskDevice'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>disk</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cdrom</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>floppy</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>lun</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ide</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>fdc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>sata</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </disk>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <graphics supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vnc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egl-headless</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </graphics>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <video supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='modelType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vga</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>cirrus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>none</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>bochs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ramfb</value>
Feb 20 09:24:12 np0005625204.localdomain podman[230747]: 2026-02-20 09:24:12.463352858 +0000 UTC m=+0.063452377 container died 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </video>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hostdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='mode'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>subsystem</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='startupPolicy'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>mandatory</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>requisite</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>optional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='subsysType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pci</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>scsi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='capsType'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='pciBackend'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hostdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <rng supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtio-non-transitional</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>random</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>egd</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </rng>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <filesystem supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='driverType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>path</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>handle</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>virtiofs</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </filesystem>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tpm supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-tis</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tpm-crb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emulator</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>external</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendVersion'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>2.0</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </tpm>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <redirdev supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='bus'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>usb</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </redirdev>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <channel supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </channel>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <crypto supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendModel'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>builtin</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </crypto>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <interface supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='backendType'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>default</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>passt</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </interface>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <panic supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='model'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>isa</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>hyperv</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </panic>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <console supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='type'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>null</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vc</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pty</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dev</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>file</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>pipe</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stdio</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>udp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tcp</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>unix</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>qemu-vdagent</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>dbus</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </console>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </devices>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   <features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <gic supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <vmcoreinfo supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <genid supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backingStoreInput supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <backup supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <async-teardown supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <s390-pv supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <ps2 supported='yes'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <tdx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sev supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <sgx supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <hyperv supported='yes'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <enum name='features'>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>relaxed</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vapic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>spinlocks</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vpindex</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>runtime</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>synic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>stimer</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reset</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>vendor_id</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>frequencies</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>reenlightenment</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>tlbflush</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>ipi</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>avic</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>emsr_bitmap</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <value>xmm_input</value>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </enum>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       <defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <spinlocks>4095</spinlocks>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <stimer_direct>on</stimer_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:       </defaults>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     </hyperv>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:     <launchSecurity supported='no'/>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:   </features>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: </domainCapabilities>
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.356 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.357 230556 INFO nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Secure Boot support detected
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.361 230556 INFO nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.361 230556 INFO nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.395 230556 DEBUG nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.429 230556 INFO nova.virt.node [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.448 230556 DEBUG nova.compute.manager [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Verified node 41976f9f-3656-482f-8ad0-c81e454a3952 matches my host np0005625204.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.472 230556 DEBUG nova.compute.manager [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.478 230556 DEBUG nova.virt.libvirt.vif [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005625204.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T08:23:36Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.478 230556 DEBUG nova.network.os_vif_util [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"dhcp_server": "192.168.0.1"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system"}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.479 230556 DEBUG nova.network.os_vif_util [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.480 230556 DEBUG os_vif [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.518 230556 DEBUG ovsdbapp.backend.ovs_idl [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.518 230556 DEBUG ovsdbapp.backend.ovs_idl [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.519 230556 DEBUG ovsdbapp.backend.ovs_idl [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.519 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.520 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.520 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.520 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.521 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.524 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:12 np0005625204.localdomain podman[230757]: 2026-02-20 09:24:12.527746204 +0000 UTC m=+0.078883909 container cleanup 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20260127, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:24:12 np0005625204.localdomain systemd[1]: libpod-conmon-8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4.scope: Deactivated successfully.
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.540 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.540 230556 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.540 230556 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:24:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:12.542 230556 INFO oslo.privsep.daemon [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp7bebqrnp/privsep.sock']
Feb 20 09:24:12 np0005625204.localdomain sudo[230677]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:12 np0005625204.localdomain rsyslogd[758]: imjournal from <localhost:nova_compute>: begin to drop messages due to rate-limiting
Feb 20 09:24:13 np0005625204.localdomain sshd[211700]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: session-54.scope: Consumed 1min 40.135s CPU time.
Feb 20 09:24:13 np0005625204.localdomain systemd-logind[759]: Session 54 logged out. Waiting for processes to exit.
Feb 20 09:24:13 np0005625204.localdomain systemd-logind[759]: Removed session 54.
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.134 230556 INFO oslo.privsep.daemon [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.019 230805 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.021 230805 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.023 230805 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.023 230805 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230805
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf-merged.mount: Deactivated successfully.
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4-userdata-shm.mount: Deactivated successfully.
Feb 20 09:24:13 np0005625204.localdomain podman[230810]: 2026-02-20 09:24:13.383802428 +0000 UTC m=+0.069796989 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 20 09:24:13 np0005625204.localdomain podman[230810]: 2026-02-20 09:24:13.387837167 +0000 UTC m=+0.073831708 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:24:13 np0005625204.localdomain podman[230809]: 2026-02-20 09:24:13.415425988 +0000 UTC m=+0.100720807 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.443 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.443 230556 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.443 230556 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.445 230556 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.445 230556 INFO os_vif [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.446 230556 DEBUG nova.compute.manager [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:24:13 np0005625204.localdomain podman[230809]: 2026-02-20 09:24:13.448713481 +0000 UTC m=+0.134008310 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.450 230556 DEBUG nova.compute.manager [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.450 230556 INFO nova.compute.manager [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 20 09:24:13 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.536 230556 DEBUG oslo_concurrency.lockutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.536 230556 DEBUG oslo_concurrency.lockutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.537 230556 DEBUG oslo_concurrency.lockutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.537 230556 DEBUG nova.compute.resource_tracker [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:24:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:13.537 230556 DEBUG oslo_concurrency.processutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.001 230556 DEBUG oslo_concurrency.processutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.061 230556 DEBUG nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.061 230556 DEBUG nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.288 230556 WARNING nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.290 230556 DEBUG nova.compute.resource_tracker [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12938MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.291 230556 DEBUG oslo_concurrency.lockutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.291 230556 DEBUG oslo_concurrency.lockutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:24:14 np0005625204.localdomain systemd[1]: tmp-crun.PoSu0v.mount: Deactivated successfully.
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.407 230556 DEBUG nova.compute.resource_tracker [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.408 230556 DEBUG nova.compute.resource_tracker [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.408 230556 DEBUG nova.compute.resource_tracker [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.474 230556 DEBUG nova.scheduler.client.report [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.556 230556 DEBUG nova.scheduler.client.report [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.557 230556 DEBUG nova.compute.provider_tree [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.583 230556 DEBUG nova.scheduler.client.report [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.610 230556 DEBUG nova.scheduler.client.report [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:24:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:14.663 230556 DEBUG oslo_concurrency.processutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.202 230556 DEBUG oslo_concurrency.processutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.539s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.208 230556 DEBUG nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.208 230556 INFO nova.virt.libvirt.host [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] kernel doesn't support AMD SEV
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.210 230556 DEBUG nova.compute.provider_tree [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.211 230556 DEBUG nova.virt.libvirt.driver [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.232 230556 DEBUG nova.scheduler.client.report [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.261 230556 DEBUG nova.compute.resource_tracker [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.261 230556 DEBUG oslo_concurrency.lockutils [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.970s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.262 230556 DEBUG nova.service [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.299 230556 DEBUG nova.service [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 20 09:24:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:15.300 230556 DEBUG nova.servicegroup.drivers.db [None req-2657e994-4a91-463c-97b4-2134116d0bc4 - - - - - -] DB_Driver: join new ServiceGroup member np0005625204.localdomain to the compute group, service = <Service: host=np0005625204.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 20 09:24:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25213 DF PROTO=TCP SPT=52982 DPT=9102 SEQ=4192499492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996EF680000000001030307) 
Feb 20 09:24:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:17.522 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4922 DF PROTO=TCP SPT=46256 DPT=9882 SEQ=246528347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5996F6AA0000000001030307) 
Feb 20 09:24:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:18.257 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:19 np0005625204.localdomain sshd[230893]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:24:19 np0005625204.localdomain sshd[230893]: Accepted publickey for zuul from 192.168.122.30 port 43136 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:24:19 np0005625204.localdomain systemd-logind[759]: New session 56 of user zuul.
Feb 20 09:24:19 np0005625204.localdomain systemd[1]: Started Session 56 of User zuul.
Feb 20 09:24:19 np0005625204.localdomain sshd[230893]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:24:20 np0005625204.localdomain python3.9[231004]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:24:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4924 DF PROTO=TCP SPT=46256 DPT=9882 SEQ=246528347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599702A80000000001030307) 
Feb 20 09:24:21 np0005625204.localdomain sudo[231116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlrxhukwgrrtuxmgkveakbrfhqivtyvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579461.1344433-66-248366533472884/AnsiballZ_systemd_service.py
Feb 20 09:24:21 np0005625204.localdomain sudo[231116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:21 np0005625204.localdomain python3.9[231118]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:24:21 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:24:22 np0005625204.localdomain systemd-rc-local-generator[231145]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:24:22 np0005625204.localdomain systemd-sysv-generator[231148]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:24:22 np0005625204.localdomain sshd[230586]: error: kex_exchange_identification: read: Connection timed out
Feb 20 09:24:22 np0005625204.localdomain sshd[230586]: banner exchange: Connection from 114.216.2.84 port 32864: Connection timed out
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:22 np0005625204.localdomain sudo[231116]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:22.524 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:23 np0005625204.localdomain python3.9[231261]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:24:23 np0005625204.localdomain network[231278]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:24:23 np0005625204.localdomain network[231279]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:24:23 np0005625204.localdomain network[231280]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:24:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:23.260 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:24:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4925 DF PROTO=TCP SPT=46256 DPT=9882 SEQ=246528347 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599712680000000001030307) 
Feb 20 09:24:27 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:27.527 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6925 DF PROTO=TCP SPT=48582 DPT=9105 SEQ=3218498153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59971D680000000001030307) 
Feb 20 09:24:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:28.263 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:29 np0005625204.localdomain sudo[231511]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdskdtmpxbqibqarzrtqfhxvafufkenp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579469.3425918-123-80106946266310/AnsiballZ_systemd_service.py
Feb 20 09:24:29 np0005625204.localdomain sudo[231511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6926 DF PROTO=TCP SPT=48582 DPT=9105 SEQ=3218498153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599725680000000001030307) 
Feb 20 09:24:29 np0005625204.localdomain python3.9[231513]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:24:29 np0005625204.localdomain sudo[231511]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:30 np0005625204.localdomain sudo[231622]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eterfgbubgfibsosuaoraqfwupmhrseb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579470.2755961-153-115260961040245/AnsiballZ_file.py
Feb 20 09:24:30 np0005625204.localdomain sudo[231622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:30 np0005625204.localdomain python3.9[231624]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:24:30 np0005625204.localdomain sudo[231622]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:30 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 76.6 (255 of 333 items), suggesting rotation.
Feb 20 09:24:30 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:24:30 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:24:30 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:24:30 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:24:31 np0005625204.localdomain sudo[231733]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqamhtjahjltavnfocoqvyywjxyhlclt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579471.1008768-177-278167992278874/AnsiballZ_file.py
Feb 20 09:24:31 np0005625204.localdomain sudo[231733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:31 np0005625204.localdomain python3.9[231735]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:24:31 np0005625204.localdomain sudo[231733]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:32 np0005625204.localdomain sudo[231843]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njjrzlzpdxzfyuxoauldkvliyiuuyojx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579471.8631113-204-278275567189663/AnsiballZ_command.py
Feb 20 09:24:32 np0005625204.localdomain sudo[231843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:32 np0005625204.localdomain python3.9[231845]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:24:32 np0005625204.localdomain sudo[231843]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:32 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:32.528 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31553 DF PROTO=TCP SPT=38784 DPT=9100 SEQ=1474243654 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599731680000000001030307) 
Feb 20 09:24:33 np0005625204.localdomain python3.9[231955]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:24:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:33.267 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:33 np0005625204.localdomain sudo[232063]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvizaurzdqpfdpobjvtlqbdbrehoaqiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579473.5306618-258-76144066698469/AnsiballZ_systemd_service.py
Feb 20 09:24:33 np0005625204.localdomain sudo[232063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:34 np0005625204.localdomain python3.9[232065]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:24:34 np0005625204.localdomain systemd-rc-local-generator[232090]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:24:34 np0005625204.localdomain systemd-sysv-generator[232096]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:34 np0005625204.localdomain sudo[232063]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:35 np0005625204.localdomain sudo[232209]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgenwilyksdipsdyrtjjqvtbspywgojv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579474.8562858-282-195480927811166/AnsiballZ_command.py
Feb 20 09:24:35 np0005625204.localdomain sudo[232209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:35 np0005625204.localdomain python3.9[232211]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:24:35 np0005625204.localdomain sudo[232209]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59407 DF PROTO=TCP SPT=54442 DPT=9101 SEQ=1734728933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59973F550000000001030307) 
Feb 20 09:24:36 np0005625204.localdomain sudo[232320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovfkyiecwsodflubwslrkqegpakyxaxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579476.6001072-309-202387384991753/AnsiballZ_file.py
Feb 20 09:24:36 np0005625204.localdomain sudo[232320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:37 np0005625204.localdomain python3.9[232322]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:24:37 np0005625204.localdomain sudo[232320]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:37 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:37.531 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:38.269 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:38 np0005625204.localdomain python3.9[232430]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59409 DF PROTO=TCP SPT=54442 DPT=9101 SEQ=1734728933 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59974B680000000001030307) 
Feb 20 09:24:40 np0005625204.localdomain sudo[232540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pslchoukfofajjdsyjdfswdyvvdinepw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579478.6576278-356-28560589070086/AnsiballZ_group.py
Feb 20 09:24:40 np0005625204.localdomain sudo[232540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:40 np0005625204.localdomain python3.9[232542]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 20 09:24:40 np0005625204.localdomain sudo[232540]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:41 np0005625204.localdomain sudo[232650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alfwwxuetwcqpseapptjzmrwgsmvqfyf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579480.6735969-390-87614036511037/AnsiballZ_getent.py
Feb 20 09:24:41 np0005625204.localdomain sudo[232650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:41 np0005625204.localdomain python3.9[232652]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 20 09:24:41 np0005625204.localdomain sudo[232650]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:41 np0005625204.localdomain sudo[232761]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjebtbdkuuhxjvzdkbimamrvekwskjiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579481.4976852-414-149241913787877/AnsiballZ_group.py
Feb 20 09:24:41 np0005625204.localdomain sudo[232761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6928 DF PROTO=TCP SPT=48582 DPT=9105 SEQ=3218498153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599755680000000001030307) 
Feb 20 09:24:42 np0005625204.localdomain python3.9[232763]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 20 09:24:42 np0005625204.localdomain groupadd[232764]: group added to /etc/group: name=ceilometer, GID=42405
Feb 20 09:24:42 np0005625204.localdomain groupadd[232764]: group added to /etc/gshadow: name=ceilometer
Feb 20 09:24:42 np0005625204.localdomain groupadd[232764]: new group: name=ceilometer, GID=42405
Feb 20 09:24:42 np0005625204.localdomain sudo[232761]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:42 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:42.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:42 np0005625204.localdomain sudo[232877]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tneofdhvvnljlmzhanutdjwfwpxbiafd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579482.2843485-438-183780363653224/AnsiballZ_user.py
Feb 20 09:24:42 np0005625204.localdomain sudo[232877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:42 np0005625204.localdomain python3.9[232879]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005625204.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 20 09:24:43 np0005625204.localdomain useradd[232881]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/1
Feb 20 09:24:43 np0005625204.localdomain useradd[232881]: add 'ceilometer' to group 'libvirt'
Feb 20 09:24:43 np0005625204.localdomain useradd[232881]: add 'ceilometer' to shadow group 'libvirt'
Feb 20 09:24:43 np0005625204.localdomain sudo[232877]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:43.271 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:24:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:24:44 np0005625204.localdomain podman[232964]: 2026-02-20 09:24:44.158400185 +0000 UTC m=+0.087255910 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:24:44 np0005625204.localdomain systemd[1]: tmp-crun.hSHIdx.mount: Deactivated successfully.
Feb 20 09:24:44 np0005625204.localdomain podman[232968]: 2026-02-20 09:24:44.224425098 +0000 UTC m=+0.150259240 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:24:44 np0005625204.localdomain podman[232968]: 2026-02-20 09:24:44.25791601 +0000 UTC m=+0.183750102 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:24:44 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:24:44 np0005625204.localdomain podman[232964]: 2026-02-20 09:24:44.276431131 +0000 UTC m=+0.205286846 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 09:24:44 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:24:44 np0005625204.localdomain python3.9[233015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:45 np0005625204.localdomain python3.9[233122]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771579483.9175606-516-158310086862961/.source.conf _original_basename=ceilometer.conf follow=False checksum=995f60cd4d2c51f98e8243d6429f9405f206b7a7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:24:45 np0005625204.localdomain python3.9[233230]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12352 DF PROTO=TCP SPT=51310 DPT=9102 SEQ=1273400263 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599765680000000001030307) 
Feb 20 09:24:46 np0005625204.localdomain python3.9[233316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771579485.16384-516-178969023023438/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:24:46 np0005625204.localdomain python3.9[233424]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:47 np0005625204.localdomain python3.9[233510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1771579486.1608555-516-45148613501225/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:24:47 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:47.539 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20221 DF PROTO=TCP SPT=37546 DPT=9882 SEQ=1961228338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59976BDB0000000001030307) 
Feb 20 09:24:47 np0005625204.localdomain python3.9[233618]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:48 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:48.303 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:48 np0005625204.localdomain python3.9[233726]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:49 np0005625204.localdomain python3.9[233834]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:49 np0005625204.localdomain python3.9[233920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579488.5519207-693-225965656600121/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=49793345ea5c32ea14c1cea30b8f951fecb6f4d8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:24:50 np0005625204.localdomain python3.9[234028]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:50 np0005625204.localdomain python3.9[234114]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579489.6796713-693-83274816232850/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=e2858327749c09c7b8ca5fc97985d7885b95bd4b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:24:50 np0005625204.localdomain sudo[234120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:24:50 np0005625204.localdomain sudo[234120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:24:50 np0005625204.localdomain sudo[234120]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20223 DF PROTO=TCP SPT=37546 DPT=9882 SEQ=1961228338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599777E80000000001030307) 
Feb 20 09:24:50 np0005625204.localdomain sudo[234150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:24:50 np0005625204.localdomain sudo[234150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:24:51 np0005625204.localdomain python3.9[234258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:51 np0005625204.localdomain podman[234416]: 2026-02-20 09:24:51.67111455 +0000 UTC m=+0.098701873 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, name=rhceph, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True)
Feb 20 09:24:51 np0005625204.localdomain python3.9[234412]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579490.7955-780-252689686215548/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:24:51 np0005625204.localdomain podman[234416]: 2026-02-20 09:24:51.804170069 +0000 UTC m=+0.231757392 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., release=1770267347, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:24:52 np0005625204.localdomain sudo[234150]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:52 np0005625204.localdomain sudo[234498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:24:52 np0005625204.localdomain sudo[234498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:24:52 np0005625204.localdomain sudo[234498]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:52 np0005625204.localdomain sudo[234516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:24:52 np0005625204.localdomain sudo[234516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:24:52 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:52.575 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:52 np0005625204.localdomain python3.9[234624]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:52 np0005625204.localdomain sudo[234516]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:53.305 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:53 np0005625204.localdomain python3.9[234765]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:53 np0005625204.localdomain sudo[234783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:24:53 np0005625204.localdomain sudo[234783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:24:53 np0005625204.localdomain sudo[234783]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:53 np0005625204.localdomain python3.9[234891]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:24:54 np0005625204.localdomain sshd[234909]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:24:54 np0005625204.localdomain sudo[235001]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjbyffkemkbvcqeephosrjgwvtsefegx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579494.1174662-907-134400408190456/AnsiballZ_file.py
Feb 20 09:24:54 np0005625204.localdomain sudo[235001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:54 np0005625204.localdomain python3.9[235003]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:24:54 np0005625204.localdomain sudo[235001]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20224 DF PROTO=TCP SPT=37546 DPT=9882 SEQ=1961228338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599787A80000000001030307) 
Feb 20 09:24:55 np0005625204.localdomain sudo[235111]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inkcwpvmeucunihfgqmfeurfvpqxzxhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579494.867279-931-11661977943660/AnsiballZ_systemd_service.py
Feb 20 09:24:55 np0005625204.localdomain sudo[235111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:55 np0005625204.localdomain python3.9[235113]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:24:55 np0005625204.localdomain systemd-rc-local-generator[235138]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:24:55 np0005625204.localdomain systemd-sysv-generator[235142]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:24:55 np0005625204.localdomain systemd[1]: Listening on Podman API Socket.
Feb 20 09:24:55 np0005625204.localdomain sudo[235111]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:55 np0005625204.localdomain sshd[234909]: Invalid user nutanix from 27.112.79.3 port 50960
Feb 20 09:24:56 np0005625204.localdomain sshd[234909]: Received disconnect from 27.112.79.3 port 50960:11: Bye Bye [preauth]
Feb 20 09:24:56 np0005625204.localdomain sshd[234909]: Disconnected from invalid user nutanix 27.112.79.3 port 50960 [preauth]
Feb 20 09:24:57 np0005625204.localdomain sshd[235171]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:24:57 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:57.578 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41859 DF PROTO=TCP SPT=35374 DPT=9105 SEQ=1328790190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599792A80000000001030307) 
Feb 20 09:24:58 np0005625204.localdomain sshd[235171]: Invalid user airflow from 54.36.99.29 port 55048
Feb 20 09:24:58 np0005625204.localdomain sshd[235171]: Received disconnect from 54.36.99.29 port 55048:11: Bye Bye [preauth]
Feb 20 09:24:58 np0005625204.localdomain sshd[235171]: Disconnected from invalid user airflow 54.36.99.29 port 55048 [preauth]
Feb 20 09:24:58 np0005625204.localdomain sudo[235263]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gynxnqpatirpghsnevteqksfumfuzbfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579497.8664367-957-230439466415150/AnsiballZ_stat.py
Feb 20 09:24:58 np0005625204.localdomain sudo[235263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:24:58.307 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:24:58 np0005625204.localdomain python3.9[235265]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:58 np0005625204.localdomain sudo[235263]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:58 np0005625204.localdomain sudo[235351]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfnuoruwcdkpnmgxzckoybsqvxjfkols ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579497.8664367-957-230439466415150/AnsiballZ_copy.py
Feb 20 09:24:58 np0005625204.localdomain sudo[235351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:58 np0005625204.localdomain python3.9[235353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579497.8664367-957-230439466415150/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:24:58 np0005625204.localdomain sudo[235351]: pam_unix(sudo:session): session closed for user root
Feb 20 09:24:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41860 DF PROTO=TCP SPT=35374 DPT=9105 SEQ=1328790190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59979AA80000000001030307) 
Feb 20 09:24:59 np0005625204.localdomain sudo[235406]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nejwtxaugjgwibexnwxfbuikiyydqfld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579497.8664367-957-230439466415150/AnsiballZ_stat.py
Feb 20 09:24:59 np0005625204.localdomain sudo[235406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:24:59 np0005625204.localdomain python3.9[235408]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:24:59 np0005625204.localdomain sudo[235406]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:00 np0005625204.localdomain sudo[235494]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbcgbqbtcvymrcmismdjigppfrsurxzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579497.8664367-957-230439466415150/AnsiballZ_copy.py
Feb 20 09:25:00 np0005625204.localdomain sudo[235494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:00 np0005625204.localdomain python3.9[235496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579497.8664367-957-230439466415150/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:25:00 np0005625204.localdomain sudo[235494]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:01.303 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:01.324 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 20 09:25:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:01.325 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:25:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:01.325 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:25:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:01.326 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:01.398 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:25:01 np0005625204.localdomain sudo[235604]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmnihgiwscftdirgqyokfksnqjezchlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579501.2262099-1053-12583454438844/AnsiballZ_file.py
Feb 20 09:25:01 np0005625204.localdomain sudo[235604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:01 np0005625204.localdomain python3.9[235606]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:01 np0005625204.localdomain sudo[235604]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:02 np0005625204.localdomain sudo[235714]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ioebkpssamussumcgxmupyzgatxqlwzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579501.918796-1077-143640769440230/AnsiballZ_file.py
Feb 20 09:25:02 np0005625204.localdomain sudo[235714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:02 np0005625204.localdomain python3.9[235716]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:25:02 np0005625204.localdomain sudo[235714]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:02 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:02.629 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:02 np0005625204.localdomain sudo[235824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsfqoezknwqbbsfiqiqjymtxsxaqjfod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579502.6353996-1101-92626507338827/AnsiballZ_stat.py
Feb 20 09:25:02 np0005625204.localdomain sudo[235824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20225 DF PROTO=TCP SPT=37546 DPT=9882 SEQ=1961228338 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997A7680000000001030307) 
Feb 20 09:25:03 np0005625204.localdomain python3.9[235826]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:03 np0005625204.localdomain sudo[235824]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:03.311 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:03 np0005625204.localdomain sudo[235914]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzcbkhjnztgsmlwsozuubcpzydolspig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579502.6353996-1101-92626507338827/AnsiballZ_copy.py
Feb 20 09:25:03 np0005625204.localdomain sudo[235914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:03 np0005625204.localdomain python3.9[235916]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579502.6353996-1101-92626507338827/.source.json _original_basename=.711o1j_e follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:03 np0005625204.localdomain sudo[235914]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:04 np0005625204.localdomain python3.9[236024]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:25:05.981 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:25:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:25:05.982 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:25:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:25:05.984 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:25:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45595 DF PROTO=TCP SPT=52082 DPT=9101 SEQ=1113708391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997B4850000000001030307) 
Feb 20 09:25:06 np0005625204.localdomain sudo[236326]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxfaydifnnutbhlkfokqsljyjspbmfbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579506.1734993-1221-231257823688970/AnsiballZ_container_config_data.py
Feb 20 09:25:06 np0005625204.localdomain sudo[236326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:06 np0005625204.localdomain python3.9[236328]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Feb 20 09:25:06 np0005625204.localdomain sudo[236326]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:07 np0005625204.localdomain sudo[236436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jocrhvflinocjnhedktuxvwhfhhpbsal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579507.192213-1253-109485601656714/AnsiballZ_container_config_hash.py
Feb 20 09:25:07 np0005625204.localdomain sudo[236436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:07 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:07.661 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:07 np0005625204.localdomain python3.9[236438]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:25:07 np0005625204.localdomain sudo[236436]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:08.313 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45597 DF PROTO=TCP SPT=52082 DPT=9101 SEQ=1113708391 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997C0A80000000001030307) 
Feb 20 09:25:09 np0005625204.localdomain sudo[236546]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoyvogaqlphejcvxdfrtvugkgstevpyv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579508.9924998-1283-6298907208686/AnsiballZ_edpm_container_manage.py
Feb 20 09:25:09 np0005625204.localdomain sudo[236546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:09 np0005625204.localdomain python3[236548]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:25:09 np0005625204.localdomain python3[236548]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369",
                                                                    "Digest": "sha256:ac1f7272c172d96937d32067aeabcc7fe133ed3e13c60a2317e815e24d8d2689",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:ac1f7272c172d96937d32067aeabcc7fe133ed3e13c60a2317e815e24d8d2689"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:22:47.562315026Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 506512639,
                                                                    "VirtualSize": 506512639,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:439ba2a9156018a21d5d8f457e8fb5fa9d39d0de094f0cf38abf8f5215170cd7",
                                                                              "sha256:dd5ae5ce1d5c4d01e233915d61f7cac1450768a920fde6603b0c84bf26180c44"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:04.692187463Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:07.73027664Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:50.46772776Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:52.957817153Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:22:08.791988588Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:22:47.559747806Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:22:51.022505453Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 20 09:25:10 np0005625204.localdomain podman[236598]: 2026-02-20 09:25:10.04810766 +0000 UTC m=+0.089270172 container remove cf8cea6460edf16e05e5d818108401d0537ef083461027fe5ceaea5e64b3780a (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'ed809cd151e1fa8da7409fe229c809b7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510)
Feb 20 09:25:10 np0005625204.localdomain python3[236548]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Feb 20 09:25:10 np0005625204.localdomain podman[236612]: 
Feb 20 09:25:10 np0005625204.localdomain podman[236612]: 2026-02-20 09:25:10.147751779 +0000 UTC m=+0.081698708 container create 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS)
Feb 20 09:25:10 np0005625204.localdomain podman[236612]: 2026-02-20 09:25:10.109265483 +0000 UTC m=+0.043212422 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 20 09:25:10 np0005625204.localdomain python3[236548]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer 
--volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Feb 20 09:25:10 np0005625204.localdomain sudo[236546]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:11.320 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:11.321 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:11.321 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:25:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:11.321 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:25:11 np0005625204.localdomain sudo[236756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdledbkdymlslyppvigimocitugzpzgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579511.2457952-1308-41313207371459/AnsiballZ_stat.py
Feb 20 09:25:11 np0005625204.localdomain sudo[236756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:11 np0005625204.localdomain python3.9[236758]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:25:11 np0005625204.localdomain sudo[236756]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41862 DF PROTO=TCP SPT=35374 DPT=9105 SEQ=1328790190 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997CB680000000001030307) 
Feb 20 09:25:12 np0005625204.localdomain sudo[236868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlckakdttravhjqhpavxeixjwlhqzdnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579512.0746899-1334-270080254566890/AnsiballZ_file.py
Feb 20 09:25:12 np0005625204.localdomain sudo[236868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.423 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.423 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.423 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.423 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:25:12 np0005625204.localdomain python3.9[236870]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:12 np0005625204.localdomain sudo[236868]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.696 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:12 np0005625204.localdomain sudo[236923]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckpvsmannpwjjpcrldmodmspjxxvguxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579512.0746899-1334-270080254566890/AnsiballZ_stat.py
Feb 20 09:25:12 np0005625204.localdomain sudo[236923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:12 np0005625204.localdomain python3.9[236925]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:25:12 np0005625204.localdomain sudo[236923]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.949 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.978 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.978 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.979 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.980 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.980 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.980 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.981 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.981 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.981 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:25:12 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:12.982 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.004 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.005 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.005 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.005 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.006 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:13 np0005625204.localdomain sudo[237052]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dlzlosmrqebzfixlaxfnocdupjocevvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579512.9726305-1334-157001882814262/AnsiballZ_copy.py
Feb 20 09:25:13 np0005625204.localdomain sudo[237052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.465 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.539 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.539 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:25:13 np0005625204.localdomain python3.9[237054]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579512.9726305-1334-157001882814262/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:13 np0005625204.localdomain sudo[237052]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.760 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.762 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12930MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.763 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.763 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.856 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.857 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.857 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:25:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:13.915 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:25:14 np0005625204.localdomain sudo[237129]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezgihktfloqifxluxaekhbzsazlrdotd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579512.9726305-1334-157001882814262/AnsiballZ_systemd.py
Feb 20 09:25:14 np0005625204.localdomain sudo[237129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:14.361 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:25:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:14.370 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:25:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:14.387 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:25:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:14.390 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:25:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:14.391 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:25:14 np0005625204.localdomain python3.9[237131]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:25:14 np0005625204.localdomain systemd-rc-local-generator[237186]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:25:14 np0005625204.localdomain systemd-sysv-generator[237191]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:25:14 np0005625204.localdomain podman[237136]: 2026-02-20 09:25:14.74014475 +0000 UTC m=+0.153287134 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:25:14 np0005625204.localdomain podman[237135]: 2026-02-20 09:25:14.697194586 +0000 UTC m=+0.110622589 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain podman[237136]: 2026-02-20 09:25:14.770290568 +0000 UTC m=+0.183432882 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:14 np0005625204.localdomain podman[237135]: 2026-02-20 09:25:14.826285593 +0000 UTC m=+0.239713636 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:25:14 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:25:14 np0005625204.localdomain sudo[237129]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:15 np0005625204.localdomain sudo[237267]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqdrwwpozlgaroerhysalghzhysojizt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579512.9726305-1334-157001882814262/AnsiballZ_systemd.py
Feb 20 09:25:15 np0005625204.localdomain sudo[237267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:15 np0005625204.localdomain python3.9[237269]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:25:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=333 DF PROTO=TCP SPT=46852 DPT=9100 SEQ=189675342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997DB680000000001030307) 
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:25:16 np0005625204.localdomain systemd-rc-local-generator[237299]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:25:16 np0005625204.localdomain systemd-sysv-generator[237303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:16 np0005625204.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Feb 20 09:25:17 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:25:17 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4517b7dede5bd11464bacedc9df1dbfc032d57d5ee8302842628702ec46daa3/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Feb 20 09:25:17 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4517b7dede5bd11464bacedc9df1dbfc032d57d5ee8302842628702ec46daa3/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Feb 20 09:25:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:25:17 np0005625204.localdomain podman[237311]: 2026-02-20 09:25:17.138617827 +0000 UTC m=+0.157241416 container init 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, 
org.label-schema.vendor=CentOS)
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + sudo -E kolla_set_configs
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: sudo: unable to send audit message: Operation not permitted
Feb 20 09:25:17 np0005625204.localdomain sudo[237332]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 20 09:25:17 np0005625204.localdomain sudo[237332]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 20 09:25:17 np0005625204.localdomain sudo[237332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 20 09:25:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:25:17 np0005625204.localdomain podman[237311]: 2026-02-20 09:25:17.189862215 +0000 UTC m=+0.208485804 container start 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:25:17 np0005625204.localdomain podman[237311]: ceilometer_agent_compute
Feb 20 09:25:17 np0005625204.localdomain systemd[1]: Started ceilometer_agent_compute container.
Feb 20 09:25:17 np0005625204.localdomain sudo[237267]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Validating config file
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Copying service configuration files
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: INFO:__main__:Writing out command to execute
Feb 20 09:25:17 np0005625204.localdomain sudo[237332]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: ++ cat /run_command
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + ARGS=
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + sudo kolla_copy_cacerts
Feb 20 09:25:17 np0005625204.localdomain sudo[237346]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: sudo: unable to send audit message: Operation not permitted
Feb 20 09:25:17 np0005625204.localdomain sudo[237346]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 20 09:25:17 np0005625204.localdomain sudo[237346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 20 09:25:17 np0005625204.localdomain sudo[237346]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + [[ ! -n '' ]]
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + . kolla_extend_start
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + umask 0022
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Feb 20 09:25:17 np0005625204.localdomain podman[237335]: 2026-02-20 09:25:17.312720351 +0000 UTC m=+0.116685977 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:25:17 np0005625204.localdomain podman[237335]: 2026-02-20 09:25:17.346088809 +0000 UTC m=+0.150054465 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:25:17 np0005625204.localdomain podman[237335]: unhealthy
Feb 20 09:25:17 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:25:17 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:25:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62614 DF PROTO=TCP SPT=49822 DPT=9882 SEQ=975647677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997E10A0000000001030307) 
Feb 20 09:25:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:17.731 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.974 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.974 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.974 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.974 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.974 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.975 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.976 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.977 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.978 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.979 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.980 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.981 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.982 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.983 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.984 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.985 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.986 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.987 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:17 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:17.988 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.009 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.010 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.012 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.113 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.172 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.172 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.172 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.172 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.172 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.173 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.174 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.175 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.176 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.177 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain python3.9[237464]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.178 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.179 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.180 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.181 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.182 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.183 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.184 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.185 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.186 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.187 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.188 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.189 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.190 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.190 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.191 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.197 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 20 09:25:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:18.316 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.503 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}2028e6ce9094494660f5cffcbc779343c02b97a747a97ce89085355a3a72cc91" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.654 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 327 Content-Type: application/json Date: Fri, 20 Feb 2026 09:25:18 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-d24d0e71-7918-43a2-84d8-17f69184ecbf x-openstack-request-id: req-d24d0e71-7918-43a2-84d8-17f69184ecbf _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.655 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "739ef37c-e459-414b-b65a-355581d54c7c", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/739ef37c-e459-414b-b65a-355581d54c7c"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/739ef37c-e459-414b-b65a-355581d54c7c"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.655 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-d24d0e71-7918-43a2-84d8-17f69184ecbf request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.656 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/739ef37c-e459-414b-b65a-355581d54c7c -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}2028e6ce9094494660f5cffcbc779343c02b97a747a97ce89085355a3a72cc91" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.685 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 494 Content-Type: application/json Date: Fri, 20 Feb 2026 09:25:18 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-a77aca17-6ec4-4e6f-9973-e6f70e628866 x-openstack-request-id: req-a77aca17-6ec4-4e6f-9973-e6f70e628866 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.685 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "739ef37c-e459-414b-b65a-355581d54c7c", "name": "m1.small", "ram": 512, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 1, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/739ef37c-e459-414b-b65a-355581d54c7c"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/739ef37c-e459-414b-b65a-355581d54c7c"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.685 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/739ef37c-e459-414b-b65a-355581d54c7c used request id req-a77aca17-6ec4-4e6f-9973-e6f70e628866 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.686 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.687 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.720 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.721 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b44f5b5-2aef-4383-aea2-d6df1c3b13f4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.687223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '128d5dd2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': 'a18454896edfb5f7734bdadde9f3cce158747744e7aedcba3231c8b33948e5a4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.687223', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '128d77cc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': 'a4879e6fe6e331d7ae47e9064cae9cb250a6ab0ede5b42fe235722f7b97c216c'}]}, 'timestamp': '2026-02-20 09:25:18.722093', '_unique_id': 'cec8311c687c4974a85b8d9e87921d85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.728 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.732 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.737 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f9924957-6cff-426e-9f03-c739820f4ff3 / tape7aa8e2a-27 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.737 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f34fac12-b876-49a9-b1d1-3848df0fa138', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.732461', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '128ff13c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '1bb83ad197fa061389810571d600852ad3931c09556a9d98608c9378fca0995e'}]}, 'timestamp': '2026-02-20 09:25:18.738329', '_unique_id': '44755e208fe2417a8b8c3b28f2eec7fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.739 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.740 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.740 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.741 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1efbadbd-6af5-4b7b-b78d-3e86910d06ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.740796', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1290657c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': 'c9d4334f3d24064454f5a7f81914dc2db6f76ac1a88b8d86e298167c5ec45a07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.740796', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '129076f2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': 'd6b3f584b96fbd19403255d18c885095073c89a2b2af339a1cd05607ff22c966'}]}, 'timestamp': '2026-02-20 09:25:18.741732', '_unique_id': '0a2b6cd076b74c28b59ce7544036dce5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.742 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.743 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.744 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5115541-d22d-4db4-b41d-980df592967b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.744098', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '1290ead8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': 'f49786c278c220c040f1d63cd8673175b6774803bac1e020c647fa92e807375e'}]}, 'timestamp': '2026-02-20 09:25:18.744786', '_unique_id': '034c6ef02be24a95a96cd4b7d392c48b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.745 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.746 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.747 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f283af4a-e989-4e7f-b64e-f4882262be23', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.747114', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '12915ff4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '61489f47521f5a1d970bf8a2ffae3e0849b96948e1df4d292fe022949758c840'}]}, 'timestamp': '2026-02-20 09:25:18.747789', '_unique_id': '16c674d29dac4df78ff8dcf0486d15e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.748 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.750 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.750 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.750 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba96f499-7a16-4977-bd5c-adddd1c486d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.750208', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1291d876-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': '53eb4f2c654fa07759839a38751e12b037e5c19c46ae3fc52b2579f158c4425d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.750208', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1291f068-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': '8f216b72d02754e617ffd7795ceee2553651937a4014e05475e14ad08e48c959'}]}, 'timestamp': '2026-02-20 09:25:18.751377', '_unique_id': 'a150e9d8d9c24621b93609c3781f1721'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.752 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.753 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.769 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.770 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4fa2a7e9-9d85-4a1b-aa28-3c0e165727e9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.753828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '1294cf18-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.993239878, 'message_signature': '6052cdb8e3d91abc84b301d100e932d422b2e2859007c3c1948240339a43a1a0'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.753828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '1294e246-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.993239878, 'message_signature': '617eab0665af928d0940164717f2a096bbc1ba23faaa69cc92aae1d40cc8ac71'}]}, 'timestamp': '2026-02-20 09:25:18.770691', '_unique_id': '5ecde3ad36ad44a69432b54d6b4feb8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.771 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.773 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.773 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.773 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5afb7ea-a279-44b8-bb2c-acedaa0469dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.773442', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '12956220-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.993239878, 'message_signature': 'd6595570b9faac88643a80ec27f457ca9ef0bf75efc141d309cfb1e82052d359'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.773442', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '129573be-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.993239878, 'message_signature': 'ecd7bb1152201ee861358005a0627d2d70277e3cda9b768635d368f914cacb3a'}]}, 'timestamp': '2026-02-20 09:25:18.774382', '_unique_id': '7bbe184841584572813d8110a34a3632'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.775 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.776 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.776 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.777 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.777 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.799 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3d7d8f0-e920-489d-b26c-39ef6da8367e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:25:18.777946', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '12996e88-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10078.03873254, 'message_signature': '4ff917d3b03c7abf4730f18e6a5131cf98cf0a3d8c35e824fe570393bc14bd71'}]}, 'timestamp': '2026-02-20 09:25:18.800480', '_unique_id': '9bb8155993ed4410a6320a6452d95364'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.801 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.802 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.803 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b6ffa72-59cc-4b33-8097-b5ab5df23429', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.803052', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '1299e584-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '6ca92f417168bf66ec0ffc3ad1ac4c58743f3822c1efcc01604a3e2324062ed3'}]}, 'timestamp': '2026-02-20 09:25:18.803530', '_unique_id': '4196f1731b96465bb82828f9d2ba8c14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.804 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.805 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.805 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.805 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.806 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.806 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c96b96ff-7de1-4953-9705-17eaf807e6b6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.806300', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '129a6450-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '3135dd027f8d136989229e2c1083af432aefc7053ee8df29f8db37bd1f54abff'}]}, 'timestamp': '2026-02-20 09:25:18.806805', '_unique_id': 'afbc43201b8847a583f27c1a3d93e493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.807 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.808 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.809 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.809 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.809 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.809 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2fd6a04f-d800-4569-aeb0-445249883351', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.809521', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '129ae402-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '24d28e902f9d2111fe8499503bcc231fb38e45e57fa1ec2d04a9d3b23e30008f'}]}, 'timestamp': '2026-02-20 09:25:18.810048', '_unique_id': '8806d1b797c544cca316b192f29f16fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.810 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.812 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.812 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.812 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de121259-e16c-4337-82f9-8ed5c47b4104', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.812234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '129b4bc2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.993239878, 'message_signature': 'bfff5ddc5e3c8df5d099c0f4bf3cd27dd7cd3f4a52fa93a999faa0326fe70a18'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.812234', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '129b5dce-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.993239878, 'message_signature': '3534c2905a7017a1808f70493cf4be0e564bca7eafbebdf0131d302b0baac205'}]}, 'timestamp': '2026-02-20 09:25:18.813134', '_unique_id': 'ff497b90928c4b3b99a55a85ade58e50'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.814 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.815 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.815 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6959e465-6f49-44dc-8062-4ec01a2b06f6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.815328', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '129bc4e4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '45649782c179180905612d26a0ef4c73862c3ee8a12efa572f131374b9a9233c'}]}, 'timestamp': '2026-02-20 09:25:18.815831', '_unique_id': '765e9270ef1b4031953da313f9f29443'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.816 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.817 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.818 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26daa04a-8fb5-4f95-b4f9-f2673c5d3500', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.817976', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '129c2c18-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '63bd041d26256378c42ce950dc7d3af6994f03666f97eb317fbdd7ca8e0babe0'}]}, 'timestamp': '2026-02-20 09:25:18.818441', '_unique_id': '568f5aacf7094111ade69efef4ff27e4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.819 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.820 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.820 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.821 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4547c9be-f539-40b2-94da-c6f05269a8d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.820590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '129c937e-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': 'b59034d3b0dfbea3a9ef52c6ea7f18fa45929fcbf8907f2b5eb8ea8bc48c8dfe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.820590', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '129ca3be-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': '288921891a49ef8935456215761ac0f0bbff0b822d38ca77781b676396ee8ac4'}]}, 'timestamp': '2026-02-20 09:25:18.821473', '_unique_id': '6bafe92df01e48d393150c9033db0888'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.822 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.823 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.823 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 57040000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39498e87-56ac-4793-abe4-b9df24672bc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 57040000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:25:18.823662', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '129d0a5c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10078.03873254, 'message_signature': '0570165c29b2f1486a5ba3284a9015b657d102d47a2f6e359c48a98217de32dc'}]}, 'timestamp': '2026-02-20 09:25:18.824138', '_unique_id': 'cec315c785c34d639501def0cea7dc33'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.824 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.825 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.825 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61350315-7ab2-41ae-a455-1621081f72d1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.825718', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '129d5764-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': 'df962bc6394adf693f2c1204270abc67764350b44f3f8728086965b2fb6611b3'}]}, 'timestamp': '2026-02-20 09:25:18.826015', '_unique_id': 'd10d492afe774eaf9d99b5ae2dfdbe1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.826 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.827 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.827 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.827 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '349862aa-7182-47f2-a1d8-f4a369dac09b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.827382', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '129d981e-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': 'c8962aa50b5008bbde29e3d1676f848f69c84b76731c70a8e6405bd53f0a08d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.827382', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '129da304-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': '7b286b7b7cc8b1445170b9d8d99dde1b553bbad34de702679390759307e2d75f'}]}, 'timestamp': '2026-02-20 09:25:18.827931', '_unique_id': '25fd57ae6ee94841a73284d887404230'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.828 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.829 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.829 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.829 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: test>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: test>]
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.829 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.829 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.829 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d3e1bd1-e0b3-4b98-a342-a85592c0e6ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:25:18.829663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '129df19c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': '97f6d5900af52bc07e4cf148cbebbbdc5bf5775fe7181bda74fd78423264a2ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:25:18.829663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '129dfbd8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.926468222, 'message_signature': '6d801f2a3640dcf1d193e29ab83d45f10a6dd7e714a47aaf3f15a0f17d1c7475'}]}, 'timestamp': '2026-02-20 09:25:18.830205', '_unique_id': 'a66876b879ba47ce8b8c5fb1e2ab27ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.830 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.831 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.831 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '189589c7-3097-49ff-94a4-57cc56bcb9a6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:25:18.831550', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '129e3c24-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10077.97185229, 'message_signature': '061fd3f5e86ba792d87f8dd9d0f98d0a904cd169ef0ae6bd827d22d277301a5b'}]}, 'timestamp': '2026-02-20 09:25:18.831868', '_unique_id': '372e052a499a4b848acdf729942e55b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:25:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:25:18.832 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:25:19 np0005625204.localdomain sudo[237579]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycfbvsqubaniwvvsitbjsbdrtgliuydx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579519.0352652-1469-72815883682193/AnsiballZ_stat.py
Feb 20 09:25:19 np0005625204.localdomain sudo[237579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:19 np0005625204.localdomain python3.9[237581]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:19 np0005625204.localdomain sudo[237579]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:19 np0005625204.localdomain sudo[237669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnylliowotujkwreztlxrfptonrxrygi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579519.0352652-1469-72815883682193/AnsiballZ_copy.py
Feb 20 09:25:19 np0005625204.localdomain sudo[237669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:20 np0005625204.localdomain python3.9[237671]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579519.0352652-1469-72815883682193/.source.yaml _original_basename=.fsp69cl4 follow=False checksum=759c0783fe604271cc6640bac1339e1b1de19d54 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:20 np0005625204.localdomain sudo[237669]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62616 DF PROTO=TCP SPT=49822 DPT=9882 SEQ=975647677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997ED330000000001030307) 
Feb 20 09:25:21 np0005625204.localdomain sudo[237779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nyfawxrdvktgrgprbtogbcgpsplvmhlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579520.9839-1514-204600063440327/AnsiballZ_stat.py
Feb 20 09:25:21 np0005625204.localdomain sudo[237779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:21 np0005625204.localdomain python3.9[237781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:21 np0005625204.localdomain sudo[237779]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:21 np0005625204.localdomain sudo[237867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtvghpburnbnwrxirhjqdhxysbbxuzoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579520.9839-1514-204600063440327/AnsiballZ_copy.py
Feb 20 09:25:21 np0005625204.localdomain sudo[237867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:22 np0005625204.localdomain python3.9[237869]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579520.9839-1514-204600063440327/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:25:22 np0005625204.localdomain sudo[237867]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:22 np0005625204.localdomain sshd[237887]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:25:22 np0005625204.localdomain sshd[237887]: Received disconnect from 18.221.252.160 port 36442:11: Bye Bye [preauth]
Feb 20 09:25:22 np0005625204.localdomain sshd[237887]: Disconnected from authenticating user root 18.221.252.160 port 36442 [preauth]
Feb 20 09:25:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:22.784 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:23 np0005625204.localdomain sudo[237979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qabgbadrxdyohvkiljeiiayzghgfdjnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579522.98955-1578-3822541662333/AnsiballZ_file.py
Feb 20 09:25:23 np0005625204.localdomain sudo[237979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:23.319 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:23 np0005625204.localdomain python3.9[237981]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:23 np0005625204.localdomain sudo[237979]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:24 np0005625204.localdomain sudo[238089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipjguvguqqcznnhxucdnapbsfyfamtbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579523.804724-1601-22513563967696/AnsiballZ_file.py
Feb 20 09:25:24 np0005625204.localdomain sudo[238089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:24 np0005625204.localdomain python3.9[238091]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:25:24 np0005625204.localdomain sudo[238089]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:24 np0005625204.localdomain sudo[238199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zytsndtwlwllcmakmywajnugnempclxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579524.5155168-1626-166919238843199/AnsiballZ_stat.py
Feb 20 09:25:24 np0005625204.localdomain sudo[238199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62617 DF PROTO=TCP SPT=49822 DPT=9882 SEQ=975647677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5997FCE90000000001030307) 
Feb 20 09:25:24 np0005625204.localdomain python3.9[238201]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:24 np0005625204.localdomain sudo[238199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:25 np0005625204.localdomain sudo[238256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eohtxwgivhqikmccvudiwqxqcfxxyxdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579524.5155168-1626-166919238843199/AnsiballZ_file.py
Feb 20 09:25:25 np0005625204.localdomain sudo[238256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:25 np0005625204.localdomain python3.9[238258]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.837yizgf recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:25 np0005625204.localdomain sudo[238256]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:26 np0005625204.localdomain python3.9[238366]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:28.320 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:28.322 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:28.322 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:25:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:28.322 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:28 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56187 DF PROTO=TCP SPT=43472 DPT=9105 SEQ=3099193119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599807A90000000001030307) 
Feb 20 09:25:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:28.438 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:28.439 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56188 DF PROTO=TCP SPT=43472 DPT=9105 SEQ=3099193119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59980FA80000000001030307) 
Feb 20 09:25:30 np0005625204.localdomain sudo[238668]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfzxtgiquhjrtmsfazmaiuqeyspnphhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579529.9610636-1737-172622895574521/AnsiballZ_container_config_data.py
Feb 20 09:25:30 np0005625204.localdomain sudo[238668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:30 np0005625204.localdomain python3.9[238670]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Feb 20 09:25:30 np0005625204.localdomain sudo[238668]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:31 np0005625204.localdomain sudo[238778]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqjrmrmxzqnnptwmucaulmeqrbktlknd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579531.7439287-1770-275748967730549/AnsiballZ_container_config_hash.py
Feb 20 09:25:31 np0005625204.localdomain sudo[238778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:32 np0005625204.localdomain python3.9[238780]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:25:32 np0005625204.localdomain sudo[238778]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=334 DF PROTO=TCP SPT=46852 DPT=9100 SEQ=189675342 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59981B680000000001030307) 
Feb 20 09:25:33 np0005625204.localdomain sudo[238888]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojhitywjllelmxebjjemmrvkklcnkwgu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579532.9641695-1799-180834236134729/AnsiballZ_edpm_container_manage.py
Feb 20 09:25:33 np0005625204.localdomain sudo[238888]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:33.440 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:33.441 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:33.441 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:25:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:33.441 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:33.485 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:33.486 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:33 np0005625204.localdomain python3[238890]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:25:33 np0005625204.localdomain podman[238929]: 
Feb 20 09:25:33 np0005625204.localdomain podman[238929]: 2026-02-20 09:25:33.805282091 +0000 UTC m=+0.078886191 container create f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible)
Feb 20 09:25:33 np0005625204.localdomain podman[238929]: 2026-02-20 09:25:33.763899436 +0000 UTC m=+0.037503586 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Feb 20 09:25:33 np0005625204.localdomain python3[238890]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /:/rootfs:ro --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl --path.rootfs=/rootfs
Feb 20 09:25:33 np0005625204.localdomain sudo[238888]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:35 np0005625204.localdomain sudo[239074]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuoaretwblkieyxxuqytqzojxvpblmev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579534.2183762-1823-102120644410541/AnsiballZ_stat.py
Feb 20 09:25:35 np0005625204.localdomain sudo[239074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:35 np0005625204.localdomain python3.9[239076]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:25:35 np0005625204.localdomain sudo[239074]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:35 np0005625204.localdomain sudo[239186]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpdlubjpzagccvmndnfcxbjaiiznlzyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579535.684381-1850-16367876794351/AnsiballZ_file.py
Feb 20 09:25:35 np0005625204.localdomain sudo[239186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:36 np0005625204.localdomain python3.9[239188]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:36 np0005625204.localdomain sudo[239186]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3590 DF PROTO=TCP SPT=36394 DPT=9101 SEQ=4247187860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599829B40000000001030307) 
Feb 20 09:25:36 np0005625204.localdomain sudo[239241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbwenwzoztxbycdwxawveothmklfijqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579535.684381-1850-16367876794351/AnsiballZ_stat.py
Feb 20 09:25:36 np0005625204.localdomain sudo[239241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:36 np0005625204.localdomain python3.9[239243]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:25:36 np0005625204.localdomain sudo[239241]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:37 np0005625204.localdomain sudo[239350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwlkhucghmqtfczmhenkmovbwiktrbkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579536.6829891-1850-70486703118710/AnsiballZ_copy.py
Feb 20 09:25:37 np0005625204.localdomain sudo[239350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:37 np0005625204.localdomain python3.9[239352]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579536.6829891-1850-70486703118710/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:37 np0005625204.localdomain sudo[239350]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:37 np0005625204.localdomain sudo[239405]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axinpvniudhkrflqlhhvbfoobrepcedf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579536.6829891-1850-70486703118710/AnsiballZ_systemd.py
Feb 20 09:25:37 np0005625204.localdomain sudo[239405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:37 np0005625204.localdomain python3.9[239407]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:25:37 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:25:37 np0005625204.localdomain systemd-rc-local-generator[239431]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:25:37 np0005625204.localdomain systemd-sysv-generator[239437]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:25:37 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:38 np0005625204.localdomain sudo[239405]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:38.487 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:38 np0005625204.localdomain sudo[239496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljwrgcuothnaxgckqvjovxztzlckogjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579536.6829891-1850-70486703118710/AnsiballZ_systemd.py
Feb 20 09:25:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:38.490 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:38.490 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:25:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:38.490 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:38 np0005625204.localdomain sudo[239496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:38.531 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:38.532 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:38 np0005625204.localdomain python3.9[239498]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:25:38 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:25:38 np0005625204.localdomain systemd-rc-local-generator[239523]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:25:38 np0005625204.localdomain systemd-sysv-generator[239527]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: Starting node_exporter container...
Feb 20 09:25:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3592 DF PROTO=TCP SPT=36394 DPT=9101 SEQ=4247187860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599835A80000000001030307) 
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:25:39 np0005625204.localdomain podman[239538]: 2026-02-20 09:25:39.396172425 +0000 UTC m=+0.146111692 container init f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.411Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.411Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.411Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.412Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.412Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.412Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.412Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.412Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.412Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=arp
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=bcache
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=bonding
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=cpu
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=edac
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=filefd
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=netclass
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=netdev
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=netstat
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=nfs
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=nvme
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=softnet
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=systemd
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=xfs
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.413Z caller=node_exporter.go:117 level=info collector=zfs
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.414Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Feb 20 09:25:39 np0005625204.localdomain node_exporter[239552]: ts=2026-02-20T09:25:39.414Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:25:39 np0005625204.localdomain podman[239538]: 2026-02-20 09:25:39.434024051 +0000 UTC m=+0.183963318 container start f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:25:39 np0005625204.localdomain podman[239538]: node_exporter
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: Started node_exporter container.
Feb 20 09:25:39 np0005625204.localdomain sudo[239496]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:39 np0005625204.localdomain podman[239561]: 2026-02-20 09:25:39.518312298 +0000 UTC m=+0.076024363 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:25:39 np0005625204.localdomain podman[239561]: 2026-02-20 09:25:39.529913096 +0000 UTC m=+0.087625151 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:25:39 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:25:40 np0005625204.localdomain python3.9[239689]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:25:41 np0005625204.localdomain sudo[239797]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kephyxkxppicemxundpcqufkmiuivyxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579541.1414497-1985-25697135077147/AnsiballZ_stat.py
Feb 20 09:25:41 np0005625204.localdomain sudo[239797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:41 np0005625204.localdomain python3.9[239799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:41 np0005625204.localdomain sudo[239797]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56190 DF PROTO=TCP SPT=43472 DPT=9105 SEQ=3099193119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59983F680000000001030307) 
Feb 20 09:25:42 np0005625204.localdomain sudo[239887]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leqfzqdqlxijagdhrvbxnunukzeumpfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579541.1414497-1985-25697135077147/AnsiballZ_copy.py
Feb 20 09:25:42 np0005625204.localdomain sudo[239887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:42 np0005625204.localdomain python3.9[239889]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579541.1414497-1985-25697135077147/.source.yaml _original_basename=.d3r2yfjz follow=False checksum=18a0b6e78403f3ab12e5e8e6e71bc7fd62c02b34 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:42 np0005625204.localdomain sudo[239887]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:42 np0005625204.localdomain sudo[239997]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecjjyoybdtxueqrroepnmyjvijvkrnvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579542.6860905-2030-14590386522893/AnsiballZ_stat.py
Feb 20 09:25:42 np0005625204.localdomain sudo[239997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:43 np0005625204.localdomain python3.9[239999]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:43 np0005625204.localdomain sudo[239997]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:43.532 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:43 np0005625204.localdomain sudo[240085]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fahziraezdjjhiomoayyxojkypabcvmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579542.6860905-2030-14590386522893/AnsiballZ_copy.py
Feb 20 09:25:43 np0005625204.localdomain sudo[240085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:44 np0005625204.localdomain python3.9[240087]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579542.6860905-2030-14590386522893/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:25:44 np0005625204.localdomain sudo[240085]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:25:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:25:45 np0005625204.localdomain podman[240106]: 2026-02-20 09:25:45.147577854 +0000 UTC m=+0.080635929 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:25:45 np0005625204.localdomain podman[240106]: 2026-02-20 09:25:45.156379838 +0000 UTC m=+0.089437913 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:25:45 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:25:45 np0005625204.localdomain systemd[1]: tmp-crun.6qL0Ch.mount: Deactivated successfully.
Feb 20 09:25:45 np0005625204.localdomain podman[240105]: 2026-02-20 09:25:45.257473327 +0000 UTC m=+0.192756766 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:25:45 np0005625204.localdomain podman[240105]: 2026-02-20 09:25:45.330345891 +0000 UTC m=+0.265629330 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:25:45 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:25:45 np0005625204.localdomain sudo[240238]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quopwmwjusdszbnfxotqoqezcvfmmlox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579545.384651-2094-37581838908078/AnsiballZ_file.py
Feb 20 09:25:45 np0005625204.localdomain sudo[240238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:45 np0005625204.localdomain python3.9[240240]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:45 np0005625204.localdomain sudo[240238]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50922 DF PROTO=TCP SPT=53608 DPT=9100 SEQ=3611478149 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59984F680000000001030307) 
Feb 20 09:25:46 np0005625204.localdomain sudo[240348]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxhfihntibobukdrqptpsuaclkgohezc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579546.0705636-2118-206014189996847/AnsiballZ_file.py
Feb 20 09:25:46 np0005625204.localdomain sudo[240348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:46 np0005625204.localdomain python3.9[240350]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:25:46 np0005625204.localdomain sudo[240348]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:47 np0005625204.localdomain sudo[240458]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lghfkwtqmutwpuyidnwcbrdqvvrfzsdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579546.768631-2142-153442180562691/AnsiballZ_stat.py
Feb 20 09:25:47 np0005625204.localdomain sudo[240458]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:47 np0005625204.localdomain python3.9[240460]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:25:47 np0005625204.localdomain sudo[240458]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:47 np0005625204.localdomain sudo[240515]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svcdaxhvzivlfkbnbzjwfsbwygtecjeo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579546.768631-2142-153442180562691/AnsiballZ_file.py
Feb 20 09:25:47 np0005625204.localdomain sudo[240515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:25:47 np0005625204.localdomain podman[240518]: 2026-02-20 09:25:47.559100931 +0000 UTC m=+0.061142396 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:25:47 np0005625204.localdomain podman[240518]: 2026-02-20 09:25:47.590047855 +0000 UTC m=+0.092089320 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:25:47 np0005625204.localdomain podman[240518]: unhealthy
Feb 20 09:25:47 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:25:47 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:25:47 np0005625204.localdomain python3.9[240517]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.o_kbh2h8 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36408 DF PROTO=TCP SPT=59782 DPT=9882 SEQ=722642973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998563A0000000001030307) 
Feb 20 09:25:47 np0005625204.localdomain sudo[240515]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:48 np0005625204.localdomain python3.9[240643]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:48 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:48.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:50 np0005625204.localdomain sudo[240945]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwefmlqsaqcqvomgugzalnipjztbuxna ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579550.0852256-2253-5866184268559/AnsiballZ_container_config_data.py
Feb 20 09:25:50 np0005625204.localdomain sudo[240945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:50 np0005625204.localdomain python3.9[240947]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 20 09:25:50 np0005625204.localdomain sudo[240945]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36410 DF PROTO=TCP SPT=59782 DPT=9882 SEQ=722642973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599862280000000001030307) 
Feb 20 09:25:51 np0005625204.localdomain sudo[241055]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pibkzapavfbrwchvyhzkfutfaayfzrsm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579551.0641594-2286-14684261817501/AnsiballZ_container_config_hash.py
Feb 20 09:25:51 np0005625204.localdomain sudo[241055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:51 np0005625204.localdomain python3.9[241057]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:25:51 np0005625204.localdomain sudo[241055]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:52 np0005625204.localdomain sudo[241165]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmmrtpupcqjbtgtzyyswpzxnjyktxjfx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579552.0234659-2315-30855661818438/AnsiballZ_edpm_container_manage.py
Feb 20 09:25:52 np0005625204.localdomain sudo[241165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:52 np0005625204.localdomain python3[241167]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:25:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:53.538 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:53.539 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:25:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:53.540 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:25:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:53.540 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:53.564 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:53.565 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:25:53 np0005625204.localdomain sudo[241194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:25:53 np0005625204.localdomain sudo[241194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:25:53 np0005625204.localdomain sudo[241194]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:53 np0005625204.localdomain sudo[241212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:25:53 np0005625204.localdomain sudo[241212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:25:54 np0005625204.localdomain podman[241181]: 2026-02-20 09:25:52.684092841 +0000 UTC m=+0.045860105 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 20 09:25:54 np0005625204.localdomain podman[241305]: 
Feb 20 09:25:54 np0005625204.localdomain podman[241305]: 2026-02-20 09:25:54.38360291 +0000 UTC m=+0.075609064 container create 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible)
Feb 20 09:25:54 np0005625204.localdomain podman[241305]: 2026-02-20 09:25:54.345345945 +0000 UTC m=+0.037352109 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 20 09:25:54 np0005625204.localdomain python3[241167]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 20 09:25:54 np0005625204.localdomain sudo[241212]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:54 np0005625204.localdomain sudo[241165]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36411 DF PROTO=TCP SPT=59782 DPT=9882 SEQ=722642973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599871E80000000001030307) 
Feb 20 09:25:55 np0005625204.localdomain sudo[241461]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmpctxiuksvgkphscemucrwuchcdnyws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579554.7952387-2339-268635492226839/AnsiballZ_stat.py
Feb 20 09:25:55 np0005625204.localdomain sudo[241461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:55 np0005625204.localdomain sudo[241463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:25:55 np0005625204.localdomain sudo[241463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:25:55 np0005625204.localdomain sudo[241463]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:55 np0005625204.localdomain python3.9[241477]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:25:55 np0005625204.localdomain sudo[241461]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:56 np0005625204.localdomain sudo[241591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzrrngpivnggxfxqstqbsfthteylbtvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579556.0944047-2366-84861639805043/AnsiballZ_file.py
Feb 20 09:25:56 np0005625204.localdomain sudo[241591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:56 np0005625204.localdomain python3.9[241593]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:56 np0005625204.localdomain sudo[241591]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:56 np0005625204.localdomain sudo[241646]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynymlqjwftoslsawqqajdsenkatzqdpp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579556.0944047-2366-84861639805043/AnsiballZ_stat.py
Feb 20 09:25:56 np0005625204.localdomain sudo[241646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:57 np0005625204.localdomain python3.9[241648]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:25:57 np0005625204.localdomain sudo[241646]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:57 np0005625204.localdomain sudo[241755]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvibjokwckvizjicrclmtsdaiwvmcber ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579557.1343648-2366-25527471250431/AnsiballZ_copy.py
Feb 20 09:25:57 np0005625204.localdomain sudo[241755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31635 DF PROTO=TCP SPT=54660 DPT=9105 SEQ=2177006928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59987CE90000000001030307) 
Feb 20 09:25:57 np0005625204.localdomain python3.9[241757]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579557.1343648-2366-25527471250431/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:25:57 np0005625204.localdomain sudo[241755]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:57 np0005625204.localdomain sudo[241810]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqklielkzjtwtbmcxeuntiinjvyuqeug ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579557.1343648-2366-25527471250431/AnsiballZ_systemd.py
Feb 20 09:25:57 np0005625204.localdomain sudo[241810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:58 np0005625204.localdomain python3.9[241812]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:25:58 np0005625204.localdomain rsyslogd[758]: imjournal: 4238 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:25:58 np0005625204.localdomain systemd-sysv-generator[241840]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:25:58 np0005625204.localdomain systemd-rc-local-generator[241834]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:58.562 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:25:58.565 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:25:58 np0005625204.localdomain sudo[241810]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:58 np0005625204.localdomain sudo[241900]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvwgyptouvyqrmddvcmyoevpkpivuhtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579557.1343648-2366-25527471250431/AnsiballZ_systemd.py
Feb 20 09:25:58 np0005625204.localdomain sudo[241900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:25:59 np0005625204.localdomain python3.9[241902]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:25:59 np0005625204.localdomain systemd-sysv-generator[241931]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:25:59 np0005625204.localdomain systemd-rc-local-generator[241927]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:25:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31636 DF PROTO=TCP SPT=54660 DPT=9105 SEQ=2177006928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599884E80000000001030307) 
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Starting podman_exporter container...
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:25:59 np0005625204.localdomain podman[241943]: 2026-02-20 09:25:59.862294533 +0000 UTC m=+0.173733827 container init 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:25:59 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 20 09:25:59 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 20 09:25:59 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 20 09:25:59 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:25:59.882Z caller=handler.go:105 level=info collector=container
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Starting Podman API Service...
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Started Podman API Service.
Feb 20 09:25:59 np0005625204.localdomain podman[241943]: 2026-02-20 09:25:59.894027059 +0000 UTC m=+0.205466393 container start 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:25:59 np0005625204.localdomain podman[241943]: podman_exporter
Feb 20 09:25:59 np0005625204.localdomain systemd[1]: Started podman_exporter container.
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="Setting parallel job count to 25"
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:25:59 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 20 09:25:59 np0005625204.localdomain sudo[241900]: pam_unix(sudo:session): session closed for user root
Feb 20 09:25:59 np0005625204.localdomain podman[241968]: time="2026-02-20T09:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:26:00 np0005625204.localdomain podman[241967]: 2026-02-20 09:26:00.567895535 +0000 UTC m=+0.667012170 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:26:00 np0005625204.localdomain podman[241967]: 2026-02-20 09:26:00.579958764 +0000 UTC m=+0.679075419 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:26:00 np0005625204.localdomain podman[241967]: unhealthy
Feb 20 09:26:01 np0005625204.localdomain python3.9[242112]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:26:02 np0005625204.localdomain sudo[242220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-extbwbgzawfblznszjyodeieuikcbtfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579562.472272-2502-233918779226488/AnsiballZ_stat.py
Feb 20 09:26:02 np0005625204.localdomain sudo[242220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:26:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36412 DF PROTO=TCP SPT=59782 DPT=9882 SEQ=722642973 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599891690000000001030307) 
Feb 20 09:26:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:26:02 np0005625204.localdomain python3.9[242222]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:26:02 np0005625204.localdomain sudo[242220]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:26:03 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:26:03 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:26:03 np0005625204.localdomain sudo[242310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkvpvfncpptylatiqjdnluavmhxxvgdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579562.472272-2502-233918779226488/AnsiballZ_copy.py
Feb 20 09:26:03 np0005625204.localdomain sudo[242310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:03 np0005625204.localdomain python3.9[242312]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579562.472272-2502-233918779226488/.source.yaml _original_basename=.7fgbausk follow=False checksum=dae36056a950a4131d7691afd655cacfc03f4930 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:03 np0005625204.localdomain sudo[242310]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:03.566 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:03.568 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:03.568 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:03.568 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:03.605 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:03.606 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:04 np0005625204.localdomain sudo[242420]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmslivvtgcavkghqbnypkvypipuxalep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579563.757594-2546-4685911450116/AnsiballZ_stat.py
Feb 20 09:26:04 np0005625204.localdomain sudo[242420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:04 np0005625204.localdomain python3.9[242422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:26:04 np0005625204.localdomain sudo[242420]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:26:05 np0005625204.localdomain sudo[242508]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tapxzyfafwwyjrzbspsmnshxjuezbqcb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579563.757594-2546-4685911450116/AnsiballZ_copy.py
Feb 20 09:26:05 np0005625204.localdomain sudo[242508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:26:05 np0005625204.localdomain python3.9[242510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579563.757594-2546-4685911450116/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:26:05 np0005625204.localdomain sudo[242508]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:26:05.984 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:26:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:26:05.988 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:26:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:26:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:26:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28721 DF PROTO=TCP SPT=37896 DPT=9101 SEQ=1703451543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59989EE50000000001030307) 
Feb 20 09:26:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:06 np0005625204.localdomain sudo[242618]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giampnhoxgjsbvqceilgcvfdaskvzyvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579566.083437-2610-183643433057813/AnsiballZ_file.py
Feb 20 09:26:06 np0005625204.localdomain sudo[242618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:06 np0005625204.localdomain python3.9[242620]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:06 np0005625204.localdomain sudo[242618]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:07 np0005625204.localdomain sudo[242728]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syphnbxnguhiggtyfcfxvdzzifdgebyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579567.5941906-2634-176397612384006/AnsiballZ_file.py
Feb 20 09:26:07 np0005625204.localdomain sudo[242728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:08 np0005625204.localdomain python3.9[242730]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:26:08 np0005625204.localdomain sudo[242728]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:08 np0005625204.localdomain sudo[242838]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbblmrcmqxzckucctfiikuctxjlaopmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579568.3096473-2657-260708091032150/AnsiballZ_stat.py
Feb 20 09:26:08 np0005625204.localdomain sudo[242838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.606 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.633 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.634 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:08.637 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:08 np0005625204.localdomain python3.9[242840]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:26:08 np0005625204.localdomain sudo[242838]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:09 np0005625204.localdomain sudo[242895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtaqtwabuhurmubbnprmejsrtfwgkmkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579568.3096473-2657-260708091032150/AnsiballZ_file.py
Feb 20 09:26:09 np0005625204.localdomain sudo[242895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:09 np0005625204.localdomain python3.9[242897]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.pg83jp83 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:09 np0005625204.localdomain sudo[242895]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28723 DF PROTO=TCP SPT=37896 DPT=9101 SEQ=1703451543 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998AAE80000000001030307) 
Feb 20 09:26:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:26:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:26:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5-merged.mount: Deactivated successfully.
Feb 20 09:26:09 np0005625204.localdomain podman[242953]: 2026-02-20 09:26:09.686673687 +0000 UTC m=+0.069463567 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:26:09 np0005625204.localdomain podman[242953]: 2026-02-20 09:26:09.695700157 +0000 UTC m=+0.078490047 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:26:09 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:26:09 np0005625204.localdomain python3.9[243028]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bd78b9713dd3e99e643b00393778754778aa559cd714791929ade977105955f5-merged.mount: Deactivated successfully.
Feb 20 09:26:12 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31638 DF PROTO=TCP SPT=54660 DPT=9105 SEQ=2177006928 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998B5680000000001030307) 
Feb 20 09:26:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:26:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 20 09:26:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:13.635 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.366 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.366 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.385 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.385 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.386 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.470 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.470 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.470 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.471 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:26:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:14.946 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.000 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.000 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.001 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.002 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.015 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.015 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.015 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.016 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.016 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:26:15 np0005625204.localdomain sudo[243351]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhbjqtiwilkbvtjhpdbxqxkvnobaifmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579574.9323103-2770-75985211269072/AnsiballZ_container_config_data.py
Feb 20 09:26:15 np0005625204.localdomain sudo[243351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:26:15 np0005625204.localdomain systemd[1]: tmp-crun.UG6X66.mount: Deactivated successfully.
Feb 20 09:26:15 np0005625204.localdomain podman[243353]: 2026-02-20 09:26:15.307672718 +0000 UTC m=+0.089667939 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:26:15 np0005625204.localdomain podman[243353]: 2026-02-20 09:26:15.316893914 +0000 UTC m=+0.098889085 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:26:15 np0005625204.localdomain python3.9[243354]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 20 09:26:15 np0005625204.localdomain sudo[243351]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.450 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.496 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.496 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.663 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.664 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12644MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.665 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.665 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:26:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.759 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.759 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.759 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:26:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:26:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:15.820 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:26:16 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:26:16 np0005625204.localdomain podman[243373]: 2026-02-20 09:26:16.084977162 +0000 UTC m=+0.324707547 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 20 09:26:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36127 DF PROTO=TCP SPT=44482 DPT=9102 SEQ=170331392 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998C5680000000001030307) 
Feb 20 09:26:16 np0005625204.localdomain podman[243373]: 2026-02-20 09:26:16.180040395 +0000 UTC m=+0.419770770 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:26:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:26:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:16.320 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:26:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:16.328 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:26:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:16.348 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:26:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:16.350 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:26:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:16.351 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:26:16 np0005625204.localdomain sudo[243526]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icvdcghwprirfbazqwuyozetbmaqhffz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579576.1931436-2802-222719057455977/AnsiballZ_container_config_hash.py
Feb 20 09:26:16 np0005625204.localdomain sudo[243526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:16 np0005625204.localdomain python3.9[243528]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:26:16 np0005625204.localdomain sudo[243526]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:17 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:26:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:26:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32882 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998CB6A0000000001030307) 
Feb 20 09:26:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:17 np0005625204.localdomain podman[243578]: 2026-02-20 09:26:17.73841999 +0000 UTC m=+0.091973676 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 20 09:26:17 np0005625204.localdomain podman[243578]: 2026-02-20 09:26:17.776012955 +0000 UTC m=+0.129566641 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute)
Feb 20 09:26:17 np0005625204.localdomain podman[243578]: unhealthy
Feb 20 09:26:17 np0005625204.localdomain sudo[243654]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gizeheybdzomtwzbrtlrvjrmdkplalxs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579577.5881782-2831-70260829342222/AnsiballZ_edpm_container_manage.py
Feb 20 09:26:17 np0005625204.localdomain sudo[243654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:18 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:26:18 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:26:18 np0005625204.localdomain python3[243656]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.675 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.675 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5037 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.675 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.676 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.676 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:18.678 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 20 09:26:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d-merged.mount: Deactivated successfully.
Feb 20 09:26:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c773c83c6503477114a5b4bf49e71270791ffb8bdafb74f6f588401adb71807d-merged.mount: Deactivated successfully.
Feb 20 09:26:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32884 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998D7680000000001030307) 
Feb 20 09:26:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:26:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:26:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:26:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:23.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:23.682 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:23.682 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:23.682 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:23.721 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:23.722 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:26:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:26:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32885 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998E7280000000001030307) 
Feb 20 09:26:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:26 np0005625204.localdomain podman[243672]: 2026-02-20 09:26:20.421698333 +0000 UTC m=+0.046798782 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c
Feb 20 09:26:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40765 DF PROTO=TCP SPT=54628 DPT=9105 SEQ=2569274226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998F2280000000001030307) 
Feb 20 09:26:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:28 np0005625204.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 20 09:26:28 np0005625204.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 20 09:26:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:28.722 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:28 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:28.724 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40766 DF PROTO=TCP SPT=54628 DPT=9105 SEQ=2569274226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5998FA290000000001030307) 
Feb 20 09:26:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:26:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fe283b2661fd1264356cfa0f1ee8829c20c69c30fbf0fb0b95461c38422d3260-merged.mount: Deactivated successfully.
Feb 20 09:26:30 np0005625204.localdomain kernel: overlayfs: upperdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 20 09:26:30 np0005625204.localdomain kernel: overlayfs: workdir is in-use as upperdir/workdir of another mount, accessing files from both mounts will result in undefined behavior.
Feb 20 09:26:30 np0005625204.localdomain podman[243732]: 2026-02-20 09:26:28.137945189 +0000 UTC m=+0.050716845 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c
Feb 20 09:26:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 20 09:26:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 20 09:26:31 np0005625204.localdomain podman[243732]: 
Feb 20 09:26:31 np0005625204.localdomain podman[243732]: 2026-02-20 09:26:31.368435171 +0000 UTC m=+3.281206777 container create 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, version=9.7, config_id=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.openshift.expose-services=, release=1770267347, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:26:32 np0005625204.localdomain python3[243656]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume 
/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c
Feb 20 09:26:32 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:32 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32886 DF PROTO=TCP SPT=57704 DPT=9882 SEQ=1406770644 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599907680000000001030307) 
Feb 20 09:26:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:26:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:33.725 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:33.727 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:33.728 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:33.728 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:33.780 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:33 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:33.781 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 20 09:26:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-53ada3da4ca04351bf169e5d627c0fcff441ff8e221128687b0e29666c5bc26c-merged.mount: Deactivated successfully.
Feb 20 09:26:34 np0005625204.localdomain sudo[243654]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:34 np0005625204.localdomain podman[243768]: 2026-02-20 09:26:34.122005813 +0000 UTC m=+0.828063858 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:26:34 np0005625204.localdomain podman[243768]: 2026-02-20 09:26:34.157011074 +0000 UTC m=+0.863069169 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:26:34 np0005625204.localdomain podman[243768]: unhealthy
Feb 20 09:26:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:26:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 09:26:35 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:26:35 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:26:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18328 DF PROTO=TCP SPT=57176 DPT=9101 SEQ=3187546761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599914140000000001030307) 
Feb 20 09:26:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:26:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:38.782 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:38.784 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:38.784 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:38.785 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:38.814 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:38 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:38.815 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18330 DF PROTO=TCP SPT=57176 DPT=9101 SEQ=3187546761 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599920290000000001030307) 
Feb 20 09:26:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:26:40 np0005625204.localdomain podman[243808]: 2026-02-20 09:26:40.144358183 +0000 UTC m=+0.081168834 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:26:40 np0005625204.localdomain podman[243808]: 2026-02-20 09:26:40.153201439 +0000 UTC m=+0.090012120 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:26:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 09:26:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-271fbe47d50a90f03735a26a1ff5b20e2027c13cb6e9d5c8a6a9112793cd7c92-merged.mount: Deactivated successfully.
Feb 20 09:26:40 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:26:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 20 09:26:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40768 DF PROTO=TCP SPT=54628 DPT=9105 SEQ=2569274226 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599929680000000001030307) 
Feb 20 09:26:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:41 np0005625204.localdomain sudo[243921]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iqzrdptgrawpbllmzlgzvtaapdaoiumd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579601.5764792-2855-76574783195026/AnsiballZ_stat.py
Feb 20 09:26:41 np0005625204.localdomain sudo[243921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:41 np0005625204.localdomain sshd[243924]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:26:42 np0005625204.localdomain python3.9[243923]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:26:42 np0005625204.localdomain sudo[243921]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:42 np0005625204.localdomain sudo[244035]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pissbhrfeoqvsbwgtexqpowrizymgviv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579602.3643603-2882-161129755338584/AnsiballZ_file.py
Feb 20 09:26:42 np0005625204.localdomain sudo[244035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:42 np0005625204.localdomain sshd[243924]: Invalid user sol from 45.148.10.240 port 43880
Feb 20 09:26:42 np0005625204.localdomain python3.9[244037]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:42 np0005625204.localdomain sudo[244035]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:42 np0005625204.localdomain sshd[243924]: Connection closed by invalid user sol 45.148.10.240 port 43880 [preauth]
Feb 20 09:26:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 20 09:26:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0b03ed83be81af8ca31d355d34bc84741adbeedeb0b33580fe27349115e799d7-merged.mount: Deactivated successfully.
Feb 20 09:26:43 np0005625204.localdomain sudo[244090]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzrrueeffenpbrgwfmuehwjvtkuuruac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579602.3643603-2882-161129755338584/AnsiballZ_stat.py
Feb 20 09:26:43 np0005625204.localdomain sudo[244090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:43 np0005625204.localdomain python3.9[244092]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:26:43 np0005625204.localdomain sudo[244090]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:43 np0005625204.localdomain sudo[244199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlgeenalhhfgwajlnsmgxrmjwdnfmxfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579603.3670156-2882-133781798226221/AnsiballZ_copy.py
Feb 20 09:26:43 np0005625204.localdomain sudo[244199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 20 09:26:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 20 09:26:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:43.816 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:43.818 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:43.818 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:43.818 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 20 09:26:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:43.860 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:43 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:43.861 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:43 np0005625204.localdomain python3.9[244201]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579603.3670156-2882-133781798226221/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:43 np0005625204.localdomain sudo[244199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:44 np0005625204.localdomain sudo[244254]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqfrxjtuzkdbdbrndovdfgmenyvibict ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579603.3670156-2882-133781798226221/AnsiballZ_systemd.py
Feb 20 09:26:44 np0005625204.localdomain sudo[244254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:44 np0005625204.localdomain python3.9[244256]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:26:44 np0005625204.localdomain systemd-rc-local-generator[244284]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:26:44 np0005625204.localdomain systemd-sysv-generator[244287]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 20 09:26:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:44 np0005625204.localdomain sudo[244254]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:45 np0005625204.localdomain sudo[244345]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wzvvwyujkrwtfppwnivpawpqraxkbozc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579603.3670156-2882-133781798226221/AnsiballZ_systemd.py
Feb 20 09:26:45 np0005625204.localdomain sudo[244345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:45 np0005625204.localdomain python3.9[244347]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:26:45 np0005625204.localdomain systemd-sysv-generator[244375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:26:45 np0005625204.localdomain systemd-rc-local-generator[244372]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:26:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9499 DF PROTO=TCP SPT=45642 DPT=9102 SEQ=2999730602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599939680000000001030307) 
Feb 20 09:26:45 np0005625204.localdomain systemd[1]: Starting openstack_network_exporter container...
Feb 20 09:26:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 20 09:26:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:26:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-33265afbb0ab1192cc35fd8be9e517c4969c8f23f7a1676738a90556ed12fe7c-merged.mount: Deactivated successfully.
Feb 20 09:26:46 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:26:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0619ed2bc9c42056f6c58d6012f63756e6012dafd4c93436ed775c2fbc752107/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 20 09:26:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0619ed2bc9c42056f6c58d6012f63756e6012dafd4c93436ed775c2fbc752107/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 20 09:26:46 np0005625204.localdomain podman[244399]: 2026-02-20 09:26:46.514230069 +0000 UTC m=+0.109641681 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:26:46 np0005625204.localdomain podman[244399]: 2026-02-20 09:26:46.521793755 +0000 UTC m=+0.117205377 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:26:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:26:46 np0005625204.localdomain podman[244387]: 2026-02-20 09:26:46.557185091 +0000 UTC m=+0.586030566 container init 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9)
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *bridge.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *coverage.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *datapath.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *iface.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *memory.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *ovn.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *pmd_perf.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *pmd_rxq.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: INFO    09:26:46 main.go:48: registering *vswitch.Collector
Feb 20 09:26:46 np0005625204.localdomain openstack_network_exporter[244414]: NOTICE  09:26:46 main.go:82: listening on http://:9105/metrics
Feb 20 09:26:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:26:46 np0005625204.localdomain podman[244387]: 2026-02-20 09:26:46.585294597 +0000 UTC m=+0.614140062 container start 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, version=9.7, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 20 09:26:46 np0005625204.localdomain podman[244387]: openstack_network_exporter
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: Started openstack_network_exporter container.
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:26:47 np0005625204.localdomain sudo[244345]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:47 np0005625204.localdomain podman[244429]: 2026-02-20 09:26:47.156451826 +0000 UTC m=+0.566145593 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=starting, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:26:47 np0005625204.localdomain podman[244429]: 2026-02-20 09:26:47.188120359 +0000 UTC m=+0.597814106 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Feb 20 09:26:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5084 DF PROTO=TCP SPT=44340 DPT=9882 SEQ=2692118219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999409B0000000001030307) 
Feb 20 09:26:47 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:26:47 np0005625204.localdomain podman[244440]: 2026-02-20 09:26:47.823808388 +0000 UTC m=+0.718940560 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 20 09:26:47 np0005625204.localdomain podman[244440]: 2026-02-20 09:26:47.900813639 +0000 UTC m=+0.795945871 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:26:48 np0005625204.localdomain podman[244493]: 2026-02-20 09:26:48.210501096 +0000 UTC m=+0.066753312 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:26:48 np0005625204.localdomain podman[244493]: 2026-02-20 09:26:48.21797187 +0000 UTC m=+0.074224116 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:26:48 np0005625204.localdomain podman[244493]: unhealthy
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:26:48 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:48.861 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:48 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:48.863 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:26:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-66b4607051ec4b678b98370429ea66c5b0f53009a9a85441acbc9ac68d517903-merged.mount: Deactivated successfully.
Feb 20 09:26:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 20 09:26:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 20 09:26:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5086 DF PROTO=TCP SPT=44340 DPT=9882 SEQ=2692118219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59994CA80000000001030307) 
Feb 20 09:26:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 20 09:26:51 np0005625204.localdomain python3.9[244601]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:26:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:26:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 20 09:26:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 20 09:26:52 np0005625204.localdomain sudo[244709]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjjjaqocmruxeeqypzfmgsqaanjqfxum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579612.5429497-3018-118334797304688/AnsiballZ_stat.py
Feb 20 09:26:52 np0005625204.localdomain sudo[244709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:53 np0005625204.localdomain python3.9[244711]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:26:53 np0005625204.localdomain sudo[244709]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:26:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:26:53 np0005625204.localdomain sudo[244799]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxrbxjevjsihmpzyhkycpmxnuopwxoeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579612.5429497-3018-118334797304688/AnsiballZ_copy.py
Feb 20 09:26:53 np0005625204.localdomain sudo[244799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:26:53 np0005625204.localdomain python3.9[244801]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579612.5429497-3018-118334797304688/.source.yaml _original_basename=.zyvf5_1v follow=False checksum=3d9c806251215c5317a47411279e51c792f2fd64 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:26:53 np0005625204.localdomain sudo[244799]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 20 09:26:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:26:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:26:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:53.865 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:53.867 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:53.868 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:53.868 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:53.893 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:53.893 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:54 np0005625204.localdomain sudo[244909]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vibnhqiuqiczcfulnlpafwadoiatiihk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579613.8120415-3063-33129159382393/AnsiballZ_find.py
Feb 20 09:26:54 np0005625204.localdomain sudo[244909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:26:54 np0005625204.localdomain python3.9[244911]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:26:54 np0005625204.localdomain sudo[244909]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5087 DF PROTO=TCP SPT=44340 DPT=9882 SEQ=2692118219 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59995C680000000001030307) 
Feb 20 09:26:55 np0005625204.localdomain sudo[244929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:26:55 np0005625204.localdomain sudo[244929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:26:55 np0005625204.localdomain sudo[244929]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:55 np0005625204.localdomain sudo[244947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:26:55 np0005625204.localdomain sudo[244947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:26:55 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 20 09:26:56 np0005625204.localdomain sudo[244947]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 20 09:26:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33357 DF PROTO=TCP SPT=40374 DPT=9105 SEQ=522897056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599967280000000001030307) 
Feb 20 09:26:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:26:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:58.894 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:58.896 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:26:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:58.896 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:26:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:58.896 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:58.916 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:26:58 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:26:58.917 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:26:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:26:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:59 np0005625204.localdomain sudo[244997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:26:59 np0005625204.localdomain sudo[244997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:26:59 np0005625204.localdomain sudo[244997]: pam_unix(sudo:session): session closed for user root
Feb 20 09:26:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:26:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33358 DF PROTO=TCP SPT=40374 DPT=9105 SEQ=522897056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59996F280000000001030307) 
Feb 20 09:27:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 20 09:27:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a-merged.mount: Deactivated successfully.
Feb 20 09:27:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bc671c147b0a6bc2addeae32ae0394502c615f26aaaa88aba4e93f9affd3e24a-merged.mount: Deactivated successfully.
Feb 20 09:27:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 20 09:27:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 20 09:27:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51020 DF PROTO=TCP SPT=60852 DPT=9100 SEQ=2056959636 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59997B690000000001030307) 
Feb 20 09:27:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 20 09:27:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0b4a27664720ce930aee8034c0e3a2e981bce86564061fc7e3c5cc60116ab629-merged.mount: Deactivated successfully.
Feb 20 09:27:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:03.917 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:03.920 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:03.920 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:03.920 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:03.962 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:03 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:03.963 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:27:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:27:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:27:05 np0005625204.localdomain podman[245015]: 2026-02-20 09:27:05.681260634 +0000 UTC m=+0.102410115 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:27:05 np0005625204.localdomain podman[245015]: 2026-02-20 09:27:05.691958926 +0000 UTC m=+0.113108387 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:27:05 np0005625204.localdomain podman[245015]: unhealthy
Feb 20 09:27:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:27:05.985 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:27:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:27:05.986 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:27:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:27:05.987 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:27:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46618 DF PROTO=TCP SPT=41488 DPT=9101 SEQ=1518676747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599989440000000001030307) 
Feb 20 09:27:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:07 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:27:07 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:27:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:08.963 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:08.965 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:08.965 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:08.965 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:08.995 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:08 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:08.995 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46620 DF PROTO=TCP SPT=41488 DPT=9101 SEQ=1518676747 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599995680000000001030307) 
Feb 20 09:27:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:27:11 np0005625204.localdomain podman[245039]: 2026-02-20 09:27:11.150393578 +0000 UTC m=+0.084026453 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:27:11 np0005625204.localdomain podman[245039]: 2026-02-20 09:27:11.158840059 +0000 UTC m=+0.092472914 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:27:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:27:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e-merged.mount: Deactivated successfully.
Feb 20 09:27:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-63f33056b00261d0e07f47c80ba10ef73a797672a3169ee41fd4894170668f6e-merged.mount: Deactivated successfully.
Feb 20 09:27:11 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:27:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33360 DF PROTO=TCP SPT=40374 DPT=9105 SEQ=522897056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59999F690000000001030307) 
Feb 20 09:27:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 20 09:27:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 20 09:27:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:13.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 20 09:27:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-27ac25f75ac951fbeef2be74c2898e3e141e5c323a5908632b2bdca4094605f7-merged.mount: Deactivated successfully.
Feb 20 09:27:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48963 DF PROTO=TCP SPT=37642 DPT=9102 SEQ=2769952032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999AF680000000001030307) 
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.353 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.354 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.354 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.354 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.503 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.503 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.503 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:27:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:16.504 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:27:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:27:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:27:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.040 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.230 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.230 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.231 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.231 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.232 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.233 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.233 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.262 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.262 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.263 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.263 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.263 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:27:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57961 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999B5CA0000000001030307) 
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.731 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.831 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:27:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:17.832 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.008 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.009 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12623MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.009 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.009 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.138 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.139 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.139 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:27:18 np0005625204.localdomain podman[245082]: 2026-02-20 09:27:18.144756565 +0000 UTC m=+0.080210300 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, maintainer=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:27:18 np0005625204.localdomain podman[245082]: 2026-02-20 09:27:18.16000651 +0000 UTC m=+0.095460265 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.190 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.201 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.208 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd627cdf-8f63-493d-b5a7-934b32d197e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.202320', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59c5d24c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '309037857f5827173b9cd85ca4ff641d55498d3bee52b2c262fc0d16259b860d'}]}, 'timestamp': '2026-02-20 09:27:18.209761', '_unique_id': '2cd11d8fad9d4915afa7978b53a4e450'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.212 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.213 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.213 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5c13110-d407-4d0b-b834-b6c4867e5a2e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.213393', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59c67a1c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': 'de7d6e230007d60f9b2f787d74afb105ac24a6773d2ff3af72e8756683cad9d5'}]}, 'timestamp': '2026-02-20 09:27:18.213960', '_unique_id': 'e45c3d8041cc4aabb33ec713e81f5647'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.215 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.216 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.235 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7aab82c-9de1-4041-953b-e9918ec3ed74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:27:18.216406', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '59c9cffa-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.474230558, 'message_signature': 'd3b916a30ab10980606c4abcfe898c8ccbb3666b99d2387ac496df7102c2b739'}]}, 'timestamp': '2026-02-20 09:27:18.235757', '_unique_id': 'b618959fdf664225af5321afdfc07cf6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.236 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.237 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.237 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac223414-02e4-44cd-a922-275fecc1c479', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.237495', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59ca2310-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '331ec437af9a2897a73d8c9e796c2c6e5792f5b2140128834f2ca5a7178b7aa7'}]}, 'timestamp': '2026-02-20 09:27:18.237828', '_unique_id': '11c733e834a74cf8a2c028c4b7c75637'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.238 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.239 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.239 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.239 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0d650d67-1f3f-4ff8-b805-328026ad48c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.239288', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59ca67d0-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '8d6f0e38bc343f6eff198b47dc761ae0c2180bd56d1d5e8ffb4c517265c9b3ea'}]}, 'timestamp': '2026-02-20 09:27:18.239581', '_unique_id': 'de33689b16a64982a5fda0afad9b3e78'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.254 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f023142-829d-4906-95e3-a554b108049d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.240947', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59ccc35e-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': '9011ea0603f0977ad8339c522d19ebed4561b58a306bdea0dda5eca373f40d15'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.240947', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59ccd042-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': 'b0c578147ff09aa3afb3f6e2e3f4dea2d4ae021177ff93a220832feacb19f53a'}]}, 'timestamp': '2026-02-20 09:27:18.255382', '_unique_id': 'c49c119b5c4f4dee94a5a7d9d8f71ac9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.256 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.257 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29109ce3-f454-445c-be59-52632c2af9f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.257155', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59cd229a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': 'ac840176ece93bf0f6a087d6f47e858ace16d0732475c0f839776579fd521ba4'}]}, 'timestamp': '2026-02-20 09:27:18.257494', '_unique_id': '0191877b273c4805b82171244960d0e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3fea2984-91bd-4024-92f4-bf449d63c4ff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.258955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d29f36-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '5fc7641cd88c7640d136651ad577186ffb1b6868db81f9dade182e849ad692bb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.258955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d2b1ce-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '23984be8de04affb061c35133ef231f7c108317b3567bbc966839e085e442af0'}]}, 'timestamp': '2026-02-20 09:27:18.293966', '_unique_id': '96ccd1a65ef04f84a3d2888a612d83b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.295 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.296 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b237f7a-9fa9-44a6-ad5e-0fee599749f0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.296677', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d32d20-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '19cf2324b2d148f376f6f35e5ea360c6e465c19e111a6209f70f685fa86883be'}]}, 'timestamp': '2026-02-20 09:27:18.297123', '_unique_id': 'f040b331336049d8a4b99a69ded2f5a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c44ea349-9eac-4b3e-acbf-c96abccb2bb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.299124', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d38c02-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '3f5f985857978e1aab639e0e5e7518208e2d2cf57546d94f54040a082cd52e98'}]}, 'timestamp': '2026-02-20 09:27:18.299562', '_unique_id': '19ff01a33daa487a94c601eb17e0ab0a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82581654-2d79-4a51-a62e-67469877044c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.301520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d3ea62-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '1cc3a57ad581ea60a6d6514ecb0359c40428101ee78253a5549c1e8de92974db'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.301520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d3f980-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '6a2e5003efeb6780417f4610337113a33315d353ede1ad61fd501345baf77303'}]}, 'timestamp': '2026-02-20 09:27:18.302349', '_unique_id': '0f4f409ec65d404d82c93d5d5c7eb0de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 58020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71808694-8773-4e27-b25d-4ca92bf38f49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 58020000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:27:18.304413', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '59d45baa-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.474230558, 'message_signature': '64ea9c09b6c2d30613aa86af26ab7172c69d9bc7f5b450d2d2362f6e427b6a1e'}]}, 'timestamp': '2026-02-20 09:27:18.304871', '_unique_id': '721df51cf25d41f3801ed1f67ab80148'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b7e79424-c2c3-4dc2-af0d-dcd31aaf2ef0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.306808', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d4b8f2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '33fad4e97a17e9508a37074846fe6f799866a2c9200a6a34222da47726806654'}]}, 'timestamp': '2026-02-20 09:27:18.307265', '_unique_id': 'c0a76d4c3d654be69e3fb04003b7480b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b7297f2-63bf-4e14-a202-f9885054cb0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.309209', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d515cc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '9459c5d8a1e1c7598e099c097ebcf8d349308d20932761543b285f0833298244'}]}, 'timestamp': '2026-02-20 09:27:18.309662', '_unique_id': '0f2aa3b126894347b9fbc30009fff9ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5c97103-f308-4be3-a2fd-3e29615e5767', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:27:18.311744', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '59d578d2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.441540766, 'message_signature': '55f0de0dde6251047b7b54fda4107edb994eab13e911dcb7b2b1a34a17524db2'}]}, 'timestamp': '2026-02-20 09:27:18.312192', '_unique_id': '2f5319f7e5ce47beb92054d7e83e0fca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c734bf9a-fd7b-4e6f-a8c1-3566297b229a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.314126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d5d5d4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': '081d8a3c21c90c39c679ff0460798fe8b9bdd823706886a0762137754b6157df'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.314126', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d5e696-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': 'b90b4fb3d5ca77b2600caae8d9f3c6023ae6e5173c998f6e41a5a69dfad17846'}]}, 'timestamp': '2026-02-20 09:27:18.314987', '_unique_id': '0804290f14f34abb8ea14a1053486795'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '158e9dc0-983f-4702-9cc6-f4db2c78b937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.316970', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d644ce-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '47f5a2c1967bd92f4eebb42a667a696c9d26431f01c583c099745cc90f8b53f9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.316970', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d6534c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '87653bc174a9c981daf0ed3a3bdcb7f98a0c3dd3a269196be0c4d302625c9db1'}]}, 'timestamp': '2026-02-20 09:27:18.317755', '_unique_id': '91612bf374fb45219191819d51412194'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c4d87e5-38db-40e5-8d86-117e50d0a624', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.319656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d6ae0a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '9f0311c74d1e85c1a7188c7b210715b317023d0c260fc6dca34259646ee99a27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.319656', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d6bca6-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '89e0d38d0e94c9ac7e50bbebe4bb88c660e2f13ef970ed96610da15e39e46a8c'}]}, 'timestamp': '2026-02-20 09:27:18.320428', '_unique_id': 'c7c9c63c4dda40d1b542fc3fd1df1323'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2e5482e-b343-48c6-82e3-b7e38019c4c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.322216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d71098-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': 'ea806c8ef6dfbec3dcc1ea81f26a8db2a3c28c141174b0bcab9be8192ac6957c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.322216', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d71ee4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.480126038, 'message_signature': '7165b4860b750db63c83ce20dccb18ad47d345380b58dfbe0bdb5e037e9cc0c5'}]}, 'timestamp': '2026-02-20 09:27:18.322959', '_unique_id': '958a4da6c3444da48ccbd9a04fe94c3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.324 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f722bcaf-3832-4652-a448-8c58c74e7002', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.324755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d773f8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '44b546b6457768c935532a91665af51c6da4386910322a9783bbf68954ef00fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.324755', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d780dc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '0ee06d2b8c3a9e75d15c3b2847be7563474589db7e19c7706c7faa2cb383e71a'}]}, 'timestamp': '2026-02-20 09:27:18.325441', '_unique_id': '52686bf7685741659194284ee0c5ef40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcc4e981-52c8-4dca-bf82-44e1fb3914df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:27:18.327441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '59d7db9a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '8d65c4baf20bea04dead104259f8d92f025874f0fd0336a72b7075bfdeb7abfb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:27:18.327441', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '59d7e752-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10197.498150361, 'message_signature': '6f3744c0b84d0a0894500e5d814ce36f6413cacc38ea25f6fac1fbf790fdc18b'}]}, 'timestamp': '2026-02-20 09:27:18.328031', '_unique_id': 'cd3f68b954a049aaa780bb8aff26bf4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:27:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:27:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.642 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.648 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.666 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.669 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:27:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.669 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:27:18 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:27:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:18.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:19.002 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:19.002 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:19.002 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:19.042 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:19.043 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:19 np0005625204.localdomain podman[245083]: 2026-02-20 09:27:19.054273062 +0000 UTC m=+0.988139343 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:27:19 np0005625204.localdomain podman[245111]: 2026-02-20 09:27:19.081117144 +0000 UTC m=+0.896149734 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:27:19 np0005625204.localdomain podman[245111]: 2026-02-20 09:27:19.126448901 +0000 UTC m=+0.941481451 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 20 09:27:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:27:19 np0005625204.localdomain podman[245083]: 2026-02-20 09:27:19.141492092 +0000 UTC m=+1.075358363 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:27:19 np0005625204.localdomain podman[245144]: 2026-02-20 09:27:19.214561395 +0000 UTC m=+0.253747143 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 20 09:27:19 np0005625204.localdomain podman[245144]: 2026-02-20 09:27:19.243502964 +0000 UTC m=+0.282688722 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:27:19 np0005625204.localdomain podman[245144]: unhealthy
Feb 20 09:27:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:27:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57963 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999C1E90000000001030307) 
Feb 20 09:27:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:27:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:24.043 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:24.046 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:24.046 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:24.046 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:24.084 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:24.085 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 20 09:27:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 20 09:27:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57964 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999D1A80000000001030307) 
Feb 20 09:27:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59108 DF PROTO=TCP SPT=51124 DPT=9105 SEQ=692478070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999DC690000000001030307) 
Feb 20 09:27:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 20 09:27:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-6a9b5811d370cf611c5d7f7587dd7d8e1e05fe7557daab610e6d30271092c47d-merged.mount: Deactivated successfully.
Feb 20 09:27:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:29.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:29.088 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:29.088 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:29.089 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:29.121 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:29.122 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59109 DF PROTO=TCP SPT=51124 DPT=9105 SEQ=692478070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999E4680000000001030307) 
Feb 20 09:27:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:27:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:27:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:27:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:27:32 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:27:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57965 DF PROTO=TCP SPT=50804 DPT=9882 SEQ=3653224083 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999F1680000000001030307) 
Feb 20 09:27:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:34.123 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:34.125 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:34.125 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:34.125 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:34.170 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:34.171 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11280 DF PROTO=TCP SPT=43356 DPT=9101 SEQ=3933118052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A5999FE740000000001030307) 
Feb 20 09:27:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:27:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ef9cc1375c4a3e979779fde9a22d44caa1f8d54d9be8e432ea85c98c54294ad4-merged.mount: Deactivated successfully.
Feb 20 09:27:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ef9cc1375c4a3e979779fde9a22d44caa1f8d54d9be8e432ea85c98c54294ad4-merged.mount: Deactivated successfully.
Feb 20 09:27:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2-merged.mount: Deactivated successfully.
Feb 20 09:27:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:27:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:27:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully.
Feb 20 09:27:37 np0005625204.localdomain podman[245180]: 2026-02-20 09:27:37.955782015 +0000 UTC m=+0.071556973 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:27:37 np0005625204.localdomain podman[245180]: 2026-02-20 09:27:37.99008738 +0000 UTC m=+0.105862318 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:27:37 np0005625204.localdomain podman[245180]: unhealthy
Feb 20 09:27:38 np0005625204.localdomain systemd[1]: tmp-crun.sB1Gf2.mount: Deactivated successfully.
Feb 20 09:27:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully.
Feb 20 09:27:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:27:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:27:38 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:27:38 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:27:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:39.172 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:39.174 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:39.174 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:39.175 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:39.207 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:39.209 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 20 09:27:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11282 DF PROTO=TCP SPT=43356 DPT=9101 SEQ=3933118052 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A0A680000000001030307) 
Feb 20 09:27:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:27:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:27:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:41 np0005625204.localdomain sshd[245203]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:27:42 np0005625204.localdomain sshd[245203]: Received disconnect from 54.36.99.29 port 52724:11: Bye Bye [preauth]
Feb 20 09:27:42 np0005625204.localdomain sshd[245203]: Disconnected from authenticating user root 54.36.99.29 port 52724 [preauth]
Feb 20 09:27:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:27:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:27:42 np0005625204.localdomain podman[245205]: 2026-02-20 09:27:42.133814023 +0000 UTC m=+0.078461831 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:27:42 np0005625204.localdomain podman[245205]: 2026-02-20 09:27:42.144981988 +0000 UTC m=+0.089629776 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:27:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59111 DF PROTO=TCP SPT=51124 DPT=9105 SEQ=692478070 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A15680000000001030307) 
Feb 20 09:27:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-17d67d7c6c3046ba2041c4048263641e426665d92e1e8fa18e3c871ca9222f66-merged.mount: Deactivated successfully.
Feb 20 09:27:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:44.209 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:44.211 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:44.211 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:44.211 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:44.212 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:44.212 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:27:45 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:27:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8512 DF PROTO=TCP SPT=48940 DPT=9100 SEQ=4107317744 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A25680000000001030307) 
Feb 20 09:27:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38848 DF PROTO=TCP SPT=46946 DPT=9882 SEQ=1065721462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A2AFA0000000001030307) 
Feb 20 09:27:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:27:49 np0005625204.localdomain podman[245228]: 2026-02-20 09:27:49.151612059 +0000 UTC m=+0.091896458 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, version=9.7, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:27:49 np0005625204.localdomain podman[245228]: 2026-02-20 09:27:49.166970579 +0000 UTC m=+0.107254988 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter)
Feb 20 09:27:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:49.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:49.245 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:49.246 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5033 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:49.246 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:49.251 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:49.252 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:49 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:27:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:50 np0005625204.localdomain sshd[245248]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:27:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:27:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:27:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:27:50 np0005625204.localdomain systemd[1]: tmp-crun.qTS2Zk.mount: Deactivated successfully.
Feb 20 09:27:50 np0005625204.localdomain podman[245252]: 2026-02-20 09:27:50.658040451 +0000 UTC m=+0.086448046 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:27:50 np0005625204.localdomain podman[245252]: 2026-02-20 09:27:50.701990981 +0000 UTC m=+0.130398576 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:27:50 np0005625204.localdomain podman[245251]: 2026-02-20 09:27:50.711289501 +0000 UTC m=+0.142622207 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:27:50 np0005625204.localdomain podman[245251]: 2026-02-20 09:27:50.744088443 +0000 UTC m=+0.175421119 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute)
Feb 20 09:27:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38850 DF PROTO=TCP SPT=46946 DPT=9882 SEQ=1065721462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A36E80000000001030307) 
Feb 20 09:27:50 np0005625204.localdomain podman[245251]: unhealthy
Feb 20 09:27:51 np0005625204.localdomain sshd[245248]: Invalid user oracle from 188.166.218.64 port 32788
Feb 20 09:27:51 np0005625204.localdomain sshd[245248]: Received disconnect from 188.166.218.64 port 32788:11: Bye Bye [preauth]
Feb 20 09:27:51 np0005625204.localdomain sshd[245248]: Disconnected from invalid user oracle 188.166.218.64 port 32788 [preauth]
Feb 20 09:27:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:27:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e6d071d08fea63259fe30a26bb9b27228bc0b7a6111c0f215f4e35846a4b7e3-merged.mount: Deactivated successfully.
Feb 20 09:27:52 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:27:52 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:27:52 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:27:52 np0005625204.localdomain podman[245250]: 2026-02-20 09:27:52.500281312 +0000 UTC m=+1.919710217 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:27:52 np0005625204.localdomain podman[245250]: 2026-02-20 09:27:52.584181107 +0000 UTC m=+2.003610002 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 09:27:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:27:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 20 09:27:53 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:27:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.253 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.255 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.278 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.278 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:27:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:54.281 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:27:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38851 DF PROTO=TCP SPT=46946 DPT=9882 SEQ=1065721462 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A46A90000000001030307) 
Feb 20 09:27:55 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4bb1f8a81ebf31c6df88a84cd13b1c78ab0b7c78b4f247f0212f5208091a25c0-merged.mount: Deactivated successfully.
Feb 20 09:27:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:27:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63867 DF PROTO=TCP SPT=39066 DPT=9105 SEQ=239460387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A51A80000000001030307) 
Feb 20 09:27:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:27:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:59.280 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:27:59.284 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:27:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:27:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:27:59 np0005625204.localdomain sudo[245308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:27:59 np0005625204.localdomain sudo[245308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:27:59 np0005625204.localdomain sudo[245308]: pam_unix(sudo:session): session closed for user root
Feb 20 09:27:59 np0005625204.localdomain sudo[245326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:27:59 np0005625204.localdomain sudo[245326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:27:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63868 DF PROTO=TCP SPT=39066 DPT=9105 SEQ=239460387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A59A80000000001030307) 
Feb 20 09:27:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:01 np0005625204.localdomain sudo[245326]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:01 np0005625204.localdomain sudo[245376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:28:01 np0005625204.localdomain sudo[245376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:28:01 np0005625204.localdomain sudo[245376]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=171 DF PROTO=TCP SPT=50692 DPT=9102 SEQ=2315604798 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A65690000000001030307) 
Feb 20 09:28:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:28:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865-merged.mount: Deactivated successfully.
Feb 20 09:28:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:04.285 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e94527f44cf462204e4693ca956cece239562477adb3a43148eff33840dc865-merged.mount: Deactivated successfully.
Feb 20 09:28:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:28:05.989 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:28:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:28:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:28:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:28:05.995 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:28:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18399 DF PROTO=TCP SPT=56546 DPT=9101 SEQ=4089080019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A73A50000000001030307) 
Feb 20 09:28:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 20 09:28:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 20 09:28:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 20 09:28:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:28:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 20 09:28:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:28:09 np0005625204.localdomain podman[245394]: 2026-02-20 09:28:09.131053629 +0000 UTC m=+0.083735521 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:28:09 np0005625204.localdomain podman[245394]: 2026-02-20 09:28:09.158470234 +0000 UTC m=+0.111152166 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:28:09 np0005625204.localdomain podman[245394]: unhealthy
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:28:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:09.290 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4993-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:09.292 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:09.292 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:09.292 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:09.313 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:09.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:28:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18401 DF PROTO=TCP SPT=56546 DPT=9101 SEQ=4089080019 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A7FA90000000001030307) 
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:28:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:28:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63870 DF PROTO=TCP SPT=39066 DPT=9105 SEQ=239460387 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A89690000000001030307) 
Feb 20 09:28:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 20 09:28:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf-merged.mount: Deactivated successfully.
Feb 20 09:28:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-7ff7ba63966e943893a40b6c376bf7e1a08ba347363406c436be92326b7436bf-merged.mount: Deactivated successfully.
Feb 20 09:28:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:14.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:14.317 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:14.317 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:14.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:14.346 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:14.347 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:28:14 np0005625204.localdomain sshd[230896]: Received disconnect from 192.168.122.30 port 43136:11: disconnected by user
Feb 20 09:28:14 np0005625204.localdomain sshd[230896]: Disconnected from user zuul 192.168.122.30 port 43136
Feb 20 09:28:14 np0005625204.localdomain sshd[230893]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:28:14 np0005625204.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Feb 20 09:28:14 np0005625204.localdomain systemd[1]: session-56.scope: Consumed 1min 6.945s CPU time.
Feb 20 09:28:14 np0005625204.localdomain systemd-logind[759]: Session 56 logged out. Waiting for processes to exit.
Feb 20 09:28:14 np0005625204.localdomain systemd-logind[759]: Removed session 56.
Feb 20 09:28:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:28:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61176 DF PROTO=TCP SPT=57410 DPT=9100 SEQ=2428030968 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599A99680000000001030307) 
Feb 20 09:28:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:28:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:16 np0005625204.localdomain systemd[1]: tmp-crun.RPsEsU.mount: Deactivated successfully.
Feb 20 09:28:16 np0005625204.localdomain podman[245417]: 2026-02-20 09:28:16.176712471 +0000 UTC m=+0.114708507 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:28:16 np0005625204.localdomain podman[245417]: 2026-02-20 09:28:16.208675558 +0000 UTC m=+0.146671584 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:28:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:16 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:28:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38508 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AA02B0000000001030307) 
Feb 20 09:28:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:18.610 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:18.610 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:18.636 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:18.637 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:28:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:18.637 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:28:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.185 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.185 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.185 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.186 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.348 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.350 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.351 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.351 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.388 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.389 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:28:19 np0005625204.localdomain systemd[1]: tmp-crun.7NXbzu.mount: Deactivated successfully.
Feb 20 09:28:19 np0005625204.localdomain podman[245440]: 2026-02-20 09:28:19.636210122 +0000 UTC m=+0.100230375 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public)
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.646 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:28:19 np0005625204.localdomain podman[245440]: 2026-02-20 09:28:19.652997336 +0000 UTC m=+0.117017619 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, architecture=x86_64, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.664 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.665 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.665 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.665 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.666 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.666 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.667 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.667 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.667 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.668 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.683 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.684 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.684 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.684 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:28:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:19.685 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:28:20 np0005625204.localdomain sshd[245479]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.182 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.497s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.257 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.258 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:28:20 np0005625204.localdomain sshd[245479]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 09:28:20 np0005625204.localdomain sshd[245479]: Connection closed by 115.191.29.215 port 11650
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.467 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.468 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12499MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.469 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.469 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.568 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.569 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.569 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:28:20 np0005625204.localdomain sshd[245482]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:28:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:20.617 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:28:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38510 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AAC280000000001030307) 
Feb 20 09:28:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:28:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:21.122 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:28:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:21.134 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:28:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de-merged.mount: Deactivated successfully.
Feb 20 09:28:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:21.152 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:28:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:21.154 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:28:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:21.155 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:28:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d0efa0f5db57c39c7bb160a49b5780c03ae06dca3a570fc6900a29b607ec05de-merged.mount: Deactivated successfully.
Feb 20 09:28:21 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:28:22 np0005625204.localdomain sshd[245482]: Connection closed by authenticating user root 115.191.29.215 port 11652 [preauth]
Feb 20 09:28:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully.
Feb 20 09:28:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:28:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:28:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully.
Feb 20 09:28:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully.
Feb 20 09:28:22 np0005625204.localdomain podman[245507]: 2026-02-20 09:28:22.690008388 +0000 UTC m=+0.134603376 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 20 09:28:22 np0005625204.localdomain podman[245507]: 2026-02-20 09:28:22.698893805 +0000 UTC m=+0.143488813 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 20 09:28:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:28:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:28:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully.
Feb 20 09:28:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully.
Feb 20 09:28:23 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:28:23 np0005625204.localdomain podman[245535]: 2026-02-20 09:28:23.988682849 +0000 UTC m=+0.203935848 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:28:24 np0005625204.localdomain podman[245535]: 2026-02-20 09:28:24.072096409 +0000 UTC m=+0.287349448 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:28:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:24.385 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:24.391 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:28:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:28:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38511 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599ABBE80000000001030307) 
Feb 20 09:28:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:28:24 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:28:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 20 09:28:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:28:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:28:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully.
Feb 20 09:28:26 np0005625204.localdomain podman[245506]: 2026-02-20 09:28:26.854867097 +0000 UTC m=+4.301584392 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 20 09:28:26 np0005625204.localdomain podman[245506]: 2026-02-20 09:28:26.88801855 +0000 UTC m=+4.334735835 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:28:26 np0005625204.localdomain podman[245506]: unhealthy
Feb 20 09:28:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 09:28:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:28:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51025 DF PROTO=TCP SPT=58706 DPT=9105 SEQ=1261460501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AC6E80000000001030307) 
Feb 20 09:28:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:28:27 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:28:27 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:28:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 09:28:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:28:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:29.392 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51026 DF PROTO=TCP SPT=58706 DPT=9105 SEQ=1261460501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599ACEE90000000001030307) 
Feb 20 09:28:30 np0005625204.localdomain sshd[245566]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:28:30 np0005625204.localdomain sshd[245566]: Invalid user x from 18.221.252.160 port 58128
Feb 20 09:28:30 np0005625204.localdomain sshd[245566]: Received disconnect from 18.221.252.160 port 58128:11: Bye Bye [preauth]
Feb 20 09:28:30 np0005625204.localdomain sshd[245566]: Disconnected from invalid user x 18.221.252.160 port 58128 [preauth]
Feb 20 09:28:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38512 DF PROTO=TCP SPT=49728 DPT=9882 SEQ=1173638961 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599ADB690000000001030307) 
Feb 20 09:28:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.393 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.419 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.419 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5026 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.420 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.422 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.422 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:34.425 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:35 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55868 DF PROTO=TCP SPT=42486 DPT=9101 SEQ=1402282965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AE8D50000000001030307) 
Feb 20 09:28:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bcdbf69658b435b3643ef361fbfcbd57ebf5cb53d4f9a18cec2f56d5690ff17c-merged.mount: Deactivated successfully.
Feb 20 09:28:39 np0005625204.localdomain sshd[245568]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:28:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:28:39 np0005625204.localdomain sshd[245568]: Accepted publickey for zuul from 192.168.122.30 port 53136 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:28:39 np0005625204.localdomain systemd-logind[759]: New session 57 of user zuul.
Feb 20 09:28:39 np0005625204.localdomain systemd[1]: Started Session 57 of User zuul.
Feb 20 09:28:39 np0005625204.localdomain sshd[245568]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:28:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55870 DF PROTO=TCP SPT=42486 DPT=9101 SEQ=1402282965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AF4E80000000001030307) 
Feb 20 09:28:39 np0005625204.localdomain podman[245570]: 2026-02-20 09:28:39.413376613 +0000 UTC m=+0.110410973 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:28:39 np0005625204.localdomain podman[245570]: 2026-02-20 09:28:39.418368148 +0000 UTC m=+0.115402478 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:28:39 np0005625204.localdomain podman[245570]: unhealthy
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.446 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.447 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.448 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5022 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.448 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.449 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.449 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:39.451 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:39 np0005625204.localdomain sudo[245685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frkvsuspicwqdqpuciqgvuvmfucljmrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579719.4137876-3484-271393712688236/AnsiballZ_podman_container_info.py
Feb 20 09:28:39 np0005625204.localdomain sudo[245685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:39 np0005625204.localdomain python3.9[245687]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 20 09:28:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:40 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:28:40 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:28:42 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51028 DF PROTO=TCP SPT=58706 DPT=9105 SEQ=1261460501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599AFF680000000001030307) 
Feb 20 09:28:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:43 np0005625204.localdomain sudo[245685]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:44 np0005625204.localdomain sudo[245808]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujiynlqnrrsktoibrwqggchizjqnhjlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579724.1623728-3495-252264590575862/AnsiballZ_podman_container_exec.py
Feb 20 09:28:44 np0005625204.localdomain sudo[245808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.491 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.491 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5039 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.491 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:44.500 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:44 np0005625204.localdomain python3.9[245810]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:28:44 np0005625204.localdomain systemd[1]: Started libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope.
Feb 20 09:28:44 np0005625204.localdomain podman[245811]: 2026-02-20 09:28:44.771499653 +0000 UTC m=+0.113648303 container exec 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 20 09:28:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:44 np0005625204.localdomain podman[245811]: 2026-02-20 09:28:44.801660904 +0000 UTC m=+0.143809584 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:28:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:44 np0005625204.localdomain sudo[245808]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:45 np0005625204.localdomain sudo[245948]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwrcdvezbucbeocdradwagszbjhmnvtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579725.1191318-3503-98488647621984/AnsiballZ_podman_container_exec.py
Feb 20 09:28:45 np0005625204.localdomain sudo[245948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:45 np0005625204.localdomain python3.9[245950]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:28:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13461 DF PROTO=TCP SPT=51214 DPT=9102 SEQ=3193409718 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B0F680000000001030307) 
Feb 20 09:28:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:28:47 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40a74de9e16f39ddd50a68ccb753b2764268a068f562b46f9bbfdae63acb7788-merged.mount: Deactivated successfully.
Feb 20 09:28:47 np0005625204.localdomain systemd[1]: libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope: Deactivated successfully.
Feb 20 09:28:47 np0005625204.localdomain systemd[1]: Started libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope.
Feb 20 09:28:47 np0005625204.localdomain podman[245951]: 2026-02-20 09:28:47.297734425 +0000 UTC m=+1.632760993 container exec 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 20 09:28:47 np0005625204.localdomain podman[245951]: 2026-02-20 09:28:47.332944633 +0000 UTC m=+1.667971241 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:28:47 np0005625204.localdomain podman[245962]: 2026-02-20 09:28:47.346756354 +0000 UTC m=+0.281685422 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:28:47 np0005625204.localdomain podman[245962]: 2026-02-20 09:28:47.390932771 +0000 UTC m=+0.325861789 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:28:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18703 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B155A0000000001030307) 
Feb 20 09:28:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 20 09:28:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 20 09:28:48 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:28:48 np0005625204.localdomain sudo[245948]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:48 np0005625204.localdomain sudo[246110]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaivvhdlkwsjxulkkfrmulnjeirndfeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579728.3448439-3511-226678975992857/AnsiballZ_file.py
Feb 20 09:28:48 np0005625204.localdomain sudo[246110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:48 np0005625204.localdomain python3.9[246112]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:28:48 np0005625204.localdomain sudo[246110]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:48 np0005625204.localdomain systemd[1]: libpod-conmon-67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.scope: Deactivated successfully.
Feb 20 09:28:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:49 np0005625204.localdomain sudo[246220]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yptrfewmlvqzmhkzsjdxpodeatnrbuno ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579729.0741193-3520-244679094697994/AnsiballZ_podman_container_info.py
Feb 20 09:28:49 np0005625204.localdomain sudo[246220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:49.501 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:49.503 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:49.504 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:49.504 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:49.534 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:49.534 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:49 np0005625204.localdomain python3.9[246222]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 20 09:28:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 20 09:28:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2f138d9d6c461962e8cf2ee8539c9294af2f13aab0c8b266d53219a78c733e21-merged.mount: Deactivated successfully.
Feb 20 09:28:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18705 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B21680000000001030307) 
Feb 20 09:28:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:28:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:28:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 20 09:28:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 20 09:28:51 np0005625204.localdomain sudo[246220]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:51 np0005625204.localdomain podman[246236]: 2026-02-20 09:28:51.756489704 +0000 UTC m=+0.160579206 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, container_name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:28:51 np0005625204.localdomain podman[246236]: 2026-02-20 09:28:51.799222396 +0000 UTC m=+0.203311918 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Feb 20 09:28:52 np0005625204.localdomain sudo[246362]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqjzobgrzwkhafutivpojqrgujvwyfrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579731.9147706-3528-198518297924282/AnsiballZ_podman_container_exec.py
Feb 20 09:28:52 np0005625204.localdomain sudo[246362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:52 np0005625204.localdomain python3.9[246364]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:28:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:28:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:28:53 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:28:53 np0005625204.localdomain systemd[1]: Started libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope.
Feb 20 09:28:53 np0005625204.localdomain podman[246365]: 2026-02-20 09:28:53.24434096 +0000 UTC m=+0.880484085 container exec ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:28:53 np0005625204.localdomain podman[246365]: 2026-02-20 09:28:53.249232903 +0000 UTC m=+0.885376068 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:28:54 np0005625204.localdomain sudo[246362]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:54 np0005625204.localdomain podman[246394]: 2026-02-20 09:28:54.468375883 +0000 UTC m=+0.405381637 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 20 09:28:54 np0005625204.localdomain podman[246394]: 2026-02-20 09:28:54.498723469 +0000 UTC m=+0.435729243 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:28:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:54.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:54.537 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:54.537 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:54.538 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:54.572 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:54.572 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18706 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B31280000000001030307) 
Feb 20 09:28:54 np0005625204.localdomain sudo[246519]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceyscromqvuyzsfyovfzqsipdnobjlvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579734.6146944-3536-139221716449729/AnsiballZ_podman_container_exec.py
Feb 20 09:28:54 np0005625204.localdomain sudo[246519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:28:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:55 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:28:55 np0005625204.localdomain systemd[1]: libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope: Deactivated successfully.
Feb 20 09:28:55 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:28:55 np0005625204.localdomain podman[246522]: 2026-02-20 09:28:55.125122293 +0000 UTC m=+0.121944032 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:28:55 np0005625204.localdomain python3.9[246521]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:28:55 np0005625204.localdomain podman[246522]: 2026-02-20 09:28:55.169102774 +0000 UTC m=+0.165924533 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:28:55 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:28:55 np0005625204.localdomain systemd[1]: Started libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope.
Feb 20 09:28:55 np0005625204.localdomain podman[246547]: 2026-02-20 09:28:55.524653769 +0000 UTC m=+0.353325836 container exec ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:28:55 np0005625204.localdomain podman[246547]: 2026-02-20 09:28:55.531051005 +0000 UTC m=+0.359723112 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 20 09:28:56 np0005625204.localdomain sshd[246576]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:28:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 20 09:28:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340-merged.mount: Deactivated successfully.
Feb 20 09:28:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d59145aa9c81750f9d2e26499ec90595af58708a19d0844b9fae7fcd52a3b340-merged.mount: Deactivated successfully.
Feb 20 09:28:56 np0005625204.localdomain sudo[246519]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:57 np0005625204.localdomain sudo[246685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orpnkwjmzsijizwrnmkphpwnihzarriw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579737.141827-3544-78939033337241/AnsiballZ_file.py
Feb 20 09:28:57 np0005625204.localdomain sudo[246685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:57 np0005625204.localdomain sshd[246576]: Invalid user ubuntu from 27.112.79.3 port 37894
Feb 20 09:28:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63337 DF PROTO=TCP SPT=39892 DPT=9105 SEQ=3109845429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B3C290000000001030307) 
Feb 20 09:28:57 np0005625204.localdomain python3.9[246687]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:28:57 np0005625204.localdomain sudo[246685]: pam_unix(sudo:session): session closed for user root
Feb 20 09:28:57 np0005625204.localdomain sshd[246576]: Received disconnect from 27.112.79.3 port 37894:11: Bye Bye [preauth]
Feb 20 09:28:57 np0005625204.localdomain sshd[246576]: Disconnected from invalid user ubuntu 27.112.79.3 port 37894 [preauth]
Feb 20 09:28:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:28:58 np0005625204.localdomain sudo[246806]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnmnnpyhetvnuzwqgiemyzhzoasrwyef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579737.9393904-3553-206968643027213/AnsiballZ_podman_container_info.py
Feb 20 09:28:58 np0005625204.localdomain sudo[246806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:28:58 np0005625204.localdomain python3.9[246808]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Feb 20 09:28:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:28:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:28:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:59.573 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:59.575 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:28:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:59.576 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:28:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:59.576 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:59.602 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:28:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:28:59.603 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:28:59 np0005625204.localdomain systemd[1]: libpod-conmon-ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.scope: Deactivated successfully.
Feb 20 09:28:59 np0005625204.localdomain podman[246751]: 2026-02-20 09:28:59.656919032 +0000 UTC m=+1.587977094 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:28:59 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63338 DF PROTO=TCP SPT=39892 DPT=9105 SEQ=3109845429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B44280000000001030307) 
Feb 20 09:29:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:01 np0005625204.localdomain podman[246751]: 2026-02-20 09:29:01.68113028 +0000 UTC m=+3.612188322 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 20 09:29:01 np0005625204.localdomain podman[246751]: unhealthy
Feb 20 09:29:02 np0005625204.localdomain sudo[246827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:29:02 np0005625204.localdomain sudo[246827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:29:02 np0005625204.localdomain sudo[246827]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:02 np0005625204.localdomain sudo[246845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:29:02 np0005625204.localdomain sudo[246845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:29:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:02 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:29:02 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Failed with result 'exit-code'.
Feb 20 09:29:02 np0005625204.localdomain sudo[246806]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18707 DF PROTO=TCP SPT=53274 DPT=9882 SEQ=3027371234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B51680000000001030307) 
Feb 20 09:29:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:03 np0005625204.localdomain sudo[246845]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:03 np0005625204.localdomain sudo[247003]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itbrttmkkiqsfsghrjyyjbtipledgukj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579742.9825635-3561-198971815216520/AnsiballZ_podman_container_exec.py
Feb 20 09:29:03 np0005625204.localdomain sudo[247003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:04 np0005625204.localdomain python3.9[247005]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:04 np0005625204.localdomain systemd[1]: Started libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope.
Feb 20 09:29:04 np0005625204.localdomain podman[247006]: 2026-02-20 09:29:04.30968721 +0000 UTC m=+0.104489924 container exec 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:29:04 np0005625204.localdomain podman[247006]: 2026-02-20 09:29:04.340148504 +0000 UTC m=+0.134951238 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 20 09:29:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:04.604 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:04.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:04.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:04.608 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:04.637 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:04.638 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:05 np0005625204.localdomain sudo[247035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:29:05 np0005625204.localdomain sudo[247035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:29:05 np0005625204.localdomain sudo[247035]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:29:05.990 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:29:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:29:05.991 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:29:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:29:05.993 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:29:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 20 09:29:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-5a6d255614f6fb8bbe458bab22374857122c06c78d4c0aacb8f6490a72d4cd61-merged.mount: Deactivated successfully.
Feb 20 09:29:06 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57224 DF PROTO=TCP SPT=43146 DPT=9101 SEQ=2809445260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B5E050000000001030307) 
Feb 20 09:29:06 np0005625204.localdomain sudo[247003]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:06 np0005625204.localdomain sudo[247160]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtmcglxhshbterqjaqihigvfzlfjdjyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579746.496214-3569-10103751596603/AnsiballZ_podman_container_exec.py
Feb 20 09:29:06 np0005625204.localdomain sudo[247160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:06 np0005625204.localdomain python3.9[247162]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:09 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:09 np0005625204.localdomain systemd[1]: libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope: Deactivated successfully.
Feb 20 09:29:09 np0005625204.localdomain systemd[1]: Started libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope.
Feb 20 09:29:09 np0005625204.localdomain podman[247163]: 2026-02-20 09:29:09.152865107 +0000 UTC m=+2.178629905 container exec 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:29:09 np0005625204.localdomain podman[247163]: 2026-02-20 09:29:09.18329048 +0000 UTC m=+2.209055338 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:29:09 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57226 DF PROTO=TCP SPT=43146 DPT=9101 SEQ=2809445260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B6A290000000001030307) 
Feb 20 09:29:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:09.638 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:09.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:09.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5042 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:09.680 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:09.687 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:09.688 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:29:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:11 np0005625204.localdomain sudo[247160]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:11 np0005625204.localdomain podman[247192]: 2026-02-20 09:29:11.299619711 +0000 UTC m=+0.234411300 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:29:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:11.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:11.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 09:29:11 np0005625204.localdomain podman[247192]: 2026-02-20 09:29:11.308958812 +0000 UTC m=+0.243750401 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:29:11 np0005625204.localdomain podman[247192]: unhealthy
Feb 20 09:29:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:11.329 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 09:29:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:11.331 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:11.331 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 09:29:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:11.350 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:11 np0005625204.localdomain sudo[247321]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvmpbrbajsftcuooipxkanybfomyyxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579751.4013093-3577-217033734029671/AnsiballZ_file.py
Feb 20 09:29:11 np0005625204.localdomain sudo[247321]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:11 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63340 DF PROTO=TCP SPT=39892 DPT=9105 SEQ=3109845429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B73690000000001030307) 
Feb 20 09:29:11 np0005625204.localdomain python3.9[247323]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:11 np0005625204.localdomain sudo[247321]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: libpod-conmon-8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.scope: Deactivated successfully.
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:29:12 np0005625204.localdomain sudo[247431]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvwqlqogqvpzwfmplgmaxmrwqxrzdnda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579752.0819995-3586-201594908472924/AnsiballZ_podman_container_info.py
Feb 20 09:29:12 np0005625204.localdomain sudo[247431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:12 np0005625204.localdomain python3.9[247433]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:13 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:13.383 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:13 np0005625204.localdomain sudo[247431]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:13 np0005625204.localdomain sudo[247555]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvvxhfpibitykoaxsnscrtajjgjlwcue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579753.6417975-3594-32560614600184/AnsiballZ_podman_container_exec.py
Feb 20 09:29:13 np0005625204.localdomain sudo[247555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:14 np0005625204.localdomain python3.9[247557]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:14 np0005625204.localdomain systemd[1]: Started libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope.
Feb 20 09:29:14 np0005625204.localdomain podman[247558]: 2026-02-20 09:29:14.21367895 +0000 UTC m=+0.105938782 container exec f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:29:14 np0005625204.localdomain podman[247558]: 2026-02-20 09:29:14.246115618 +0000 UTC m=+0.138375470 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.296 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.298 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.317 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.317 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.317 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.318 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.318 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.689 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.691 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.692 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.692 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.732 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.733 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:14.788 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.052 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.052 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.249 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.251 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12417MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.251 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.252 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.352 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.353 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.353 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.398 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.457 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.458 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.485 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.505 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:29:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:15.537 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:29:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54969 DF PROTO=TCP SPT=46126 DPT=9102 SEQ=3036961238 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B83680000000001030307) 
Feb 20 09:29:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c27e6babc49c3a4e136695682090684220031809a5e2e858a21f85d5b61fb17d-merged.mount: Deactivated successfully.
Feb 20 09:29:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:16.006 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:29:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:16.018 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:29:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:16.037 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:29:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:16.041 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:29:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:16.042 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.790s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:29:16 np0005625204.localdomain sudo[247555]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 09:29:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:29:16 np0005625204.localdomain sudo[247739]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmakyliysahkzsmjywwrsozmijognsbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579756.2455983-3602-133546909129880/AnsiballZ_podman_container_exec.py
Feb 20 09:29:16 np0005625204.localdomain sudo[247739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:17 np0005625204.localdomain python3.9[247741]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.043 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.044 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.044 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:29:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:29:17 np0005625204.localdomain systemd[1]: libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope: Deactivated successfully.
Feb 20 09:29:17 np0005625204.localdomain systemd[1]: Started libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope.
Feb 20 09:29:17 np0005625204.localdomain podman[247742]: 2026-02-20 09:29:17.199503547 +0000 UTC m=+0.175480608 container exec f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:29:17 np0005625204.localdomain podman[247742]: 2026-02-20 09:29:17.234232659 +0000 UTC m=+0.210209720 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.546 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:29:17 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61962 DF PROTO=TCP SPT=43588 DPT=9882 SEQ=1917522212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B8A8C0000000001030307) 
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.886 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.899 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.899 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.900 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.901 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.901 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:29:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:17.902 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.201 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.201 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.204 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b70dd158-b587-4ee7-aedd-f97edd4d0186', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.201866', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a14ba9fc-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'eef811acd47218083a7e194ce90146e60a436ab76be28c660b992467c4a802a0'}]}, 'timestamp': '2026-02-20 09:29:18.204930', '_unique_id': '40bb2491480549a1bf25c7204e88ff24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.205 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.206 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '674d8021-d3e3-4d20-b717-afdfbddce903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.206400', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a14bedd6-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'c42052d66c0e21176f18e8bdfbf07084c3607104f8d0ed07ef5d221ce219e6dd'}]}, 'timestamp': '2026-02-20 09:29:18.206622', '_unique_id': '242b0afdf47d487c93279d9078a47e43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.231 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.232 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8d30e0e-18de-4447-9880-e9cfd38d1652', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.207604', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a14fdc0c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '8855a4e030ad1beea095494c845561e15e1d96f80f34f4d7082fe96e59e40f27'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.207604', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a14fe5b2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'ef9e4f77b65d70161f7b02e606413ff98c419f233125ec616ebbd26b456a43d2'}]}, 'timestamp': '2026-02-20 09:29:18.232624', '_unique_id': '67053b086f2a4659a9b6848f9195119f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.233 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1ef7c7c-803c-4673-85cc-c93db4c487bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.234090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a1518dea-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': 'ab47198e338af69c443f9a3f7eb22c8c116884d3580d4829d7f68cc3342c72a9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.234090', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a1519600-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '6cd8ee387e76c928496c6568531450697a568e36d2e4c0a34077f68ca0ac9500'}]}, 'timestamp': '2026-02-20 09:29:18.243702', '_unique_id': '28df2373523344f6b1adfe30fcc8d99c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.244 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '615ef17a-d306-485d-a4ff-c12714c34758', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.244945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a151cf8a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '85f120fd046dc755bb35938d412273bb8b384fb439a7376b09b0f1b1dafd4123'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.244945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a151d746-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '6fc1e83de4112db3f490703e811af9bdda982c5fdfe746fac7a5053bd59bee11'}]}, 'timestamp': '2026-02-20 09:29:18.245349', '_unique_id': 'f96c720c4f8c4b0ab72f9c91f44e21bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.246 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da5306b-de6b-470d-b9b2-bdd25e240dbc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.246374', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1520752-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'a28af7f59cef164d752a584ebd6f911a3e9ad7da0f9df21af28e8e0f9ef184fd'}]}, 'timestamp': '2026-02-20 09:29:18.246593', '_unique_id': '60866eb962a34589818b911c531fa471'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.247 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf6ae61c-d580-4a33-bc60-a67c2dceb18d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.247543', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1523650-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '5ca9d47052d7d95316aad629cd2a3c376f37b36672a756e9aaa466f1729fef2e'}]}, 'timestamp': '2026-02-20 09:29:18.247797', '_unique_id': '39b2b0a092a84938b0b8f5147fccf8de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.248 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c855185-63fd-4ad0-a6c7-80659a2b4b4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.248778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a15264ea-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '4afe277bdebb90ac7c9f2f32be4366a3df35dc0d271fdeed09a17cedc2abd754'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.248778', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a1526c1a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'f9f59808ca26138d786844e73a2a985e72e04f90815523a18b0386858743e727'}]}, 'timestamp': '2026-02-20 09:29:18.249158', '_unique_id': 'f6dc3c70e7914b19a4a43ec31f0a51a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.249 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f951ad5c-8f33-42e7-837e-48cc474bacaf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.251072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a152be9a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '3233aeea1d611f4cf349c687aa414a7e8731227076f38c51094bbb6817550aa4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.251072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a152c61a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'fe290eae114bde1d46eb35cf47c622dfb9ab8c5a92a88f015803545fcea5b1d9'}]}, 'timestamp': '2026-02-20 09:29:18.251463', '_unique_id': '0bf6f7e4f5f64801893d8d1cbedc160b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9be8c296-d9a8-4d1c-b690-385467e6d8a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.252472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a152f536-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '9b7e1a6431022c0fe195139ba46627877232b723a42e124c238fd38ee62a25d3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.252472', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a152fd38-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '413c7fbd7a984336ce1376aa6cda2a657c30ee18eba7218d2e8c1f010c4e3b1d'}]}, 'timestamp': '2026-02-20 09:29:18.252874', '_unique_id': 'ff2e3887d6b24eb0a2bbe160306f6330'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.254 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1684e9f9-3910-44dd-86c5-37d62d7c8c0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.254705', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1534cac-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '2f0e4d04ad3ade9fc17264c0aefc934c403e63283da3ae6e7adfbdabe166d5fe'}]}, 'timestamp': '2026-02-20 09:29:18.254922', '_unique_id': 'c9ac08111c6c4d3ba21f59e74efa9cba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72081d2c-cda2-4bd1-9b51-b44fc7d0ed04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.255879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a1537a42-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '65222d989d5213a39a7b80b320a78a4c4dc36816753ab6231772758eaedbcbb9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.255879', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a153821c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'add27f31c7f120bec4a0e199292e0fe7644237d56a6064e39c8323c9b80f15af'}]}, 'timestamp': '2026-02-20 09:29:18.256275', '_unique_id': '90ff62f8f23e480c83be3c47e5531b9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.256 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4eaf06c-f0c5-45f8-a82d-f60d144a3652', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.257241', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a153af94-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '6dfef66d421a4006d909b765b0f94a6bba0c35d8b99a2e4b400e214c8755260a'}]}, 'timestamp': '2026-02-20 09:29:18.257453', '_unique_id': 'b6e989bf6a1941f4b990053670766a3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c9d7012-d1fb-415d-a0a0-a0a9983119fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.258405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a153dce4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': '4beb8830f1bd77919ad9e07c5c468b45128f0a858a327f8eefa7902756c52d9d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.258405', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a153e4be-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.446801008, 'message_signature': 'fe22c0b7ba3d8458427e09586ffd1b097659f01c043a41668fc05e874a80562d'}]}, 'timestamp': '2026-02-20 09:29:18.258799', '_unique_id': '19b39702213e4782bcf9fbfb59ea9581'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.259 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68ddd221-8431-4be9-a38a-959d0b097acd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.259756', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a15411be-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'bec66284c9054bd7c50b45e959974de50e9c4ef228f5309444601af5a7dcbe2c'}]}, 'timestamp': '2026-02-20 09:29:18.259965', '_unique_id': 'e9e8e1d189eb42219a657ef8a22b22c1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.260 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11a4730f-261a-4dff-aab5-e3091e1eabcc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.260943', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a1544026-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': 'd23d940dcdf4ffec28db47ed41f40dabc7fe870d3a3af500715c955a3b85cdb1'}]}, 'timestamp': '2026-02-20 09:29:18.261152', '_unique_id': '74f06d3c01394abe83699abaaaa6421f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 59050000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'faf94a01-bbf0-4ccd-87c2-927dcc0d1a2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 59050000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:29:18.262088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a156ef10-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.517559022, 'message_signature': 'abb1ce2d2d501ea1df316cea1df7259bbc7a69942221e354484134e991fa06a7'}]}, 'timestamp': '2026-02-20 09:29:18.278752', '_unique_id': 'e96de3ff7b9445ab82c209d8640f34d4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8a73df8-0963-499b-9227-320ee3ab0ea5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:29:18.279702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a1571cd8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': '75b999193a6df88a3e01c71d52b147f35f322eda619fdd93072f139175902a06'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:29:18.279702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a15723ea-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.473274002, 'message_signature': 'd825d57d3f112e4847d83af33c9a98bbc2afd5c430c0dcbce4577295c88adf45'}]}, 'timestamp': '2026-02-20 09:29:18.280091', '_unique_id': 'afb6d32d2ba245fc99f305c3a48e3ca7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83d1ef13-6899-4a0f-84d1-4a2ed95ae28d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:29:18.281052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a1575194-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.517559022, 'message_signature': 'fb7bf8fc523c1ba8c45551b98ede7d80677cc3edd7fcf81c72814b8df02dda17'}]}, 'timestamp': '2026-02-20 09:29:18.281253', '_unique_id': '27bc6bfdc6fc4723928de10410f46e13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.281 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'db7c1574-441e-4f68-9bcc-8d4388649fbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.282248', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a157804c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '4e10ed92719cb2163280ea8955b00e90f9faaf7ab96a204be125121ad41e74d3'}]}, 'timestamp': '2026-02-20 09:29:18.282455', '_unique_id': 'f16a6ebc6f434acdbc9a4463c003a6dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.283 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd46b0e91-30b9-458a-94e0-c58d06bed979', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:29:18.283395', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a157ad24-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10317.441040392, 'message_signature': '64eedd6f68a274c2a36b8b6ad5b5e8b287eacded519fd9d80b4951652cd329f2'}]}, 'timestamp': '2026-02-20 09:29:18.283605', '_unique_id': '4eb41026091a4cc4b09e20c31f367ea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:29:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:29:18.284 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:29:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:29:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:29:18 np0005625204.localdomain sudo[247739]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:18 np0005625204.localdomain sudo[247890]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tldpoofphqaljlwjgcyphtoqjtokcnxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579758.507184-3610-278075130290587/AnsiballZ_file.py
Feb 20 09:29:18 np0005625204.localdomain sudo[247890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:18 np0005625204.localdomain python3.9[247892]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:18 np0005625204.localdomain sudo[247890]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:19 np0005625204.localdomain sudo[248000]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duqufclfuebkufnejdtkdhbwcteomplb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579759.1367872-3619-154585220143618/AnsiballZ_podman_container_info.py
Feb 20 09:29:19 np0005625204.localdomain sudo[248000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:19 np0005625204.localdomain python3.9[248002]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 20 09:29:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:19.734 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:19.737 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:19.737 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:19.737 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:19.764 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:19.765 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:20 np0005625204.localdomain sshd[248015]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:29:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:20 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61964 DF PROTO=TCP SPT=43588 DPT=9882 SEQ=1917522212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599B96A90000000001030307) 
Feb 20 09:29:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:20 np0005625204.localdomain systemd[1]: libpod-conmon-f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.scope: Deactivated successfully.
Feb 20 09:29:20 np0005625204.localdomain podman[247772]: 2026-02-20 09:29:20.872846031 +0000 UTC m=+2.546478101 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:29:20 np0005625204.localdomain podman[247772]: 2026-02-20 09:29:20.932170178 +0000 UTC m=+2.605802278 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:29:21 np0005625204.localdomain sshd[248015]: Invalid user sol from 45.148.10.240 port 48662
Feb 20 09:29:21 np0005625204.localdomain sshd[248015]: Connection closed by invalid user sol 45.148.10.240 port 48662 [preauth]
Feb 20 09:29:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:22 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:29:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:29:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:24 np0005625204.localdomain sudo[248000]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:24 np0005625204.localdomain podman[248028]: 2026-02-20 09:29:24.029788033 +0000 UTC m=+0.256483102 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., version=9.7, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 20 09:29:24 np0005625204.localdomain podman[248028]: 2026-02-20 09:29:24.049127797 +0000 UTC m=+0.275822826 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, distribution-scope=public, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter)
Feb 20 09:29:24 np0005625204.localdomain sudo[248154]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeuhalmvcigfqvxvpyoffgrimoihmylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579764.1771626-3627-198228970509459/AnsiballZ_podman_container_exec.py
Feb 20 09:29:24 np0005625204.localdomain sudo[248154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:24 np0005625204.localdomain python3.9[248156]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:24 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:29:24 np0005625204.localdomain systemd[1]: Started libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope.
Feb 20 09:29:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:24.765 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:24 np0005625204.localdomain podman[248157]: 2026-02-20 09:29:24.770919613 +0000 UTC m=+0.136436477 container exec 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:29:24 np0005625204.localdomain podman[248157]: 2026-02-20 09:29:24.803148803 +0000 UTC m=+0.168665667 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:29:24 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61965 DF PROTO=TCP SPT=43588 DPT=9882 SEQ=1917522212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BA6680000000001030307) 
Feb 20 09:29:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:25 np0005625204.localdomain sudo[248154]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:29:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:29:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:26 np0005625204.localdomain sudo[248318]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlhndkbosivaqrupaixjzsfcpmkfnmqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579765.8013434-3635-215942108649781/AnsiballZ_podman_container_exec.py
Feb 20 09:29:26 np0005625204.localdomain sudo[248318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:26 np0005625204.localdomain python3.9[248320]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-a977e69aa1d921623e711e9fd358dcacc9436eba0b435de46bbf80b585921d95-merged.mount: Deactivated successfully.
Feb 20 09:29:27 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58311 DF PROTO=TCP SPT=48800 DPT=9105 SEQ=3761662783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BB1280000000001030307) 
Feb 20 09:29:27 np0005625204.localdomain systemd[1]: libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope: Deactivated successfully.
Feb 20 09:29:27 np0005625204.localdomain podman[248187]: 2026-02-20 09:29:27.74003001 +0000 UTC m=+2.184054450 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:29:27 np0005625204.localdomain podman[248188]: 2026-02-20 09:29:27.793076533 +0000 UTC m=+2.233197208 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:29:27 np0005625204.localdomain podman[248188]: 2026-02-20 09:29:27.801942839 +0000 UTC m=+2.242063554 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 20 09:29:27 np0005625204.localdomain systemd[1]: Started libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope.
Feb 20 09:29:27 np0005625204.localdomain podman[248187]: 2026-02-20 09:29:27.90820336 +0000 UTC m=+2.352227820 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:29:27 np0005625204.localdomain podman[248321]: 2026-02-20 09:29:27.917439458 +0000 UTC m=+1.649754259 container exec 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:29:27 np0005625204.localdomain podman[248321]: 2026-02-20 09:29:27.951207068 +0000 UTC m=+1.683521879 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:29:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 09:29:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:29:28 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:29:28 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:29:28 np0005625204.localdomain sudo[248318]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 09:29:28 np0005625204.localdomain systemd[1]: libpod-conmon-010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.scope: Deactivated successfully.
Feb 20 09:29:29 np0005625204.localdomain sudo[248478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwwakfxoneyrwkjhlqanwxuthgliehgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579768.9101603-3643-94615726630111/AnsiballZ_file.py
Feb 20 09:29:29 np0005625204.localdomain sudo[248478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:29 np0005625204.localdomain python3.9[248480]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:29 np0005625204.localdomain sudo[248478]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c81d46f937f1f84faf68fb71f862e4bd868921a7d16384d44790308d98719f-merged.mount: Deactivated successfully.
Feb 20 09:29:29 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58312 DF PROTO=TCP SPT=48800 DPT=9105 SEQ=3761662783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BB9280000000001030307) 
Feb 20 09:29:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:29.768 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:29.770 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:29.770 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:29.770 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:29.772 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:29 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:29.772 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:29 np0005625204.localdomain sudo[248588]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxiatbitumdtujkdniaxoxnbthvhtqfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579769.5970836-3652-246349961695984/AnsiballZ_podman_container_info.py
Feb 20 09:29:29 np0005625204.localdomain sudo[248588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8194fb55613f783d5a49e05926ed565eee5321a07b6e20f485946b5c4f31cab4-merged.mount: Deactivated successfully.
Feb 20 09:29:30 np0005625204.localdomain python3.9[248590]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 20 09:29:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:29:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 09:29:31 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 09:29:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26776 DF PROTO=TCP SPT=40870 DPT=9100 SEQ=2551385514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BC5680000000001030307) 
Feb 20 09:29:32 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:29:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:29:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 20 09:29:33 np0005625204.localdomain podman[248604]: 2026-02-20 09:29:33.181234995 +0000 UTC m=+0.184564900 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:29:33 np0005625204.localdomain sudo[248588]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:33 np0005625204.localdomain podman[248604]: 2026-02-20 09:29:33.196034273 +0000 UTC m=+0.199364238 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:29:33 np0005625204.localdomain sudo[248728]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxstudfndgcdqioyulpjuaqfpvujhbqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579773.3831291-3660-204909000846907/AnsiballZ_podman_container_exec.py
Feb 20 09:29:33 np0005625204.localdomain sudo[248728]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:33 np0005625204.localdomain python3.9[248730]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: Started libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope.
Feb 20 09:29:34 np0005625204.localdomain podman[248731]: 2026-02-20 09:29:34.400225984 +0000 UTC m=+0.540699629 container exec 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, name=ubi9/ubi-minimal, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 20 09:29:34 np0005625204.localdomain podman[248731]: 2026-02-20 09:29:34.435148012 +0000 UTC m=+0.575621617 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, version=9.7, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, distribution-scope=public, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 20 09:29:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:34.773 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:34.775 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:34.775 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:34.776 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:34.812 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:34 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:34.813 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:35 np0005625204.localdomain sudo[248728]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:35 np0005625204.localdomain systemd[1]: libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope: Deactivated successfully.
Feb 20 09:29:36 np0005625204.localdomain sudo[248867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxovytogwhyzerttxybignldglrrqxsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579775.1898458-3668-92659863435428/AnsiballZ_podman_container_exec.py
Feb 20 09:29:36 np0005625204.localdomain sudo[248867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:36 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10338 DF PROTO=TCP SPT=39658 DPT=9101 SEQ=3112116702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BD3350000000001030307) 
Feb 20 09:29:36 np0005625204.localdomain python3.9[248869]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 20 09:29:36 np0005625204.localdomain systemd[1]: Started libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope.
Feb 20 09:29:36 np0005625204.localdomain podman[248870]: 2026-02-20 09:29:36.479912003 +0000 UTC m=+0.106316603 container exec 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, config_id=openstack_network_exporter, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible)
Feb 20 09:29:36 np0005625204.localdomain podman[248870]: 2026-02-20 09:29:36.508951061 +0000 UTC m=+0.135355621 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-type=git, maintainer=Red Hat, Inc.)
Feb 20 09:29:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 20 09:29:36 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-b52b75a7380249fd6beb40dca6e23a5c2c2b3650de6523e005db6f52b5fe90d0-merged.mount: Deactivated successfully.
Feb 20 09:29:37 np0005625204.localdomain sudo[248867]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:37 np0005625204.localdomain sudo[249006]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhvfrasmuequaefoieuzyocdpvlmdocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579777.2110631-3676-281432697708379/AnsiballZ_file.py
Feb 20 09:29:37 np0005625204.localdomain sudo[249006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:37 np0005625204.localdomain python3.9[249008]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:37 np0005625204.localdomain sudo[249006]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:38 np0005625204.localdomain sudo[249116]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trqyjqkjvvjygyahtyofetpdtwoaxuvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579778.5088367-3690-256748300703035/AnsiballZ_file.py
Feb 20 09:29:38 np0005625204.localdomain sudo[249116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:38 np0005625204.localdomain python3.9[249118]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:38 np0005625204.localdomain sudo[249116]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:39 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10340 DF PROTO=TCP SPT=39658 DPT=9101 SEQ=3112116702 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BDF280000000001030307) 
Feb 20 09:29:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:39 np0005625204.localdomain systemd[1]: libpod-conmon-7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.scope: Deactivated successfully.
Feb 20 09:29:39 np0005625204.localdomain sudo[249226]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvdlpepmwbwcgglqnzphazhkayfhmbnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579779.3375716-3717-12228653072524/AnsiballZ_stat.py
Feb 20 09:29:39 np0005625204.localdomain sudo[249226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:39 np0005625204.localdomain python3.9[249228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:39.814 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:39.817 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:39.817 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:39.817 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:39 np0005625204.localdomain sudo[249226]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:39.856 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:39 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:39.856 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:40 np0005625204.localdomain sudo[249314]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjloycqdzyqrnouwbveriapjdfivmfpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579779.3375716-3717-12228653072524/AnsiballZ_copy.py
Feb 20 09:29:40 np0005625204.localdomain sudo[249314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:40 np0005625204.localdomain python3.9[249316]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579779.3375716-3717-12228653072524/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:40 np0005625204.localdomain sudo[249314]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:41 np0005625204.localdomain sudo[249424]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqksytyjkeeiotcrzeinakdprbibcrjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579780.7793317-3765-169534852782759/AnsiballZ_file.py
Feb 20 09:29:41 np0005625204.localdomain sudo[249424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:41 np0005625204.localdomain python3.9[249426]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:41 np0005625204.localdomain sudo[249424]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:41 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:41 np0005625204.localdomain sudo[249534]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyhzzlvtnmybqxxwxqfmcyderyodszrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579781.481293-3789-191747856322010/AnsiballZ_stat.py
Feb 20 09:29:41 np0005625204.localdomain sudo[249534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:41 np0005625204.localdomain python3.9[249536]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:41 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58314 DF PROTO=TCP SPT=48800 DPT=9105 SEQ=3761662783 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BE9680000000001030307) 
Feb 20 09:29:41 np0005625204.localdomain sudo[249534]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:42 np0005625204.localdomain sudo[249591]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omxbstjnbqnotqjbtjwrhjrzdjwxngbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579781.481293-3789-191747856322010/AnsiballZ_file.py
Feb 20 09:29:42 np0005625204.localdomain sudo[249591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:42 np0005625204.localdomain python3.9[249593]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:42 np0005625204.localdomain sudo[249591]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:29:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:42 np0005625204.localdomain podman[249611]: 2026-02-20 09:29:42.683214095 +0000 UTC m=+0.095226345 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:29:42 np0005625204.localdomain podman[249611]: 2026-02-20 09:29:42.691630187 +0000 UTC m=+0.103642397 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:29:42 np0005625204.localdomain podman[249611]: unhealthy
Feb 20 09:29:42 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:29:42 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:29:42 np0005625204.localdomain sudo[249723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-werlnpajxxyypmdmbqivulkiryxvhcnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579782.6265762-3825-125789490356606/AnsiballZ_stat.py
Feb 20 09:29:42 np0005625204.localdomain sudo[249723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:43 np0005625204.localdomain python3.9[249725]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:43 np0005625204.localdomain sudo[249723]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:43 np0005625204.localdomain sudo[249780]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wflmzmofkvrgortrfrvkopojkrmjwstv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579782.6265762-3825-125789490356606/AnsiballZ_file.py
Feb 20 09:29:43 np0005625204.localdomain sudo[249780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:43 np0005625204.localdomain python3.9[249782]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.5fp4eekw recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:43 np0005625204.localdomain sudo[249780]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:44 np0005625204.localdomain sudo[249890]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iboynznqblcdftmysmotqhcnzgajqatu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579783.8443358-3861-208051654969840/AnsiballZ_stat.py
Feb 20 09:29:44 np0005625204.localdomain sudo[249890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:44 np0005625204.localdomain python3.9[249892]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:44 np0005625204.localdomain sudo[249890]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:44 np0005625204.localdomain sudo[249947]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdyfbcalojcsdpsglkouvfuluvpkeder ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579783.8443358-3861-208051654969840/AnsiballZ_file.py
Feb 20 09:29:44 np0005625204.localdomain sudo[249947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:44 np0005625204.localdomain python3.9[249949]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:44 np0005625204.localdomain sudo[249947]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:44 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:44.857 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:45 np0005625204.localdomain sudo[250057]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmyawvjtefqrwsbfjdwbhhasadgyhspx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579785.027662-3900-61332573273459/AnsiballZ_command.py
Feb 20 09:29:45 np0005625204.localdomain sudo[250057]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:45 np0005625204.localdomain python3.9[250059]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:29:45 np0005625204.localdomain sudo[250057]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6765 DF PROTO=TCP SPT=37224 DPT=9102 SEQ=3285053212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BF9680000000001030307) 
Feb 20 09:29:46 np0005625204.localdomain sudo[250168]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrmywakntingojogkmsforvfyobmwmoe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579785.7400513-3924-97403891813893/AnsiballZ_edpm_nftables_from_files.py
Feb 20 09:29:46 np0005625204.localdomain sudo[250168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:46 np0005625204.localdomain python3[250170]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 20 09:29:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-81def779b7ffced42b1ca3f9e33c7066e3e88c316142aa24c993126fa1840f24-merged.mount: Deactivated successfully.
Feb 20 09:29:46 np0005625204.localdomain sudo[250168]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:29:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 4939 writes, 22K keys, 4939 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4939 writes, 637 syncs, 7.75 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:29:47 np0005625204.localdomain sudo[250278]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebvlufvijbryogplkogsjimssqdcynki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579786.572642-3948-238458748742639/AnsiballZ_stat.py
Feb 20 09:29:47 np0005625204.localdomain sudo[250278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:47 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21189 DF PROTO=TCP SPT=58658 DPT=9882 SEQ=85989402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599BFFBA0000000001030307) 
Feb 20 09:29:47 np0005625204.localdomain python3.9[250280]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:47 np0005625204.localdomain sudo[250278]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:48 np0005625204.localdomain sudo[250335]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnckqonufsiqawqfyxkotuowxyuwmbhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579786.572642-3948-238458748742639/AnsiballZ_file.py
Feb 20 09:29:48 np0005625204.localdomain sudo[250335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:48 np0005625204.localdomain python3.9[250337]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:48 np0005625204.localdomain sudo[250335]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:48 np0005625204.localdomain sudo[250445]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvizpxgeuogsffyyenmzemmpzphesyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579788.4429328-3984-99305883287197/AnsiballZ_stat.py
Feb 20 09:29:48 np0005625204.localdomain sudo[250445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:48 np0005625204.localdomain python3.9[250447]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:48 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:48 np0005625204.localdomain sudo[250445]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:49 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:49 np0005625204.localdomain sudo[250502]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuxjdrlcmjeuzxgkxzcwxrnqovqoglvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579788.4429328-3984-99305883287197/AnsiballZ_file.py
Feb 20 09:29:49 np0005625204.localdomain sudo[250502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:49 np0005625204.localdomain python3.9[250504]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:49 np0005625204.localdomain sudo[250502]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:49.860 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:49.862 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:49.863 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:49.863 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:49.898 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:49 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:49.898 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:50 np0005625204.localdomain sudo[250612]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yervghzsmogugnfhoyuljvcrnrsrigtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579790.0100608-4020-73201735055128/AnsiballZ_stat.py
Feb 20 09:29:50 np0005625204.localdomain sudo[250612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:50 np0005625204.localdomain python3.9[250614]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:50 np0005625204.localdomain sudo[250612]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:50 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21191 DF PROTO=TCP SPT=58658 DPT=9882 SEQ=85989402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C0BA80000000001030307) 
Feb 20 09:29:50 np0005625204.localdomain sudo[250669]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdjpfdtdajhhdjoycjyewqtaiobkanfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579790.0100608-4020-73201735055128/AnsiballZ_file.py
Feb 20 09:29:50 np0005625204.localdomain sudo[250669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:50 np0005625204.localdomain python3.9[250671]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:51 np0005625204.localdomain sudo[250669]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:29:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5716 writes, 24K keys, 5716 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5716 writes, 803 syncs, 7.12 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:29:51 np0005625204.localdomain sudo[250779]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjzqecwssabhbehygybhvdbfublpdwxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579791.165594-4056-173191269881263/AnsiballZ_stat.py
Feb 20 09:29:51 np0005625204.localdomain sudo[250779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:51 np0005625204.localdomain python3.9[250781]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:51 np0005625204.localdomain sudo[250779]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:51 np0005625204.localdomain sudo[250836]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpzvgquaocfvvfrqjodskkpevxgekfbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579791.165594-4056-173191269881263/AnsiballZ_file.py
Feb 20 09:29:51 np0005625204.localdomain sudo[250836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:52 np0005625204.localdomain python3.9[250838]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:52 np0005625204.localdomain sudo[250836]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:29:52 np0005625204.localdomain sudo[250946]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osjyokfxxhmxksfsgmcxhqrthoxnsvom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579792.422267-4092-214355718096356/AnsiballZ_stat.py
Feb 20 09:29:52 np0005625204.localdomain sudo[250946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:52 np0005625204.localdomain python3.9[250948]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:29:53 np0005625204.localdomain sudo[250946]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:29:53 np0005625204.localdomain podman[250967]: 2026-02-20 09:29:53.156927331 +0000 UTC m=+0.088825129 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:29:53 np0005625204.localdomain podman[250967]: 2026-02-20 09:29:53.163623047 +0000 UTC m=+0.095520875 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:29:53 np0005625204.localdomain sudo[251059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfvlnmhnqlntkhyjdgbazpyivlnxuhfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579792.422267-4092-214355718096356/AnsiballZ_copy.py
Feb 20 09:29:53 np0005625204.localdomain sudo[251059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:29:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:29:53 np0005625204.localdomain python3.9[251061]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1771579792.422267-4092-214355718096356/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:53 np0005625204.localdomain sudo[251059]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:53 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:29:54 np0005625204.localdomain sudo[251169]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwvggbkpbmbxscieokyuuwttuesgmaxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579793.8046927-4136-109376496882916/AnsiballZ_file.py
Feb 20 09:29:54 np0005625204.localdomain sudo[251169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:54 np0005625204.localdomain python3.9[251171]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:54 np0005625204.localdomain sudo[251169]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:54 np0005625204.localdomain sudo[251279]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxbjqgetghbsaihgfxsbdplxagnuferm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579794.516782-4161-53809999077285/AnsiballZ_command.py
Feb 20 09:29:54 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21192 DF PROTO=TCP SPT=58658 DPT=9882 SEQ=85989402 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C1B680000000001030307) 
Feb 20 09:29:54 np0005625204.localdomain sudo[251279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:29:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:54.899 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:54.901 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:54.901 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:54.901 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:54.925 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:54 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:54.925 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:54 np0005625204.localdomain podman[251282]: 2026-02-20 09:29:54.9263047 +0000 UTC m=+0.119148847 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, version=9.7, distribution-scope=public, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 20 09:29:54 np0005625204.localdomain podman[251282]: 2026-02-20 09:29:54.939992902 +0000 UTC m=+0.132837089 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of 
the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:29:55 np0005625204.localdomain python3.9[251281]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:29:55 np0005625204.localdomain sudo[251279]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:55 np0005625204.localdomain sudo[251411]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zswmkvdtuzyesntaciefmizxiaxlxwmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579795.2676363-4185-228229386260802/AnsiballZ_blockinfile.py
Feb 20 09:29:55 np0005625204.localdomain sudo[251411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:55 np0005625204.localdomain python3.9[251413]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:55 np0005625204.localdomain sudo[251411]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe-merged.mount: Deactivated successfully.
Feb 20 09:29:56 np0005625204.localdomain sudo[251521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujpbtrsalvyatopyraedarplcgarukfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579796.2826567-4212-10568256223397/AnsiballZ_command.py
Feb 20 09:29:56 np0005625204.localdomain sudo[251521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fd08ae47ed869ffb6da51bce33892d8def4dc87fdb9d181db114fbf82742dcbe-merged.mount: Deactivated successfully.
Feb 20 09:29:56 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:29:56 np0005625204.localdomain python3.9[251523]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:29:56 np0005625204.localdomain sudo[251521]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:57 np0005625204.localdomain sudo[251632]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edwxgaasngyqaournrfkebzgzkfnvnei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579797.0188189-4236-77588474071753/AnsiballZ_stat.py
Feb 20 09:29:57 np0005625204.localdomain sudo[251632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:57 np0005625204.localdomain python3.9[251634]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:29:57 np0005625204.localdomain sudo[251632]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:57 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46172 DF PROTO=TCP SPT=60534 DPT=9105 SEQ=285388273 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C26680000000001030307) 
Feb 20 09:29:58 np0005625204.localdomain sudo[251744]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcaueykyxyswhqzwuwesaxzepuvflxdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579798.2514305-4260-3521014039563/AnsiballZ_command.py
Feb 20 09:29:58 np0005625204.localdomain sudo[251744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:58 np0005625204.localdomain python3.9[251746]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:29:58 np0005625204.localdomain sudo[251744]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:29:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:29:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:29:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:29:58 np0005625204.localdomain podman[251750]: 2026-02-20 09:29:58.915674169 +0000 UTC m=+0.089290174 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:29:58 np0005625204.localdomain podman[251751]: 2026-02-20 09:29:58.990579238 +0000 UTC m=+0.160040099 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 20 09:29:59 np0005625204.localdomain podman[251750]: 2026-02-20 09:29:59.01914226 +0000 UTC m=+0.192758315 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true)
Feb 20 09:29:59 np0005625204.localdomain podman[251751]: 2026-02-20 09:29:59.075781518 +0000 UTC m=+0.245242349 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:29:59 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:29:59 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:29:59 np0005625204.localdomain sudo[251902]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vefxtcqxgascynohltgpjjqauxjvkquu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579799.018697-4283-88113566904059/AnsiballZ_file.py
Feb 20 09:29:59 np0005625204.localdomain sudo[251902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:29:59 np0005625204.localdomain python3.9[251904]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:29:59 np0005625204.localdomain sudo[251902]: pam_unix(sudo:session): session closed for user root
Feb 20 09:29:59 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:29:59 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:29:59 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:29:59 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:29:59 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:29:59 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:29:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:59.926 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:29:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:59.927 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:29:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:59.928 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:29:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:59.928 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:59.929 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:29:59 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:29:59.933 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:00 np0005625204.localdomain sshd[245568]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:30:00 np0005625204.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Feb 20 09:30:00 np0005625204.localdomain systemd[1]: session-57.scope: Consumed 27.754s CPU time.
Feb 20 09:30:00 np0005625204.localdomain systemd-logind[759]: Session 57 logged out. Waiting for processes to exit.
Feb 20 09:30:00 np0005625204.localdomain systemd-logind[759]: Removed session 57.
Feb 20 09:30:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48581 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C32440000000001030307) 
Feb 20 09:30:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:30:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:30:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 20 09:30:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:30:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:30:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:03 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48583 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C3E690000000001030307) 
Feb 20 09:30:04 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:04.929 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:30:05 np0005625204.localdomain podman[251925]: 2026-02-20 09:30:05.146894187 +0000 UTC m=+0.087387039 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute)
Feb 20 09:30:05 np0005625204.localdomain podman[251925]: 2026-02-20 09:30:05.193035662 +0000 UTC m=+0.133528454 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute)
Feb 20 09:30:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 20 09:30:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207-merged.mount: Deactivated successfully.
Feb 20 09:30:05 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-9843db5111fe4d0798dc7ee0621b78c9aa48d84f800c4b3a4e56e928248a7207-merged.mount: Deactivated successfully.
Feb 20 09:30:05 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:30:05 np0005625204.localdomain sudo[251944]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:30:05 np0005625204.localdomain sudo[251944]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:30:05 np0005625204.localdomain sudo[251944]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:05 np0005625204.localdomain sudo[251962]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:30:05 np0005625204.localdomain sudo[251962]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:30:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:30:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:30:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:30:05.992 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:30:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:30:05.994 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:30:06 np0005625204.localdomain sshd[251980]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:30:06 np0005625204.localdomain sshd[251980]: Accepted publickey for zuul from 192.168.122.30 port 54596 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:30:06 np0005625204.localdomain systemd-logind[759]: New session 58 of user zuul.
Feb 20 09:30:06 np0005625204.localdomain systemd[1]: Started Session 58 of User zuul.
Feb 20 09:30:06 np0005625204.localdomain sshd[251980]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:30:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 20 09:30:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 20 09:30:06 np0005625204.localdomain sudo[251962]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:06 np0005625204.localdomain sudo[252124]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbqzvatnulehusgcjflmnopzbwfvxmpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579806.4172652-23-194397758779540/AnsiballZ_file.py
Feb 20 09:30:06 np0005625204.localdomain sudo[252124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:07 np0005625204.localdomain python3.9[252126]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config/container-startup-config/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:07 np0005625204.localdomain sudo[252124]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:07 np0005625204.localdomain sudo[252234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iybpfwwdccgepvafvmaeucgrnbbdqtka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579807.4584434-23-197063412644134/AnsiballZ_file.py
Feb 20 09:30:07 np0005625204.localdomain sudo[252234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48584 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C4E280000000001030307) 
Feb 20 09:30:07 np0005625204.localdomain python3.9[252236]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:07 np0005625204.localdomain sudo[252234]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 20 09:30:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-49cfb5b81292fa2b6de2df0d13f19f8012e724ee74b28808d4b9c5fb43e8ac1c-merged.mount: Deactivated successfully.
Feb 20 09:30:08 np0005625204.localdomain sudo[252344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzzgynxhkowqktlatyjqxuerfoozuhyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579808.0699582-23-2321071341755/AnsiballZ_file.py
Feb 20 09:30:08 np0005625204.localdomain sudo[252344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:08 np0005625204.localdomain python3.9[252346]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:08 np0005625204.localdomain sudo[252344]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:09 np0005625204.localdomain sudo[252425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:30:09 np0005625204.localdomain sudo[252425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:30:09 np0005625204.localdomain sudo[252425]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:09 np0005625204.localdomain python3.9[252472]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:09 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:09.932 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully.
Feb 20 09:30:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully.
Feb 20 09:30:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully.
Feb 20 09:30:10 np0005625204.localdomain python3.9[252558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579809.2405798-101-217869708377355/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:11 np0005625204.localdomain python3.9[252666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:11 np0005625204.localdomain python3.9[252752]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579810.4102385-101-236777048400658/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully.
Feb 20 09:30:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully.
Feb 20 09:30:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully.
Feb 20 09:30:12 np0005625204.localdomain python3.9[252860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:30:12 np0005625204.localdomain python3.9[252946]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579811.9420767-101-24524186812911/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=ef2f7fbed7b4b53fbfecfbf9796227b8acb52519 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:12 np0005625204.localdomain podman[252947]: 2026-02-20 09:30:12.89567664 +0000 UTC m=+0.080044343 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:30:12 np0005625204.localdomain podman[252947]: 2026-02-20 09:30:12.930135054 +0000 UTC m=+0.114502717 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:30:12 np0005625204.localdomain podman[252947]: unhealthy
Feb 20 09:30:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:30:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully.
Feb 20 09:30:13 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Main process exited, code=exited, status=1/FAILURE
Feb 20 09:30:13 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Failed with result 'exit-code'.
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.301 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.301 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.302 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:14 np0005625204.localdomain python3.9[253077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:30:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 20 09:30:14 np0005625204.localdomain python3.9[253163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579813.9847262-276-149246084429781/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=13d630d090b626c2aab1085bca0daa7abb0cabfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.936 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.938 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.938 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.938 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.995 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:14.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.321 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.321 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.321 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:30:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 20 09:30:15 np0005625204.localdomain python3.9[253272]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:30:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:15.796 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.119 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:30:16 np0005625204.localdomain sudo[253403]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjqhsbexojdyyetpqkpqvibyabppccwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579815.8722286-349-161289532009092/AnsiballZ_file.py
Feb 20 09:30:16 np0005625204.localdomain sudo[253403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48585 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599C6F690000000001030307) 
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.317 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.318 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12445MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.318 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.319 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:30:16 np0005625204.localdomain python3.9[253405]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:16 np0005625204.localdomain sudo[253403]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.493 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.493 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.493 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:30:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.543 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:30:16 np0005625204.localdomain sudo[253533]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ladlwrjtvrxtddobsnfuhtxybyxvgcqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579816.5946865-371-270750448771877/AnsiballZ_stat.py
Feb 20 09:30:16 np0005625204.localdomain sudo[253533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:16.998 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:30:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:17.006 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:30:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:17.026 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:30:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:17.029 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:30:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:17.029 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:30:17 np0005625204.localdomain python3.9[253535]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:17 np0005625204.localdomain sudo[253533]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:17 np0005625204.localdomain sudo[253592]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwfojxjiwdadztenpobbwtkxnvtazcgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579816.5946865-371-270750448771877/AnsiballZ_file.py
Feb 20 09:30:17 np0005625204.localdomain sudo[253592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully.
Feb 20 09:30:17 np0005625204.localdomain python3.9[253594]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:17 np0005625204.localdomain sudo[253592]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.026 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:30:18 np0005625204.localdomain sudo[253702]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncplwhnxtmvvykwlhzcsekrfpbmgnsxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579817.631611-371-205338826657180/AnsiballZ_stat.py
Feb 20 09:30:18 np0005625204.localdomain sudo[253702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.178 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:30:18 np0005625204.localdomain python3.9[253704]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:18 np0005625204.localdomain sudo[253702]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.620 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.636 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.636 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.637 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.637 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:18.637 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:30:19 np0005625204.localdomain sudo[253759]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-luvflhkvuloxxbabitrehaitdeevfeam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579817.631611-371-205338826657180/AnsiballZ_file.py
Feb 20 09:30:19 np0005625204.localdomain sudo[253759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:19 np0005625204.localdomain python3.9[253761]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:19 np0005625204.localdomain sudo[253759]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:30:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:30:19 np0005625204.localdomain sudo[253869]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ueawbosikczqbvvfuvzdvaqmtptzowap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579819.5275006-441-41252784607896/AnsiballZ_file.py
Feb 20 09:30:19 np0005625204.localdomain sudo[253869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:19 np0005625204.localdomain python3.9[253871]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:19.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:19.999 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:20.000 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:20 np0005625204.localdomain sudo[253869]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:20.906 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:30:21 np0005625204.localdomain sudo[253979]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-omnyzbpinbprluqljjcpekzlmdvqggol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579820.1876626-464-90021519866841/AnsiballZ_stat.py
Feb 20 09:30:21 np0005625204.localdomain sudo[253979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:30:21 np0005625204.localdomain python3.9[253981]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:21 np0005625204.localdomain sudo[253979]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:30:21 np0005625204.localdomain sudo[254036]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcltuohephkfmstohvkuenohgxiylzcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579820.1876626-464-90021519866841/AnsiballZ_file.py
Feb 20 09:30:21 np0005625204.localdomain sudo[254036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 20 09:30:21 np0005625204.localdomain python3.9[254038]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:21 np0005625204.localdomain sudo[254036]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:22 np0005625204.localdomain sudo[254146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elwanewqqztlivscddytjwvojacxhvfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579821.9431422-501-87647513251352/AnsiballZ_stat.py
Feb 20 09:30:22 np0005625204.localdomain sudo[254146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:22 np0005625204.localdomain python3.9[254148]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:22 np0005625204.localdomain sudo[254146]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:30:22 np0005625204.localdomain sudo[254203]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoqfpdsxwsyfclymylyxgpyyiwovsedb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579821.9431422-501-87647513251352/AnsiballZ_file.py
Feb 20 09:30:22 np0005625204.localdomain sudo[254203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 20 09:30:22 np0005625204.localdomain python3.9[254205]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:22 np0005625204.localdomain sudo[254203]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:23 np0005625204.localdomain sudo[254313]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgoiejaaonmomliebtazfxqfsvdhquqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579823.1086044-537-146766497915608/AnsiballZ_systemd.py
Feb 20 09:30:23 np0005625204.localdomain sudo[254313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:30:23 np0005625204.localdomain systemd[1]: tmp-crun.u5XTcn.mount: Deactivated successfully.
Feb 20 09:30:23 np0005625204.localdomain podman[254315]: 2026-02-20 09:30:23.742529746 +0000 UTC m=+0.103669472 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:30:23 np0005625204.localdomain podman[254315]: 2026-02-20 09:30:23.752147712 +0000 UTC m=+0.113287448 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:30:23 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:23 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:30:23 np0005625204.localdomain python3.9[254316]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:30:23 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:30:24 np0005625204.localdomain systemd-rc-local-generator[254358]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:30:24 np0005625204.localdomain systemd-sysv-generator[254361]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:24 np0005625204.localdomain sudo[254313]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:24 np0005625204.localdomain sudo[254483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfjfswuwxktfeeowmojtmjobofeiexgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579824.4933343-560-75724197879375/AnsiballZ_stat.py
Feb 20 09:30:24 np0005625204.localdomain sudo[254483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:24 np0005625204.localdomain python3.9[254485]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:25.000 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:25.003 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:25.003 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:25.003 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:25 np0005625204.localdomain sudo[254483]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:25.054 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:25.054 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:25 np0005625204.localdomain sudo[254540]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvncrmrvukszttnxoqqirqrhscsvuldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579824.4933343-560-75724197879375/AnsiballZ_file.py
Feb 20 09:30:25 np0005625204.localdomain sudo[254540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:25 np0005625204.localdomain python3.9[254542]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:25 np0005625204.localdomain sudo[254540]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 20 09:30:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb-merged.mount: Deactivated successfully.
Feb 20 09:30:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2e918dbe2b7e6f336ecb4cc5413e464b0e0467f389d3daf96290bbb17e0d3afb-merged.mount: Deactivated successfully.
Feb 20 09:30:26 np0005625204.localdomain sudo[254650]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkgdasroipcafsbpatvrbileatwgvfgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579825.7546942-597-191620215294345/AnsiballZ_stat.py
Feb 20 09:30:26 np0005625204.localdomain sudo[254650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:26 np0005625204.localdomain python3.9[254652]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:26 np0005625204.localdomain sudo[254650]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:30:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:30:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:30:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:30:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:30:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:30:26 np0005625204.localdomain sudo[254709]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkbewobisoipxdgodeoqwwqjwpwtlosb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579825.7546942-597-191620215294345/AnsiballZ_file.py
Feb 20 09:30:26 np0005625204.localdomain sudo[254709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:30:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 20 09:30:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:30:26 np0005625204.localdomain podman[254712]: 2026-02-20 09:30:26.775388504 +0000 UTC m=+0.087519593 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Feb 20 09:30:26 np0005625204.localdomain podman[254712]: 2026-02-20 09:30:26.793201845 +0000 UTC m=+0.105332924 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, release=1770267347, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container)
Feb 20 09:30:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:30:26 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:30:26 np0005625204.localdomain python3.9[254711]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:26 np0005625204.localdomain sudo[254709]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:27 np0005625204.localdomain sudo[254839]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-afynsbfffrbugqoqijzqhsclvwyaxree ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579827.535133-633-95650250741566/AnsiballZ_systemd.py
Feb 20 09:30:27 np0005625204.localdomain sudo[254839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 20 09:30:28 np0005625204.localdomain python3.9[254841]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:30:28 np0005625204.localdomain systemd-sysv-generator[254871]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:30:28 np0005625204.localdomain systemd-rc-local-generator[254866]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 09:30:28 np0005625204.localdomain sudo[254839]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-748996d00ab757a5bda247e45e6a81f3904e24554510d07cc1e7533917ef279a-merged.mount: Deactivated successfully.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Feb 20 09:30:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Feb 20 09:30:29 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 143381 "" "Go-http-client/1.1"
Feb 20 09:30:29 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:30:29.049Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 20 09:30:29 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:30:29.050Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 20 09:30:29 np0005625204.localdomain podman_exporter[241957]: ts=2026-02-20T09:30:29.050Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Feb 20 09:30:29 np0005625204.localdomain sudo[254994]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntqeviankwbhflwxjvpuqbffnpnqbxdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579829.0425572-663-250004290421376/AnsiballZ_file.py
Feb 20 09:30:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:30:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:30:29 np0005625204.localdomain sudo[254994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:29 np0005625204.localdomain podman[254996]: 2026-02-20 09:30:29.434858455 +0000 UTC m=+0.099732250 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:30:29 np0005625204.localdomain podman[254996]: 2026-02-20 09:30:29.47711465 +0000 UTC m=+0.141988465 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 20 09:30:29 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:30:29 np0005625204.localdomain podman[254997]: 2026-02-20 09:30:29.485652024 +0000 UTC m=+0.151064806 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:30:29 np0005625204.localdomain python3.9[254998]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:29 np0005625204.localdomain podman[254997]: 2026-02-20 09:30:29.566030165 +0000 UTC m=+0.231442947 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:30:29 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:30:29 np0005625204.localdomain sudo[254994]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:30.053 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:30 np0005625204.localdomain sudo[255146]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzhjldiyrrgnlltgosqfzpaxbcrfynte ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579830.2852588-687-245905538352798/AnsiballZ_file.py
Feb 20 09:30:30 np0005625204.localdomain sudo[255146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24047 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CA7740000000001030307) 
Feb 20 09:30:30 np0005625204.localdomain python3.9[255148]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:30:30 np0005625204.localdomain sudo[255146]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:31 np0005625204.localdomain sudo[255256]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-leewbkqkscwvsgdzydrzshlfsibcqnmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579830.9900656-710-1157605902233/AnsiballZ_stat.py
Feb 20 09:30:31 np0005625204.localdomain sudo[255256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:31 np0005625204.localdomain python3.9[255258]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:31 np0005625204.localdomain sudo[255256]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24048 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CAB680000000001030307) 
Feb 20 09:30:32 np0005625204.localdomain sudo[255344]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xdatophitgllvzlqsnuwsutzfwcwnrak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579830.9900656-710-1157605902233/AnsiballZ_copy.py
Feb 20 09:30:32 np0005625204.localdomain sudo[255344]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:32 np0005625204.localdomain python3.9[255346]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579830.9900656-710-1157605902233/.source.json _original_basename=.mtvfow7d follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:32 np0005625204.localdomain sudo[255344]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48586 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CAF680000000001030307) 
Feb 20 09:30:33 np0005625204.localdomain python3.9[255454]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24049 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CB3680000000001030307) 
Feb 20 09:30:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6767 DF PROTO=TCP SPT=37224 DPT=9102 SEQ=3285053212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CB7690000000001030307) 
Feb 20 09:30:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:35.057 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:35.059 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:35.060 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:35.060 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:35.081 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:35.082 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:35 np0005625204.localdomain sudo[255756]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ukuedabghnlirtnmbjbbbsqmdsadvddi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579835.024362-830-28328116911632/AnsiballZ_container_config_data.py
Feb 20 09:30:35 np0005625204.localdomain sudo[255756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:35 np0005625204.localdomain python3.9[255758]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Feb 20 09:30:35 np0005625204.localdomain sudo[255756]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:30:36 np0005625204.localdomain podman[255776]: 2026-02-20 09:30:36.153609995 +0000 UTC m=+0.088254946 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:30:36 np0005625204.localdomain podman[255776]: 2026-02-20 09:30:36.193223458 +0000 UTC m=+0.127868369 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Feb 20 09:30:36 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:30:36 np0005625204.localdomain sudo[255886]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhujhxdphdabkidalklperpudfjeuopk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579836.1120174-863-267762681568325/AnsiballZ_container_config_hash.py
Feb 20 09:30:36 np0005625204.localdomain sudo[255886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:36 np0005625204.localdomain python3.9[255888]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:30:36 np0005625204.localdomain sudo[255886]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:37 np0005625204.localdomain sudo[255996]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqustbbfmnnrbjpnujrohvycsevsaehw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579837.161539-893-160466574700471/AnsiballZ_edpm_container_manage.py
Feb 20 09:30:37 np0005625204.localdomain sudo[255996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24050 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CC3280000000001030307) 
Feb 20 09:30:37 np0005625204.localdomain python3[255998]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json containers=['neutron_sriov_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:30:38 np0005625204.localdomain podman[256035]: 
Feb 20 09:30:38 np0005625204.localdomain podman[256035]: 2026-02-20 09:30:38.220199492 +0000 UTC m=+0.118180961 container create ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:30:38 np0005625204.localdomain podman[256035]: 2026-02-20 09:30:38.140205921 +0000 UTC m=+0.038187430 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 20 09:30:38 np0005625204.localdomain python3[255998]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607 --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 20 09:30:38 np0005625204.localdomain sudo[255996]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:40.083 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:40.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:40.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:40.086 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:40.132 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:40.132 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:40 np0005625204.localdomain sshd[256091]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:30:40 np0005625204.localdomain sshd[256091]: Invalid user deploytest from 54.36.99.29 port 44908
Feb 20 09:30:40 np0005625204.localdomain sshd[256091]: Received disconnect from 54.36.99.29 port 44908:11: Bye Bye [preauth]
Feb 20 09:30:40 np0005625204.localdomain sshd[256091]: Disconnected from invalid user deploytest 54.36.99.29 port 44908 [preauth]
Feb 20 09:30:41 np0005625204.localdomain sudo[256183]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyyndkikbnbdszyxzfshiguorfvvlmjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579841.7321823-917-184624307836858/AnsiballZ_stat.py
Feb 20 09:30:41 np0005625204.localdomain sudo[256183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:42 np0005625204.localdomain python3.9[256185]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:30:42 np0005625204.localdomain sudo[256183]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:42 np0005625204.localdomain sudo[256295]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wulaminheoqkevbwvqotirrxtkqikzbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579842.5217912-944-265078090650254/AnsiballZ_file.py
Feb 20 09:30:42 np0005625204.localdomain sudo[256295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:43 np0005625204.localdomain python3.9[256297]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:43 np0005625204.localdomain sudo[256295]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:43 np0005625204.localdomain sudo[256350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqnwwkyzmixlcmoavfcaqomaqdbskqlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579842.5217912-944-265078090650254/AnsiballZ_stat.py
Feb 20 09:30:43 np0005625204.localdomain sudo[256350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:44 np0005625204.localdomain python3.9[256352]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:30:44 np0005625204.localdomain sudo[256350]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:30:44 np0005625204.localdomain podman[256359]: 2026-02-20 09:30:44.134057161 +0000 UTC m=+0.069826777 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:30:44 np0005625204.localdomain podman[256359]: 2026-02-20 09:30:44.175122318 +0000 UTC m=+0.110891914 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:30:44 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:30:44 np0005625204.localdomain sudo[256483]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwfafhoddiiqdxjxruzaxlkkmthibftt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579844.0837593-944-173267184265401/AnsiballZ_copy.py
Feb 20 09:30:44 np0005625204.localdomain sudo[256483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:44 np0005625204.localdomain python3.9[256485]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579844.0837593-944-173267184265401/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:44 np0005625204.localdomain sudo[256483]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:45 np0005625204.localdomain sudo[256538]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dexatehqcxtfxpyvujylkllbueprcbqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579844.0837593-944-173267184265401/AnsiballZ_systemd.py
Feb 20 09:30:45 np0005625204.localdomain sudo[256538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:45.133 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:45.134 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:45.135 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:45.135 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:45.136 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:45.138 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:45 np0005625204.localdomain python3.9[256540]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:30:45 np0005625204.localdomain systemd-rc-local-generator[256565]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:30:45 np0005625204.localdomain systemd-sysv-generator[256569]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:45 np0005625204.localdomain sudo[256538]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:45 np0005625204.localdomain sudo[256629]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjthipzgqmijztrmqkeqayfdpxquurtj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579844.0837593-944-173267184265401/AnsiballZ_systemd.py
Feb 20 09:30:45 np0005625204.localdomain sudo[256629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24051 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599CE3690000000001030307) 
Feb 20 09:30:46 np0005625204.localdomain python3.9[256631]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:30:46 np0005625204.localdomain systemd-sysv-generator[256664]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:30:46 np0005625204.localdomain systemd-rc-local-generator[256658]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: Starting neutron_sriov_agent container...
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:30:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 20 09:30:46 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:30:46 np0005625204.localdomain podman[256672]: 2026-02-20 09:30:46.741613618 +0000 UTC m=+0.112529305 container init ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent)
Feb 20 09:30:46 np0005625204.localdomain podman[256672]: 2026-02-20 09:30:46.750601457 +0000 UTC m=+0.121517134 container start ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:30:46 np0005625204.localdomain podman[256672]: neutron_sriov_agent
Feb 20 09:30:46 np0005625204.localdomain systemd[1]: Started neutron_sriov_agent container.
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + sudo -E kolla_set_configs
Feb 20 09:30:46 np0005625204.localdomain sudo[256629]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Validating config file
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Copying service configuration files
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Writing out command to execute
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: ++ cat /run_command
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + ARGS=
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + sudo kolla_copy_cacerts
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + [[ ! -n '' ]]
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + . kolla_extend_start
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + umask 0022
Feb 20 09:30:46 np0005625204.localdomain neutron_sriov_agent[256688]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 20 09:30:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:30:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:30:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:30:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147334 "" "Go-http-client/1.1"
Feb 20 09:30:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:30:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16341 "" "Go-http-client/1.1"
Feb 20 09:30:48 np0005625204.localdomain python3.9[256811]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.common.config [-] Logging enabled!
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.416 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.417 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.417 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005625204.localdomain'}
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.417 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] RPC agent_id: nic-switch-agent.np0005625204.localdomain
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.422 2 INFO neutron.agent.agent_extensions_manager [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Loaded agent extensions: ['qos']
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.422 2 INFO neutron.agent.agent_extensions_manager [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Initializing agent extension 'qos'
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.811 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Agent initialized successfully, now running... 
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.812 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Feb 20 09:30:48 np0005625204.localdomain neutron_sriov_agent[256688]: 2026-02-20 09:30:48.812 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-0dc45e9f-ef09-4874-b1bb-ac4faf715deb - - - - - -] Agent out of sync with plugin!
Feb 20 09:30:48 np0005625204.localdomain sudo[256920]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-knjppvrybdyppzeozxfqzrlbzynjtmrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579848.678256-1079-101791592281590/AnsiballZ_stat.py
Feb 20 09:30:48 np0005625204.localdomain sudo[256920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:49 np0005625204.localdomain python3.9[256922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:30:49 np0005625204.localdomain sudo[256920]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:49 np0005625204.localdomain sudo[257010]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnzriqmyeyatqnajpljlaufznznghpxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579848.678256-1079-101791592281590/AnsiballZ_copy.py
Feb 20 09:30:49 np0005625204.localdomain sudo[257010]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:49 np0005625204.localdomain python3.9[257012]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579848.678256-1079-101791592281590/.source.yaml _original_basename=.1ep_k65a follow=False checksum=9a7aca9285be233ff868b04cb9ff99cde755c904 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:30:49 np0005625204.localdomain sudo[257010]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:50.139 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:50 np0005625204.localdomain sudo[257120]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swfhurnguljociuoksyxepqyxcaqamzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579850.050625-1125-78697757695553/AnsiballZ_systemd.py
Feb 20 09:30:50 np0005625204.localdomain sudo[257120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:50 np0005625204.localdomain python3.9[257122]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: libpod-ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae.scope: Deactivated successfully.
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: libpod-ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae.scope: Consumed 1.757s CPU time.
Feb 20 09:30:50 np0005625204.localdomain podman[257126]: 2026-02-20 09:30:50.792898132 +0000 UTC m=+0.090756814 container died ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible)
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae-userdata-shm.mount: Deactivated successfully.
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5-merged.mount: Deactivated successfully.
Feb 20 09:30:50 np0005625204.localdomain podman[257126]: 2026-02-20 09:30:50.842057759 +0000 UTC m=+0.139916421 container cleanup ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:30:50 np0005625204.localdomain podman[257126]: neutron_sriov_agent
Feb 20 09:30:50 np0005625204.localdomain podman[257151]: 2026-02-20 09:30:50.929989075 +0000 UTC m=+0.058524818 container cleanup ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_sriov_agent, managed_by=edpm_ansible)
Feb 20 09:30:50 np0005625204.localdomain podman[257151]: neutron_sriov_agent
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Feb 20 09:30:50 np0005625204.localdomain systemd[1]: Starting neutron_sriov_agent container...
Feb 20 09:30:51 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:30:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 20 09:30:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e496c54f14bf596613a8cada291e8713a6dd66f18dbce74a7c5398584f1c7f5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:30:51 np0005625204.localdomain podman[257162]: 2026-02-20 09:30:51.095654759 +0000 UTC m=+0.121694068 container init ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:30:51 np0005625204.localdomain podman[257162]: 2026-02-20 09:30:51.105791882 +0000 UTC m=+0.131831191 container start ff9c4b7bbd2338a9386dde3f6ec38a7db7a4a7d7e93247124adbf5b6c96011ae (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-51295668f6dfcad7486165eac1f01cf55360fd71c95127b27f4adf1e00d53607'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Feb 20 09:30:51 np0005625204.localdomain podman[257162]: neutron_sriov_agent
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + sudo -E kolla_set_configs
Feb 20 09:30:51 np0005625204.localdomain systemd[1]: Started neutron_sriov_agent container.
Feb 20 09:30:51 np0005625204.localdomain sudo[257120]: pam_unix(sudo:session): session closed for user root
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Validating config file
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Copying service configuration files
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Writing out command to execute
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: ++ cat /run_command
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + ARGS=
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + sudo kolla_copy_cacerts
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + [[ ! -n '' ]]
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + . kolla_extend_start
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + umask 0022
Feb 20 09:30:51 np0005625204.localdomain neutron_sriov_agent[257177]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 20 09:30:51 np0005625204.localdomain sshd[251980]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:30:51 np0005625204.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Feb 20 09:30:51 np0005625204.localdomain systemd[1]: session-58.scope: Consumed 23.096s CPU time.
Feb 20 09:30:51 np0005625204.localdomain systemd-logind[759]: Session 58 logged out. Waiting for processes to exit.
Feb 20 09:30:51 np0005625204.localdomain systemd-logind[759]: Removed session 58.
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.941 2 INFO neutron.common.config [-] Logging enabled!
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.941 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005625204.localdomain'}
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.942 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] RPC agent_id: nic-switch-agent.np0005625204.localdomain
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.947 2 INFO neutron.agent.agent_extensions_manager [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Loaded agent extensions: ['qos']
Feb 20 09:30:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:52.948 2 INFO neutron.agent.agent_extensions_manager [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Initializing agent extension 'qos'
Feb 20 09:30:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:53.067 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Agent initialized successfully, now running... 
Feb 20 09:30:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:53.067 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Feb 20 09:30:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:30:53.068 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-a9bfe3af-8c61-4b05-928a-e4ab98dda12b - - - - - -] Agent out of sync with plugin!
Feb 20 09:30:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:30:54 np0005625204.localdomain podman[257210]: 2026-02-20 09:30:54.150381923 +0000 UTC m=+0.084979054 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:30:54 np0005625204.localdomain podman[257210]: 2026-02-20 09:30:54.164099617 +0000 UTC m=+0.098696758 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:30:54 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:30:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:55.141 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:55.143 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:30:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:55.143 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:30:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:55.144 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:55.180 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:30:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:30:55.181 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:30:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:30:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:30:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:30:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:30:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:30:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:30:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:30:57 np0005625204.localdomain systemd[1]: tmp-crun.3uclED.mount: Deactivated successfully.
Feb 20 09:30:57 np0005625204.localdomain podman[257233]: 2026-02-20 09:30:57.141951968 +0000 UTC m=+0.077966578 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1770267347, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:30:57 np0005625204.localdomain podman[257233]: 2026-02-20 09:30:57.151221964 +0000 UTC m=+0.087236614 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.openshift.tags=minimal rhel9, distribution-scope=public)
Feb 20 09:30:57 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:30:57 np0005625204.localdomain sshd[257253]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:30:57 np0005625204.localdomain sshd[257253]: Accepted publickey for zuul from 192.168.122.30 port 41390 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:30:57 np0005625204.localdomain systemd-logind[759]: New session 59 of user zuul.
Feb 20 09:30:57 np0005625204.localdomain systemd[1]: Started Session 59 of User zuul.
Feb 20 09:30:57 np0005625204.localdomain sshd[257253]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:30:58 np0005625204.localdomain python3.9[257364]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:30:59 np0005625204.localdomain sudo[257476]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozpaqawvyavrracdylxiirpbooqwxakr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579859.4003513-62-165552272315564/AnsiballZ_setup.py
Feb 20 09:30:59 np0005625204.localdomain sudo[257476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:30:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:30:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:30:59 np0005625204.localdomain podman[257479]: 2026-02-20 09:30:59.763407735 +0000 UTC m=+0.084541121 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:30:59 np0005625204.localdomain podman[257480]: 2026-02-20 09:30:59.774845238 +0000 UTC m=+0.094436227 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:30:59 np0005625204.localdomain podman[257480]: 2026-02-20 09:30:59.778992716 +0000 UTC m=+0.098583765 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:30:59 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:30:59 np0005625204.localdomain podman[257479]: 2026-02-20 09:30:59.838024108 +0000 UTC m=+0.159157504 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller)
Feb 20 09:30:59 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:30:59 np0005625204.localdomain python3.9[257478]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:31:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:00.181 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:00.184 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:00 np0005625204.localdomain sudo[257476]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44627 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D1CA40000000001030307) 
Feb 20 09:31:00 np0005625204.localdomain sudo[257580]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzxqkdokfmlyvnuexhnaauxhwejuhtlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579859.4003513-62-165552272315564/AnsiballZ_dnf.py
Feb 20 09:31:00 np0005625204.localdomain sudo[257580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:00 np0005625204.localdomain python3.9[257582]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:31:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44628 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D20A90000000001030307) 
Feb 20 09:31:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24052 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D23680000000001030307) 
Feb 20 09:31:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44629 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D28A90000000001030307) 
Feb 20 09:31:04 np0005625204.localdomain sudo[257580]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48587 DF PROTO=TCP SPT=49850 DPT=9102 SEQ=920512435 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D2D680000000001030307) 
Feb 20 09:31:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:05.184 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:05.184 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:05.185 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:31:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:05.185 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:05.186 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:05.188 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:05 np0005625204.localdomain sudo[257692]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lypoprgvubjucceyxatvxnaltpgwwrit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579864.6118162-98-155596377588055/AnsiballZ_systemd.py
Feb 20 09:31:05 np0005625204.localdomain sudo[257692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:05 np0005625204.localdomain python3.9[257694]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 20 09:31:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:31:05.994 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:31:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:31:05.994 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:31:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:31:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:31:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:31:06 np0005625204.localdomain sudo[257692]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:07 np0005625204.localdomain systemd[1]: tmp-crun.JUuFlV.mount: Deactivated successfully.
Feb 20 09:31:07 np0005625204.localdomain podman[257698]: 2026-02-20 09:31:07.050405051 +0000 UTC m=+0.110190792 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:31:07 np0005625204.localdomain podman[257698]: 2026-02-20 09:31:07.064066229 +0000 UTC m=+0.123851930 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:31:07 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:31:07 np0005625204.localdomain sudo[257824]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrvyvllxpmbfgtekxxmcdkhwvonuwqyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579867.2692423-125-170970648720377/AnsiballZ_file.py
Feb 20 09:31:07 np0005625204.localdomain sudo[257824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44630 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D38680000000001030307) 
Feb 20 09:31:07 np0005625204.localdomain python3.9[257826]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/container-startup-config setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:07 np0005625204.localdomain sudo[257824]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:08 np0005625204.localdomain sshd[257914]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:31:08 np0005625204.localdomain sudo[257935]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkbgiezhbusyoncjvzuazgksnbxuzkgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579868.0208204-125-258365367603082/AnsiballZ_file.py
Feb 20 09:31:08 np0005625204.localdomain sudo[257935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:08 np0005625204.localdomain sshd[257914]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 09:31:08 np0005625204.localdomain sshd[257914]: Connection closed by 167.172.180.30 port 57648
Feb 20 09:31:08 np0005625204.localdomain python3.9[257937]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:08 np0005625204.localdomain sudo[257935]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:09 np0005625204.localdomain sudo[258045]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwuoxmjeilqfbjvftpkrkegvbgyzxour ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579868.922858-125-207699068952780/AnsiballZ_file.py
Feb 20 09:31:09 np0005625204.localdomain sudo[258045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:09 np0005625204.localdomain python3.9[258047]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:09 np0005625204.localdomain sudo[258045]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:09 np0005625204.localdomain sudo[258065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:31:09 np0005625204.localdomain sudo[258065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:31:09 np0005625204.localdomain sudo[258065]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:09 np0005625204.localdomain sudo[258121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:31:09 np0005625204.localdomain sudo[258121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:31:09 np0005625204.localdomain sudo[258191]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvuwvhitykwqctzrygxjuuxvbeoxwald ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579869.5581346-125-59458062085793/AnsiballZ_file.py
Feb 20 09:31:09 np0005625204.localdomain sudo[258191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:10 np0005625204.localdomain python3.9[258193]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:10 np0005625204.localdomain sudo[258191]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:10.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:10.192 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:10.192 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:31:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:10.192 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:10.240 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:10.240 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:10 np0005625204.localdomain sudo[258121]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:10 np0005625204.localdomain sudo[258332]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kimamempsmwrdttknxyrbhpvwzebnlsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579870.2764494-125-268601259832767/AnsiballZ_file.py
Feb 20 09:31:10 np0005625204.localdomain sudo[258332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:10 np0005625204.localdomain python3.9[258334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:10 np0005625204.localdomain sudo[258332]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:11 np0005625204.localdomain sudo[258417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:31:11 np0005625204.localdomain sudo[258417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:31:11 np0005625204.localdomain sudo[258417]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:11 np0005625204.localdomain sudo[258460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sixlatkefzfdjftfaecfmhqxtbizldma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579870.8523521-125-162521870386118/AnsiballZ_file.py
Feb 20 09:31:11 np0005625204.localdomain sudo[258460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:11 np0005625204.localdomain python3.9[258462]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:11 np0005625204.localdomain sudo[258460]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:11 np0005625204.localdomain sudo[258570]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrnqnwbvlokcppqwpquatvgiehjnknxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579871.4509163-125-109427811685312/AnsiballZ_file.py
Feb 20 09:31:11 np0005625204.localdomain sudo[258570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:11 np0005625204.localdomain python3.9[258572]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:11 np0005625204.localdomain sudo[258570]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:12 np0005625204.localdomain sudo[258680]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydofwiinqqywpljylhrtqzfonsibtnyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579872.2013342-275-8837160581450/AnsiballZ_stat.py
Feb 20 09:31:12 np0005625204.localdomain sudo[258680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:12 np0005625204.localdomain python3.9[258682]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:12 np0005625204.localdomain sudo[258680]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:13 np0005625204.localdomain sudo[258768]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhallhcfzpecorrgczpmyuswxppnpxdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579872.2013342-275-8837160581450/AnsiballZ_copy.py
Feb 20 09:31:13 np0005625204.localdomain sudo[258768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:13 np0005625204.localdomain python3.9[258770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579872.2013342-275-8837160581450/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=472c5e922ae22c8bdcaef73d1ca73ce5597b440e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:13 np0005625204.localdomain sudo[258768]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:14 np0005625204.localdomain python3.9[258878]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:31:15 np0005625204.localdomain podman[258962]: 2026-02-20 09:31:15.148411832 +0000 UTC m=+0.083599160 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:31:15 np0005625204.localdomain podman[258962]: 2026-02-20 09:31:15.155783292 +0000 UTC m=+0.090970650 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:31:15 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.241 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.243 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:15 np0005625204.localdomain python3.9[258965]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579874.2924902-320-135871475757954/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.275 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.275 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.319 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.319 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.320 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.321 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.736 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.807 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:31:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:15.808 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.047 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.049 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12284MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.050 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.051 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.171 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.172 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.173 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:31:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44631 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D59680000000001030307) 
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.228 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:31:16 np0005625204.localdomain python3.9[259118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.705 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.711 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.726 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.728 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:31:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:16.729 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:31:16 np0005625204.localdomain python3.9[259224]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579875.393824-320-76366033268633/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:17 np0005625204.localdomain python3.9[259334]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:31:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:31:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:31:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147332 "" "Go-http-client/1.1"
Feb 20 09:31:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:17.729 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:17.730 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:31:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16350 "" "Go-http-client/1.1"
Feb 20 09:31:17 np0005625204.localdomain python3.9[259421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579877.022893-320-135267564593864/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=6db054ed7c6b84ef126ce933bbe7fb92f050e130 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.203 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.204 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.218 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.219 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '531d82e3-5967-440e-9426-4ead6decbd01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.204455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8d4698a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '99cae4041ed73933b7b5bade35c2f711763d32a5afa27892604c80c81fd86468'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.204455', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8d48064-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': 'bace37cdafdc19835fa07ceb94dd3250728241e0420f50d2d75186bcfc9dfbe0'}]}, 'timestamp': '2026-02-20 09:31:18.220021', '_unique_id': '0e90f5dd52d041e88a4fa4890f8818e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.221 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.223 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.227 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2daf9df5-7d8c-45c4-9131-6f1df2e6a472', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.223211', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8d5aab6-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '287615887a73820205a6366b254d425838c151ffb3083b1927a21b5c9ebc3ad4'}]}, 'timestamp': '2026-02-20 09:31:18.227765', '_unique_id': '8e360c5a43b949ecb7d0b0c47a2c14a5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.229 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.230 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.231 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ced7a21-0f42-4a5e-914a-ad26b17aee42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.230611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8d63274-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '5847cd8758f1c3856022ee86865163365bbaf01467263c08fe161f9c76333c51'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.230611', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8d643fe-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '7bb701560840439bd996e7241176c86e57d636893f5de86e6e7ec430bd1e9fcf'}]}, 'timestamp': '2026-02-20 09:31:18.231592', '_unique_id': '6e4bec5b43354a50bcbae985bfe875a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.232 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.234 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fcf87186-e01d-49a4-97d8-6d4e55b7833c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.234121', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8d6b9e2-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '1bc8486e5e6c3c51abbc4b7f27a0c958bf920500083b3c3c961dbb8656708949'}]}, 'timestamp': '2026-02-20 09:31:18.234607', '_unique_id': 'acb5c910c28f405fa0e88694cdac6657'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.235 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.236 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.236 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.275 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '844f284a-cca3-49e7-929b-9dcf92fa2aeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.237028', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8dcff0a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '34739f25bd2249ea1ff99ef0f8aac2b05b0ff1add913b03e7336e765d74292fe'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.237028', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8dd19f4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '6e3653412230c3a8f410aacb5e76322dd4c83f84675fd69809cd50f40bc86f4f'}]}, 'timestamp': '2026-02-20 09:31:18.276484', '_unique_id': '4fb935d9531544d3ae9919d06e4501a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.278 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.279 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f546e369-193c-4565-904b-6efd443020fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.279805', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8ddb71a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': '6f4af09f8753669acdc94901ee1fcf0fb97bfffa2bc0817f84b1fade4742575d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.279805', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8ddcc00-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.443663818, 'message_signature': 'c41734e1535c3d902c6597040acb33ebf463c9ccbc491d501829282fdab17984'}]}, 'timestamp': '2026-02-20 09:31:18.280962', '_unique_id': 'fd246f75139848af8859ba313bd3f6b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.283 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51fa73f1-9fc1-4849-a979-7d5439d901f1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.283481', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8de4748-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '236024e03471189dfcc04cdbcc664eee90287d704132a2c8a37c87acfe38bbe5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.283481', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8de5c42-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '967281a09b76292031704f79d938ed62534ea0fd03e9139d76390d1ad5b6624b'}]}, 'timestamp': '2026-02-20 09:31:18.284611', '_unique_id': 'b680e3958f4944369f73ae3d48b9cffc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.286 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '07461f73-e353-4aca-9e52-7176219690d5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.286844', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8dec556-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '1444d7206779fda231c00bc06699a15ba7a9490e413297cffe1640cf7aee81d8'}]}, 'timestamp': '2026-02-20 09:31:18.287327', '_unique_id': '54f141135e444396aeb45cce9d9a04c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.289 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d44cc1e-ac3a-4d39-aca3-7e0fc0238e58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.289847', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8df3f36-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': 'aa09228a2eb024e551c5be8124bd39286805d20b8ad3364293c3fff752381f4d'}]}, 'timestamp': '2026-02-20 09:31:18.290448', '_unique_id': '6546353599194a809dad3c731cd37085'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b04d965-dfdd-4c9f-9970-d158abc4bf11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.292953', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8dfb3d0-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '02d4dbe53779ec08dbabb3c5f3154cbeb835aed4212a47fac1cb4450d936c5db'}]}, 'timestamp': '2026-02-20 09:31:18.293430', '_unique_id': '609914e2e8754724b8bfe353889478eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:18.296 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:18.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:18.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:18.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 60080000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68c37424-e1da-4a6b-89e2-98e84665558d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 60080000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:31:18.295544', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'e8e44a80-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.562021775, 'message_signature': '2a36048e9dd0bea7f634cc3963b1ce73a3de36d6e6f45438cc6f22df2cba3f55'}]}, 'timestamp': '2026-02-20 09:31:18.323533', '_unique_id': '4636f2fff18840bc9ea3c74998ca9881'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 88 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1282cf69-4393-47b9-aa68-25fe63ed5a32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 88, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.326098', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e4c2d0-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '7887e433aacfcb6ee4a21333155d41fff97b04b90a20a3e68bfb69ba23216c0b'}]}, 'timestamp': '2026-02-20 09:31:18.326599', '_unique_id': 'fbd9bf9b8e72474c8b8fb34a2dc1a31d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c355ec3d-f46d-4bae-9e59-c97c2fe6be7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:31:18.329008', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'e8e53490-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.562021775, 'message_signature': '9de4f74a1b869fe1b37e03e237470502975649ff41f4425596c3549bb51fb4d7'}]}, 'timestamp': '2026-02-20 09:31:18.329496', '_unique_id': 'cc570bceee8b427e93be81c20531843f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6ff050d-29e0-498a-8141-a014840d0936', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.331955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e5a772-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '3f90e05e70ca8a4796058c2472fb71a344a5bd9e5f7b05fda3fb03b5e92ac37a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.331955', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e5bc08-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': 'a9883a030ce3e744efba5cc302396451c3bd060c951c0304c2a2da7d80896a7a'}]}, 'timestamp': '2026-02-20 09:31:18.333015', '_unique_id': 'ecf16736064d49448b031b82a44d119f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e567638b-5f74-4ca6-ac71-1ba9e2ed519d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.335368', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e62d28-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': 'a81f28a46599d16b6cf78bf269bbbf7125bc589aac44367c44bf5ae5173a1ace'}]}, 'timestamp': '2026-02-20 09:31:18.335909', '_unique_id': '59f5e9c781a34785904f34703035e613'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8617e1b4-2079-4ae9-a05a-bf0b214ce0b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.338319', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e6a05a-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '4d9ec339e013dcb43b2fc46556dcfd81c276e54e0257974b337efb9fdf31d0b7'}]}, 'timestamp': '2026-02-20 09:31:18.338865', '_unique_id': '4fbaa9325a5e42fca8a938bfbf99199a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.341 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad9c339d-7dc1-49ac-8c1a-33752f0a42df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.341241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e7156c-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '3e67649ded89e3fa9c1b4dde7cbd139cdd482aa2a655a6c82f7cea78682fc23b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.341241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e72930-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '3c4d6d2ac99eed3e9c5e76d9089b882ac0502400b5a291deabd075aeb51822ad'}]}, 'timestamp': '2026-02-20 09:31:18.342299', '_unique_id': '2566463369904443b8c6014ee8769ecb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.344 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3eef6a48-b54c-4ece-9e07-05400eb1cf21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.344686', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e79960-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '015055ce0b62218d8af6e4e72aff84a61a3e8bf3ad40c312aaf16faa06926d95'}]}, 'timestamp': '2026-02-20 09:31:18.345202', '_unique_id': 'bccfdaa13bfe45b5a290ae2780c56ff7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.346 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.347 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.347 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.348 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc4923db-7801-4d2d-a55b-45d0ead1f0a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.347543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e80b84-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': 'bb238fad1ad6e5500ab1cb88a2450dc29c3ab64f9f32a6757413099a84bb8273'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.347543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e823e4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': 'b10e6db4746d002c7077d2dea7bbedef772d01731edefe1c47a44a741af664bf'}]}, 'timestamp': '2026-02-20 09:31:18.348766', '_unique_id': 'aac100ad402c458e9029c9dd55f1570f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.349 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.350 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 8991 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5cfd6ae-ddff-487e-a495-6133027ab61a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 8991, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:31:18.351081', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'e8e892e8-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.462444925, 'message_signature': '05b4397309f6fed456b7eb1b24112dbc75c8e082352ec717241dfe9d1b9422e7'}]}, 'timestamp': '2026-02-20 09:31:18.351708', '_unique_id': 'bf1edc6df27a4d2eadf497e5a247703b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.353 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.353 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.353 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3646bcc4-4d88-4678-8fb9-5c982e7f3808', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:31:18.353771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'e8e8f774-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '7087c7e586754e2bbd32ba36af811441a67c73edd1e3898802ea37b25fcc5c86'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:31:18.353771', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'e8e901c4-0e3e-11f1-9294-fa163ef029e2', 'monotonic_time': 10437.476253538, 'message_signature': '550b88a63520892d92505b5c41409e060c7bfa2d7b4744eddd88fe5cbf319c75'}]}, 'timestamp': '2026-02-20 09:31:18.354298', '_unique_id': '45cd21cd13fc410582d1deb52806b384'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:31:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:31:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:31:19 np0005625204.localdomain python3.9[259529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.571 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.571 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.572 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:31:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:19.572 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:31:19 np0005625204.localdomain python3.9[259615]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579878.7480402-494-147774350184856/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=13d630d090b626c2aab1085bca0daa7abb0cabfd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:20.075 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:31:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:20.196 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:31:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:20.197 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:31:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:20.277 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:20 np0005625204.localdomain python3.9[259723]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:21 np0005625204.localdomain python3.9[259809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579879.8442059-539-95077197649537/.source follow=False _original_basename=haproxy.j2 checksum=eddfecb822bb60e7241db0fd719c7552d2d25452 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:21 np0005625204.localdomain python3.9[259917]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:22 np0005625204.localdomain python3.9[260003]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579881.1687446-539-14955454410103/.source follow=False _original_basename=dnsmasq.j2 checksum=a6b8b2fb47e7419d250eaee9e3565b13fff8f42e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:22 np0005625204.localdomain python3.9[260111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:23 np0005625204.localdomain python3.9[260166]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:23 np0005625204.localdomain python3.9[260274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:24 np0005625204.localdomain python3.9[260360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771579883.2872195-626-97379393248581/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:31:25 np0005625204.localdomain systemd[1]: tmp-crun.Va7dpx.mount: Deactivated successfully.
Feb 20 09:31:25 np0005625204.localdomain podman[260378]: 2026-02-20 09:31:25.149012584 +0000 UTC m=+0.080846504 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:31:25 np0005625204.localdomain podman[260378]: 2026-02-20 09:31:25.183925277 +0000 UTC m=+0.115759127 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:31:25 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:31:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:25.280 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:25.282 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:25.283 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:31:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:25.283 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:25.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:25.315 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:25 np0005625204.localdomain python3.9[260491]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:31:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:31:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:31:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:31:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:31:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:31:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:31:26 np0005625204.localdomain sudo[260601]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzsscaicwbyavsdksvmjhhgduqlvlbpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579885.8418636-732-76137677610627/AnsiballZ_file.py
Feb 20 09:31:26 np0005625204.localdomain sudo[260601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:27 np0005625204.localdomain python3.9[260603]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:27 np0005625204.localdomain sudo[260601]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:27 np0005625204.localdomain sudo[260711]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbaskedrtyksjkrnnmmpoiwlbgpjevsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579887.201416-755-102591020429455/AnsiballZ_stat.py
Feb 20 09:31:27 np0005625204.localdomain sudo[260711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:31:27 np0005625204.localdomain podman[260713]: 2026-02-20 09:31:27.533049207 +0000 UTC m=+0.086785860 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1770267347, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, version=9.7, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal)
Feb 20 09:31:27 np0005625204.localdomain podman[260713]: 2026-02-20 09:31:27.549002596 +0000 UTC m=+0.102739269 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., distribution-scope=public, config_id=openstack_network_exporter, vcs-type=git)
Feb 20 09:31:27 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:31:27 np0005625204.localdomain python3.9[260714]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:27 np0005625204.localdomain sudo[260711]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:27 np0005625204.localdomain sudo[260788]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzbueqjozuuxzkibmdggeekmtwtyuimr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579887.201416-755-102591020429455/AnsiballZ_file.py
Feb 20 09:31:27 np0005625204.localdomain sudo[260788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:28 np0005625204.localdomain python3.9[260790]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:28 np0005625204.localdomain sudo[260788]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:28 np0005625204.localdomain sudo[260898]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tclogmkpissuleaetvnscmiebwgwgfzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579888.1574798-755-120213148677039/AnsiballZ_stat.py
Feb 20 09:31:28 np0005625204.localdomain sudo[260898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:28 np0005625204.localdomain python3.9[260900]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:28 np0005625204.localdomain sudo[260898]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:28 np0005625204.localdomain sudo[260955]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzkilbfhlxatemawqipgairhroztqise ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579888.1574798-755-120213148677039/AnsiballZ_file.py
Feb 20 09:31:28 np0005625204.localdomain sudo[260955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:29 np0005625204.localdomain python3.9[260957]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:29 np0005625204.localdomain sudo[260955]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:29 np0005625204.localdomain sudo[261065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxhmpgjymwhyeujzbbtcdtegmgubvnzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579889.2812223-825-61469842876151/AnsiballZ_file.py
Feb 20 09:31:29 np0005625204.localdomain sudo[261065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:29 np0005625204.localdomain python3.9[261067]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:29 np0005625204.localdomain sudo[261065]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:31:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:31:30 np0005625204.localdomain podman[261144]: 2026-02-20 09:31:30.131089593 +0000 UTC m=+0.069799758 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 20 09:31:30 np0005625204.localdomain podman[261144]: 2026-02-20 09:31:30.136465992 +0000 UTC m=+0.075176147 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:31:30 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:31:30 np0005625204.localdomain sudo[261201]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrgmzwgyojlrewddcgnllmaumlnndczc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579889.9089615-848-80678141930714/AnsiballZ_stat.py
Feb 20 09:31:30 np0005625204.localdomain sudo[261201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:30 np0005625204.localdomain podman[261141]: 2026-02-20 09:31:30.192549858 +0000 UTC m=+0.130740486 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:31:30 np0005625204.localdomain podman[261141]: 2026-02-20 09:31:30.235343099 +0000 UTC m=+0.173533757 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:31:30 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:31:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:30.313 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:30 np0005625204.localdomain python3.9[261203]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:30 np0005625204.localdomain sudo[261201]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23771 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D91D40000000001030307) 
Feb 20 09:31:30 np0005625204.localdomain sudo[261274]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llqoouwwauocyooqdjnmpcdkxwwsdaol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579889.9089615-848-80678141930714/AnsiballZ_file.py
Feb 20 09:31:30 np0005625204.localdomain sudo[261274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:30 np0005625204.localdomain python3.9[261276]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:30 np0005625204.localdomain sudo[261274]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:31 np0005625204.localdomain sudo[261384]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofteoobodcxhuxkenouamchxcsexojso ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579891.0398219-885-45933704527010/AnsiballZ_stat.py
Feb 20 09:31:31 np0005625204.localdomain sudo[261384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:31 np0005625204.localdomain python3.9[261386]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:31 np0005625204.localdomain sudo[261384]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:31 np0005625204.localdomain sudo[261441]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frzdkslaylljrphiqzcaccygmvuhsinn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579891.0398219-885-45933704527010/AnsiballZ_file.py
Feb 20 09:31:31 np0005625204.localdomain sudo[261441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23772 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D95E80000000001030307) 
Feb 20 09:31:31 np0005625204.localdomain python3.9[261443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:31 np0005625204.localdomain sudo[261441]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:32 np0005625204.localdomain sshd[261515]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:31:32 np0005625204.localdomain sudo[261553]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbipbkqkitkmkglywsfoqfkyctpdnskk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579892.0796025-921-196869748427435/AnsiballZ_systemd.py
Feb 20 09:31:32 np0005625204.localdomain sudo[261553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:32 np0005625204.localdomain sshd[261515]: Invalid user andy from 18.221.252.160 port 33192
Feb 20 09:31:32 np0005625204.localdomain sshd[261515]: Received disconnect from 18.221.252.160 port 33192:11: Bye Bye [preauth]
Feb 20 09:31:32 np0005625204.localdomain sshd[261515]: Disconnected from invalid user andy 18.221.252.160 port 33192 [preauth]
Feb 20 09:31:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44632 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D99680000000001030307) 
Feb 20 09:31:32 np0005625204.localdomain python3.9[261555]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:31:32 np0005625204.localdomain systemd-rc-local-generator[261579]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:31:32 np0005625204.localdomain systemd-sysv-generator[261585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:32 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:33 np0005625204.localdomain sudo[261553]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:33 np0005625204.localdomain sudo[261700]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-djgcjotwhitewwutzsvnhiuwzebrbbmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579893.2067425-945-174745422515794/AnsiballZ_stat.py
Feb 20 09:31:33 np0005625204.localdomain sudo[261700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:33 np0005625204.localdomain python3.9[261702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:33 np0005625204.localdomain sudo[261700]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23773 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599D9DE80000000001030307) 
Feb 20 09:31:33 np0005625204.localdomain sudo[261757]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kcwfezsrnkizhahwhrwbqybtutubrhyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579893.2067425-945-174745422515794/AnsiballZ_file.py
Feb 20 09:31:33 np0005625204.localdomain sudo[261757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:34 np0005625204.localdomain python3.9[261759]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:34 np0005625204.localdomain sudo[261757]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:34 np0005625204.localdomain sudo[261867]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrjmtuafavnpiugtqmbdawtbyfwockvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579894.3278568-981-279651210075779/AnsiballZ_stat.py
Feb 20 09:31:34 np0005625204.localdomain sudo[261867]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24053 DF PROTO=TCP SPT=33514 DPT=9102 SEQ=2505530680 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599DA1680000000001030307) 
Feb 20 09:31:34 np0005625204.localdomain python3.9[261869]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:34 np0005625204.localdomain sudo[261867]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:35.316 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:35 np0005625204.localdomain sudo[261924]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmaphfzftdqjazkugevujhixukwrbszc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579894.3278568-981-279651210075779/AnsiballZ_file.py
Feb 20 09:31:35 np0005625204.localdomain sudo[261924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:35 np0005625204.localdomain python3.9[261926]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:35 np0005625204.localdomain sudo[261924]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:36 np0005625204.localdomain sudo[262034]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbhjoklpjczddrxryzejjmcwhwccqytn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579895.892911-1016-88657628512496/AnsiballZ_systemd.py
Feb 20 09:31:36 np0005625204.localdomain sudo[262034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:36 np0005625204.localdomain python3.9[262036]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:31:36 np0005625204.localdomain systemd-rc-local-generator[262061]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:31:36 np0005625204.localdomain systemd-sysv-generator[262064]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: Starting Create netns directory...
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 20 09:31:36 np0005625204.localdomain systemd[1]: Finished Create netns directory.
Feb 20 09:31:36 np0005625204.localdomain sudo[262034]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:37 np0005625204.localdomain sshd[262097]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:31:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23774 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599DADA80000000001030307) 
Feb 20 09:31:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:31:38 np0005625204.localdomain podman[262143]: 2026-02-20 09:31:38.152518464 +0000 UTC m=+0.086784259 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 20 09:31:38 np0005625204.localdomain podman[262143]: 2026-02-20 09:31:38.167190015 +0000 UTC m=+0.101455800 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 09:31:38 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:31:38 np0005625204.localdomain sudo[262208]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dudzckmabyqaybgscemwadoctpkxxqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579897.9705-1047-146583421802384/AnsiballZ_file.py
Feb 20 09:31:38 np0005625204.localdomain sudo[262208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:38 np0005625204.localdomain python3.9[262210]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:38 np0005625204.localdomain sudo[262208]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:38 np0005625204.localdomain sshd[262097]: Invalid user bot3 from 188.166.218.64 port 46682
Feb 20 09:31:38 np0005625204.localdomain sudo[262318]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uibdkowmehbsouskikbqudgxseyyktcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579898.6638858-1070-156640506618467/AnsiballZ_file.py
Feb 20 09:31:38 np0005625204.localdomain sudo[262318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:39 np0005625204.localdomain python3.9[262320]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:31:39 np0005625204.localdomain sshd[262097]: Received disconnect from 188.166.218.64 port 46682:11: Bye Bye [preauth]
Feb 20 09:31:39 np0005625204.localdomain sshd[262097]: Disconnected from invalid user bot3 188.166.218.64 port 46682 [preauth]
Feb 20 09:31:39 np0005625204.localdomain sudo[262318]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:39 np0005625204.localdomain sudo[262428]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjdbedllcftinqrowdywleogfjhqnvff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579899.3380752-1094-67246515093766/AnsiballZ_stat.py
Feb 20 09:31:39 np0005625204.localdomain sudo[262428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:39 np0005625204.localdomain python3.9[262430]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:39 np0005625204.localdomain sudo[262428]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:40 np0005625204.localdomain sudo[262516]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grxoyzcodfequffeaflnosmipnourgcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579899.3380752-1094-67246515093766/AnsiballZ_copy.py
Feb 20 09:31:40 np0005625204.localdomain sudo[262516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:40 np0005625204.localdomain python3.9[262518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579899.3380752-1094-67246515093766/.source.json _original_basename=.b13xauaa follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:40 np0005625204.localdomain sudo[262516]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:40.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:40.321 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:40.321 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:31:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:40.321 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:40.356 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:40.356 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:41 np0005625204.localdomain python3.9[262626]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:42 np0005625204.localdomain sudo[262928]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xattkexrrxmhrpxeexejlbnhexbrrued ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579902.5431852-1214-277952299477683/AnsiballZ_container_config_data.py
Feb 20 09:31:42 np0005625204.localdomain sudo[262928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:43 np0005625204.localdomain python3.9[262930]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Feb 20 09:31:43 np0005625204.localdomain sudo[262928]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:43 np0005625204.localdomain sudo[263038]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xasjcgiqjbyawtxqntrotibifwbenvag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579903.600321-1247-241054624979481/AnsiballZ_container_config_hash.py
Feb 20 09:31:44 np0005625204.localdomain sudo[263038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:44 np0005625204.localdomain python3.9[263040]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:31:44 np0005625204.localdomain sudo[263038]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:45 np0005625204.localdomain sudo[263148]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldfsxmvkthuaeczccskbmfksprjjcekz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771579904.6114802-1277-138001812557541/AnsiballZ_edpm_container_manage.py
Feb 20 09:31:45 np0005625204.localdomain sudo[263148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:45 np0005625204.localdomain python3[263150]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:31:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:45.358 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:45.359 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:45.360 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:31:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:45.360 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:45.410 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:45.411 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:31:45 np0005625204.localdomain podman[263186]: 
Feb 20 09:31:45 np0005625204.localdomain podman[263186]: 2026-02-20 09:31:45.592065707 +0000 UTC m=+0.066922577 container create 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, config_id=neutron_dhcp, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Feb 20 09:31:45 np0005625204.localdomain podman[263186]: 2026-02-20 09:31:45.559977352 +0000 UTC m=+0.034834222 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:31:45 np0005625204.localdomain python3[263150]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:31:45 np0005625204.localdomain sudo[263148]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23775 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599DCD690000000001030307) 
Feb 20 09:31:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:31:46 np0005625204.localdomain podman[263279]: 2026-02-20 09:31:46.164862701 +0000 UTC m=+0.092294102 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:31:46 np0005625204.localdomain podman[263279]: 2026-02-20 09:31:46.180004615 +0000 UTC m=+0.107436036 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:31:46 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:31:46 np0005625204.localdomain sudo[263354]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txuleqfkzfilrcgzdecxojpdddpnaqhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579906.0108082-1301-57656433287627/AnsiballZ_stat.py
Feb 20 09:31:46 np0005625204.localdomain sudo[263354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:46 np0005625204.localdomain python3.9[263356]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:31:46 np0005625204.localdomain sudo[263354]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:47 np0005625204.localdomain sudo[263466]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jeqmvymlxunyjxaqsaiaqzscitbgcxsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579906.7764351-1328-213832990634066/AnsiballZ_file.py
Feb 20 09:31:47 np0005625204.localdomain sudo[263466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:47 np0005625204.localdomain python3.9[263468]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:47 np0005625204.localdomain sudo[263466]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:31:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:31:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:31:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149681 "" "Go-http-client/1.1"
Feb 20 09:31:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:31:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16662 "" "Go-http-client/1.1"
Feb 20 09:31:48 np0005625204.localdomain sudo[263521]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swouokikweuhyozhjdkkitscsrvlvykm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579906.7764351-1328-213832990634066/AnsiballZ_stat.py
Feb 20 09:31:48 np0005625204.localdomain sudo[263521]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:48 np0005625204.localdomain python3.9[263523]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:31:48 np0005625204.localdomain sudo[263521]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:48 np0005625204.localdomain sudo[263630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-waywuqsmanelbmbxvhrjwciaydpfwscx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579908.2732587-1328-186789700356022/AnsiballZ_copy.py
Feb 20 09:31:48 np0005625204.localdomain sudo[263630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:48 np0005625204.localdomain python3.9[263632]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771579908.2732587-1328-186789700356022/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:48 np0005625204.localdomain sudo[263630]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:49 np0005625204.localdomain sudo[263685]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrohixiebziawlojmmsxiwxcuumvjyey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579908.2732587-1328-186789700356022/AnsiballZ_systemd.py
Feb 20 09:31:49 np0005625204.localdomain sudo[263685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:49 np0005625204.localdomain python3.9[263687]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:31:49 np0005625204.localdomain systemd-sysv-generator[263712]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:31:49 np0005625204.localdomain systemd-rc-local-generator[263708]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:49 np0005625204.localdomain sudo[263685]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:50 np0005625204.localdomain sudo[263776]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlzfbxrwrdwcwoziylaldxltxvxieadc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579908.2732587-1328-186789700356022/AnsiballZ_systemd.py
Feb 20 09:31:50 np0005625204.localdomain sudo[263776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:50 np0005625204.localdomain python3.9[263778]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:31:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:50.411 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:50.418 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:31:51 np0005625204.localdomain systemd-rc-local-generator[263805]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:31:51 np0005625204.localdomain systemd-sysv-generator[263811]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:31:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 20 09:31:51 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:31:51 np0005625204.localdomain podman[263819]: 2026-02-20 09:31:51.981606419 +0000 UTC m=+0.124749130 container init 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 09:31:51 np0005625204.localdomain podman[263819]: 2026-02-20 09:31:51.993206332 +0000 UTC m=+0.136349093 container start 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=neutron_dhcp, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:31:51 np0005625204.localdomain podman[263819]: neutron_dhcp_agent
Feb 20 09:31:51 np0005625204.localdomain neutron_dhcp_agent[263834]: + sudo -E kolla_set_configs
Feb 20 09:31:51 np0005625204.localdomain systemd[1]: Started neutron_dhcp_agent container.
Feb 20 09:31:52 np0005625204.localdomain sudo[263776]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Validating config file
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Copying service configuration files
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Writing out command to execute
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: ++ cat /run_command
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + ARGS=
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + sudo kolla_copy_cacerts
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + [[ ! -n '' ]]
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + . kolla_extend_start
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + umask 0022
Feb 20 09:31:52 np0005625204.localdomain neutron_dhcp_agent[263834]: + exec /usr/bin/neutron-dhcp-agent
Feb 20 09:31:52 np0005625204.localdomain python3.9[263956]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:31:53 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.274 263838 INFO neutron.common.config [-] Logging enabled!
Feb 20 09:31:53 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.274 263838 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44
Feb 20 09:31:53 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.636 263838 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 20 09:31:53 np0005625204.localdomain sudo[264065]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbgaplhadumtfguikxyhftrbmvjhajmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579913.429464-1463-94152542863959/AnsiballZ_stat.py
Feb 20 09:31:53 np0005625204.localdomain sudo[264065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:31:53.908 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:31:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:31:53.910 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:31:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:31:53.911 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:31:53 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.911 263838 INFO neutron.agent.dhcp.agent [None req-bf66d6ec-c731-433c-b729-240ac9f8124c - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:31:53 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.911 263838 INFO neutron.agent.dhcp.agent [None req-bf66d6ec-c731-433c-b729-240ac9f8124c - - - - - -] Synchronizing state complete
Feb 20 09:31:53 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:53.940 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:53 np0005625204.localdomain python3.9[264067]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:31:53 np0005625204.localdomain sudo[264065]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:53 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:53.994 263838 INFO neutron.agent.dhcp.agent [None req-bf66d6ec-c731-433c-b729-240ac9f8124c - - - - - -] DHCP agent started
Feb 20 09:31:54 np0005625204.localdomain sudo[264155]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxroeucxezpwdqpieinrekjdsrtzrryg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579913.429464-1463-94152542863959/AnsiballZ_copy.py
Feb 20 09:31:54 np0005625204.localdomain sudo[264155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:54 np0005625204.localdomain python3.9[264157]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771579913.429464-1463-94152542863959/.source.yaml _original_basename=.4a29isd8 follow=False checksum=b9ca88bcb32671aca7ddecc5a041bae0cf925d73 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:31:54 np0005625204.localdomain sudo[264155]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:55 np0005625204.localdomain sudo[264265]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sovjxvepnwmuumzwgazihzgbeimtnmrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579914.7368805-1509-130722303059071/AnsiballZ_systemd.py
Feb 20 09:31:55 np0005625204.localdomain sudo[264265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:31:55 np0005625204.localdomain python3.9[264267]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:31:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:31:55 np0005625204.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Feb 20 09:31:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:31:55.449 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:31:55 np0005625204.localdomain podman[264269]: 2026-02-20 09:31:55.471514275 +0000 UTC m=+0.126499525 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:31:55 np0005625204.localdomain podman[264269]: 2026-02-20 09:31:55.550757497 +0000 UTC m=+0.205742817 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:31:55 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[263834]: 2026-02-20 09:31:56.061 263838 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: libpod-0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19.scope: Deactivated successfully.
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: libpod-0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19.scope: Consumed 2.011s CPU time.
Feb 20 09:31:56 np0005625204.localdomain podman[264277]: 2026-02-20 09:31:56.394377314 +0000 UTC m=+1.012619532 container died 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=neutron_dhcp, tcib_managed=true, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:31:56 np0005625204.localdomain podman[264277]: 2026-02-20 09:31:56.446625561 +0000 UTC m=+1.064867749 container cleanup 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=neutron_dhcp, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_dhcp_agent)
Feb 20 09:31:56 np0005625204.localdomain podman[264277]: neutron_dhcp_agent
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8-merged.mount: Deactivated successfully.
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19-userdata-shm.mount: Deactivated successfully.
Feb 20 09:31:56 np0005625204.localdomain podman[264334]: error opening file `/run/crun/0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19/status`: No such file or directory
Feb 20 09:31:56 np0005625204.localdomain podman[264322]: 2026-02-20 09:31:56.555896704 +0000 UTC m=+0.071099469 container cleanup 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:31:56 np0005625204.localdomain podman[264322]: neutron_dhcp_agent
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Feb 20 09:31:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:31:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:31:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:31:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:31:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:31:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:31:56 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 20 09:31:56 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bfb93fe3e758efebe66e9989b3ef4123f931108273ce599b2cb25c542451ccd8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:31:56 np0005625204.localdomain podman[264336]: 2026-02-20 09:31:56.737571045 +0000 UTC m=+0.118048189 container init 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_id=neutron_dhcp, tcib_managed=true, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:31:56 np0005625204.localdomain podman[264336]: 2026-02-20 09:31:56.746288138 +0000 UTC m=+0.126765282 container start 0ad46df0635296053bfdef8c50a6fa738709c97175a36e9a764ef8cb68a63f19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-9f31e638626c4a14991950d8eefa3df7aba4d14ad29977578a80924e617afe93'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=neutron_dhcp_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:31:56 np0005625204.localdomain podman[264336]: neutron_dhcp_agent
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + sudo -E kolla_set_configs
Feb 20 09:31:56 np0005625204.localdomain systemd[1]: Started neutron_dhcp_agent container.
Feb 20 09:31:56 np0005625204.localdomain sudo[264265]: pam_unix(sudo:session): session closed for user root
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Validating config file
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Copying service configuration files
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Writing out command to execute
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: ++ cat /run_command
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + ARGS=
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + sudo kolla_copy_cacerts
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + [[ ! -n '' ]]
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + . kolla_extend_start
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + umask 0022
Feb 20 09:31:56 np0005625204.localdomain neutron_dhcp_agent[264351]: + exec /usr/bin/neutron-dhcp-agent
Feb 20 09:31:57 np0005625204.localdomain sshd[257253]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:31:57 np0005625204.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Feb 20 09:31:57 np0005625204.localdomain systemd[1]: session-59.scope: Consumed 35.188s CPU time.
Feb 20 09:31:57 np0005625204.localdomain systemd-logind[759]: Session 59 logged out. Waiting for processes to exit.
Feb 20 09:31:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:31:57 np0005625204.localdomain systemd-logind[759]: Removed session 59.
Feb 20 09:31:57 np0005625204.localdomain podman[264383]: 2026-02-20 09:31:57.929756212 +0000 UTC m=+0.085489899 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1770267347, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:31:57 np0005625204.localdomain podman[264383]: 2026-02-20 09:31:57.945923658 +0000 UTC m=+0.101657375 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1770267347, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter)
Feb 20 09:31:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:31:57.955 264355 INFO neutron.common.config [-] Logging enabled!
Feb 20 09:31:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:31:57.955 264355 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44
Feb 20 09:31:57 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:31:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.312 264355 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 20 09:31:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.739 264355 INFO neutron.agent.dhcp.agent [None req-165f8f8a-8282-4a41-9f70-f0b00322bdc7 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:31:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.740 264355 INFO neutron.agent.dhcp.agent [None req-165f8f8a-8282-4a41-9f70-f0b00322bdc7 - - - - - -] Synchronizing state complete
Feb 20 09:31:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:31:58.768 264355 INFO neutron.agent.dhcp.agent [None req-165f8f8a-8282-4a41-9f70-f0b00322bdc7 - - - - - -] DHCP agent started
Feb 20 09:31:59 np0005625204.localdomain sshd[264404]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:32:00 np0005625204.localdomain sshd[264404]: Invalid user sol from 45.148.10.240 port 43436
Feb 20 09:32:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:32:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:32:00 np0005625204.localdomain podman[264406]: 2026-02-20 09:32:00.345152256 +0000 UTC m=+0.103443871 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:32:00 np0005625204.localdomain podman[264406]: 2026-02-20 09:32:00.378100219 +0000 UTC m=+0.136391834 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 20 09:32:00 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:32:00 np0005625204.localdomain sshd[264404]: Connection closed by invalid user sol 45.148.10.240 port 43436 [preauth]
Feb 20 09:32:00 np0005625204.localdomain podman[264422]: 2026-02-20 09:32:00.438418649 +0000 UTC m=+0.084700316 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 09:32:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:00.452 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:00.454 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:00.454 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:00.455 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:00.486 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:00.486 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:00 np0005625204.localdomain podman[264422]: 2026-02-20 09:32:00.53011469 +0000 UTC m=+0.176396317 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller)
Feb 20 09:32:00 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:32:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61346 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E07040000000001030307) 
Feb 20 09:32:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61347 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E0B280000000001030307) 
Feb 20 09:32:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23776 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E0D680000000001030307) 
Feb 20 09:32:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61348 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E13280000000001030307) 
Feb 20 09:32:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44633 DF PROTO=TCP SPT=44986 DPT=9102 SEQ=3212856262 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E17680000000001030307) 
Feb 20 09:32:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:05.487 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:05.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:05.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:05.489 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:05.490 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:05.494 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:32:05.995 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:32:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:32:05.995 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:32:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:32:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:32:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61349 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E22E80000000001030307) 
Feb 20 09:32:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:32:09 np0005625204.localdomain podman[264451]: 2026-02-20 09:32:09.137231748 +0000 UTC m=+0.076163687 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 20 09:32:09 np0005625204.localdomain podman[264451]: 2026-02-20 09:32:09.150182943 +0000 UTC m=+0.089114892 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:32:09 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:32:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:10.495 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:10.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:10.497 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:10.498 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:10.539 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:10.540 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:11 np0005625204.localdomain sudo[264470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:32:11 np0005625204.localdomain sudo[264470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:32:11 np0005625204.localdomain sudo[264470]: pam_unix(sudo:session): session closed for user root
Feb 20 09:32:11 np0005625204.localdomain sudo[264488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:32:11 np0005625204.localdomain sudo[264488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:32:11 np0005625204.localdomain sudo[264488]: pam_unix(sudo:session): session closed for user root
Feb 20 09:32:15 np0005625204.localdomain sudo[264539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:32:15 np0005625204.localdomain sudo[264539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:32:15 np0005625204.localdomain sudo[264539]: pam_unix(sudo:session): session closed for user root
Feb 20 09:32:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:15.541 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:15.543 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:15.544 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:15.544 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:15.587 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:15.588 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61350 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E43680000000001030307) 
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.335 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.335 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.335 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.336 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.336 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.816 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.870 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:32:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:16.871 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:32:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.078 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.080 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12195MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.080 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.080 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.150 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.150 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.150 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:32:17 np0005625204.localdomain systemd[1]: tmp-crun.6h2fVm.mount: Deactivated successfully.
Feb 20 09:32:17 np0005625204.localdomain podman[264579]: 2026-02-20 09:32:17.157711168 +0000 UTC m=+0.090033132 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:32:17 np0005625204.localdomain podman[264579]: 2026-02-20 09:32:17.169208858 +0000 UTC m=+0.101530802 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:32:17 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.188 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.656 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.664 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.684 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.687 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:32:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:17.687 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:32:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:32:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:32:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:32:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 09:32:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:32:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1"
Feb 20 09:32:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:18.687 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:18.688 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:18.688 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:19.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.295 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.312 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.312 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.588 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.590 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.590 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.591 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.592 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:20.596 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.301 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.301 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.500 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.501 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.501 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.502 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.899 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.914 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:32:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:21.915 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:32:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:32:24Z|00047|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 20 09:32:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:25.597 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:25.599 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:25.600 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:25.600 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:25.614 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:25.615 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:32:26 np0005625204.localdomain podman[264624]: 2026-02-20 09:32:26.148910688 +0000 UTC m=+0.080190323 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:32:26 np0005625204.localdomain podman[264624]: 2026-02-20 09:32:26.158954033 +0000 UTC m=+0.090233678 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:32:26 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:32:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:32:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:32:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:32:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:32:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:32:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:32:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:32:28 np0005625204.localdomain podman[264648]: 2026-02-20 09:32:28.146105442 +0000 UTC m=+0.080868094 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, architecture=x86_64, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:32:28 np0005625204.localdomain podman[264648]: 2026-02-20 09:32:28.158913714 +0000 UTC m=+0.093676336 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, release=1770267347, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=ubi9/ubi-minimal)
Feb 20 09:32:28 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:32:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:30.616 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:30.618 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:30.618 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:30.619 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8307 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E7C350000000001030307) 
Feb 20 09:32:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:30.647 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:30.648 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:32:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:32:30 np0005625204.localdomain podman[264668]: 2026-02-20 09:32:30.853401541 +0000 UTC m=+0.065219365 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:32:30 np0005625204.localdomain podman[264668]: 2026-02-20 09:32:30.890976327 +0000 UTC m=+0.102794171 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 20 09:32:30 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:32:30 np0005625204.localdomain podman[264669]: 2026-02-20 09:32:30.9651085 +0000 UTC m=+0.174575490 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:32:30 np0005625204.localdomain podman[264669]: 2026-02-20 09:32:30.974994419 +0000 UTC m=+0.184461479 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:32:30 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:32:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8308 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E80290000000001030307) 
Feb 20 09:32:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61351 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E83680000000001030307) 
Feb 20 09:32:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8309 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E88280000000001030307) 
Feb 20 09:32:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23777 DF PROTO=TCP SPT=41506 DPT=9102 SEQ=1882956386 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E8B680000000001030307) 
Feb 20 09:32:35 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:35.645 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8310 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599E97E80000000001030307) 
Feb 20 09:32:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:32:40 np0005625204.localdomain podman[264710]: 2026-02-20 09:32:40.143941787 +0000 UTC m=+0.081882046 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:32:40 np0005625204.localdomain podman[264710]: 2026-02-20 09:32:40.154373044 +0000 UTC m=+0.092313293 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:32:40 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:32:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:40.649 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:40.651 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:40.651 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:40.652 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:40.669 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:40 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:40.670 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:45.671 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:45.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:45.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:45.673 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:45.714 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:45 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:45.714 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8311 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EB7680000000001030307) 
Feb 20 09:32:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:32:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:32:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:32:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 09:32:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:32:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16786 "" "Go-http-client/1.1"
Feb 20 09:32:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:32:48 np0005625204.localdomain systemd[1]: tmp-crun.b1T5dK.mount: Deactivated successfully.
Feb 20 09:32:48 np0005625204.localdomain podman[264728]: 2026-02-20 09:32:48.132719044 +0000 UTC m=+0.072417760 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:32:48 np0005625204.localdomain podman[264728]: 2026-02-20 09:32:48.141320153 +0000 UTC m=+0.081018859 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:32:48 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:32:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:50.715 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:50.717 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:50.717 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:50.718 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:50.749 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:50 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:50.750 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:52 np0005625204.localdomain sshd[264751]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:32:53 np0005625204.localdomain sshd[264751]: Invalid user brandon from 27.112.79.3 port 34644
Feb 20 09:32:53 np0005625204.localdomain sshd[264751]: Received disconnect from 27.112.79.3 port 34644:11: Bye Bye [preauth]
Feb 20 09:32:53 np0005625204.localdomain sshd[264751]: Disconnected from invalid user brandon 27.112.79.3 port 34644 [preauth]
Feb 20 09:32:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:55.751 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:55.753 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:32:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:55.754 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:32:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:55.754 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:55.790 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:32:55 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:32:55.791 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:32:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:32:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:32:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:32:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:32:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:32:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:32:56 np0005625204.localdomain sshd[264753]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:32:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:32:57 np0005625204.localdomain sshd[264753]: Accepted publickey for zuul from 192.168.122.30 port 55090 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:32:57 np0005625204.localdomain systemd-logind[759]: New session 60 of user zuul.
Feb 20 09:32:57 np0005625204.localdomain systemd[1]: Started Session 60 of User zuul.
Feb 20 09:32:57 np0005625204.localdomain systemd[1]: tmp-crun.H7KjML.mount: Deactivated successfully.
Feb 20 09:32:57 np0005625204.localdomain podman[264755]: 2026-02-20 09:32:57.152838141 +0000 UTC m=+0.091137306 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:32:57 np0005625204.localdomain sshd[264753]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:32:57 np0005625204.localdomain podman[264755]: 2026-02-20 09:32:57.16397661 +0000 UTC m=+0.102275785 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:32:57 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:32:58 np0005625204.localdomain python3.9[264885]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:32:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:32:59 np0005625204.localdomain podman[264923]: 2026-02-20 09:32:59.141727906 +0000 UTC m=+0.074872327 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:32:59 np0005625204.localdomain podman[264923]: 2026-02-20 09:32:59.158043196 +0000 UTC m=+0.091187587 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, distribution-scope=public, version=9.7, vendor=Red Hat, Inc., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 
'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:32:59 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:32:59 np0005625204.localdomain python3.9[265016]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:32:59 np0005625204.localdomain network[265033]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:32:59 np0005625204.localdomain network[265034]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:32:59 np0005625204.localdomain network[265035]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:33:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58671 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EF1640000000001030307) 
Feb 20 09:33:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:00.792 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:00.794 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:00.794 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:00.794 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:00.834 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:00 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:00.835 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:33:00 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:33:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:33:01 np0005625204.localdomain podman[265075]: 2026-02-20 09:33:01.049154248 +0000 UTC m=+0.091828707 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:33:01 np0005625204.localdomain podman[265089]: 2026-02-20 09:33:01.113737562 +0000 UTC m=+0.085279224 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:33:01 np0005625204.localdomain podman[265075]: 2026-02-20 09:33:01.133721347 +0000 UTC m=+0.176395846 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:33:01 np0005625204.localdomain podman[265089]: 2026-02-20 09:33:01.144269358 +0000 UTC m=+0.115810980 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 20 09:33:01 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:33:01 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:33:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58672 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EF5680000000001030307) 
Feb 20 09:33:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8312 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EF7680000000001030307) 
Feb 20 09:33:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58673 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599EFD690000000001030307) 
Feb 20 09:33:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61352 DF PROTO=TCP SPT=52246 DPT=9102 SEQ=4102847731 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F01680000000001030307) 
Feb 20 09:33:05 np0005625204.localdomain sudo[265310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qnsrpdybdofmkqkpitlvsngvgtgskydg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579985.475247-98-120479780986195/AnsiballZ_setup.py
Feb 20 09:33:05 np0005625204.localdomain sudo[265310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:05.836 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:05.839 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:05.839 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:05.839 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:05.879 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:05 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:05.879 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:33:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:33:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:33:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:33:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:33:05.998 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:33:06 np0005625204.localdomain python3.9[265312]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 20 09:33:06 np0005625204.localdomain sudo[265310]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:06 np0005625204.localdomain sudo[265373]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucozlfvppxjlpfdyrysgpuiwkgjfalig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579985.475247-98-120479780986195/AnsiballZ_dnf.py
Feb 20 09:33:06 np0005625204.localdomain sudo[265373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:07 np0005625204.localdomain python3.9[265375]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:33:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58674 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F0D280000000001030307) 
Feb 20 09:33:10 np0005625204.localdomain sudo[265373]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:10.880 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:10.882 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:10.883 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:10.883 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:10.911 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:10 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:10.912 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:33:11 np0005625204.localdomain podman[265466]: 2026-02-20 09:33:11.163971748 +0000 UTC m=+0.095095430 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:33:11 np0005625204.localdomain podman[265466]: 2026-02-20 09:33:11.208100671 +0000 UTC m=+0.139224343 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute)
Feb 20 09:33:11 np0005625204.localdomain sudo[265504]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvsoxdztfupknxapkixonljbqruqebfb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579990.5325758-134-90971544949889/AnsiballZ_stat.py
Feb 20 09:33:11 np0005625204.localdomain sudo[265504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:11 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:33:11 np0005625204.localdomain python3.9[265506]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:33:11 np0005625204.localdomain sudo[265504]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:12 np0005625204.localdomain sudo[265615]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iajerjufbdvufedjzrdgitkdzgwgxlhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579991.7031553-165-255968340055023/AnsiballZ_command.py
Feb 20 09:33:12 np0005625204.localdomain sudo[265615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:12 np0005625204.localdomain python3.9[265617]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:33:12 np0005625204.localdomain sudo[265615]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:13 np0005625204.localdomain sudo[265726]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nveqqrcqkumsedungyufadlawazhjids ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579993.4611177-195-251080283493281/AnsiballZ_stat.py
Feb 20 09:33:13 np0005625204.localdomain sudo[265726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:14 np0005625204.localdomain python3.9[265728]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:33:14 np0005625204.localdomain sudo[265726]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:15 np0005625204.localdomain sudo[265819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:33:15 np0005625204.localdomain sudo[265819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:33:15 np0005625204.localdomain sudo[265819]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:15 np0005625204.localdomain sudo[265837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 09:33:15 np0005625204.localdomain sudo[265837]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:33:15 np0005625204.localdomain sudo[265874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsuptzmfquotqqqsmfbdvtvkwsgzzpwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579994.409323-228-4440456135628/AnsiballZ_lineinfile.py
Feb 20 09:33:15 np0005625204.localdomain sudo[265874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:15 np0005625204.localdomain python3.9[265876]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:15 np0005625204.localdomain sudo[265874]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:15 np0005625204.localdomain sudo[265837]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:15 np0005625204.localdomain sudo[265915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:33:15 np0005625204.localdomain sudo[265915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:33:15 np0005625204.localdomain sudo[265915]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:15 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:15.937 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:15 np0005625204.localdomain sudo[265933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:33:15 np0005625204.localdomain sudo[265933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:33:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58675 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F2D680000000001030307) 
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.353 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.355 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.356 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.356 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.357 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:33:16 np0005625204.localdomain sudo[265933]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:16 np0005625204.localdomain sudo[266093]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwitrujtpyaospyegtbxocyrqneifohk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579996.02628-255-150403473715124/AnsiballZ_systemd_service.py
Feb 20 09:33:16 np0005625204.localdomain sudo[266093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:16.830 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:33:16 np0005625204.localdomain python3.9[266095]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.016 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.017 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:33:17 np0005625204.localdomain sudo[266093]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.203 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.204 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12150MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.204 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.205 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:33:17 np0005625204.localdomain sudo[266117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:33:17 np0005625204.localdomain sudo[266117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:33:17 np0005625204.localdomain sudo[266117]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.304 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.305 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.305 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.348 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:33:17 np0005625204.localdomain sudo[266245]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktijgkcehmbowuzbebyaobvhxjedffou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771579997.263741-279-265164351929285/AnsiballZ_systemd_service.py
Feb 20 09:33:17 np0005625204.localdomain sudo[266245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:33:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:33:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:33:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 09:33:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:33:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1"
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.842 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.494s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.851 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:33:17 np0005625204.localdomain python3.9[266247]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.866 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.868 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:33:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:17.869 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:33:17 np0005625204.localdomain sudo[266245]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.202 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.207 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 93 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5be686c-ccca-4abf-802e-52fead763937', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 93, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.204273', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '305952f2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'c984a7f767a135ff181801d95ea22426530434aa279863fbef10421406d5064b'}]}, 'timestamp': '2026-02-20 09:33:18.208851', '_unique_id': 'b233ebc65907420cb30ad19525dcce4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.210 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.211 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fea40532-8df8-40a3-90a7-9f1cc54e7211', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.211691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '305f90fe-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '182272a1572c9d36e179103ff6a542b3fa86b75d1a6eaa2c3a38de6036e73e7c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.211691', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '305fa670-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'e4274fd0e753dee990e005c22c96c244766eb9719a36e8fb80aebc263f6dd8db'}]}, 'timestamp': '2026-02-20 09:33:18.250166', '_unique_id': 'd3201ef5809842e5b076f11fe4385805'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '50cd560c-3c5d-4453-9a95-4196af5eebcf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.253030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306028b6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'eab1c20a5e6df18bbe7a4c1ddb3f018505bdaeefecb140d385f26a7d968371d4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.253030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '30603a9a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '0bda803e95b23cb5192c856143e8f03ae39f0076d8b3613f387919dc268be7ed'}]}, 'timestamp': '2026-02-20 09:33:18.253947', '_unique_id': 'd2a5feab5ba44e83922652a75b1800c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72b8f962-c07a-4c18-b6ef-1068480f2cd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.256217', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '3060a520-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '540bf68fb6ce6970cd2c452d08bdca89109ca10936f9d168194e6e70361ef0fe'}]}, 'timestamp': '2026-02-20 09:33:18.256743', '_unique_id': '703d54ef39e14937baa8047a0965c455'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2f825e1-6d0e-4030-8e63-2da35cb1cb6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:33:18.258927', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '30641e94-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.518038794, 'message_signature': '58be7d3e722b4d98a3a486ca94a2f3c8c23c56a2171d42c36975b5d36996d743'}]}, 'timestamp': '2026-02-20 09:33:18.279462', '_unique_id': 'b3f417aeeabf406e9010a3d3b479e3b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd264440a-0741-465b-b055-82f47fd397da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.281707', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '30648884-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'de1e408e8fbafbee57093afa7d3dce5af375d58d40c39c4604e6b642d74c6dcb'}]}, 'timestamp': '2026-02-20 09:33:18.282184', '_unique_id': '946dbee8e6384ba290cb0a7e3bfd5595'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.283 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7baa03fa-6fde-456d-a84c-24789f32d7fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.284405', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '3064f2d8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '1a273062535fa3d1edf4a283b3028ee56710221636cc44756030902a5f69c68e'}]}, 'timestamp': '2026-02-20 09:33:18.284909', '_unique_id': '40f981c3aecc4e259d0c8209e87128d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 61130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '946d53bb-463c-4f44-b2c6-edd8a6de83a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 61130000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:33:18.287052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '306558b8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.518038794, 'message_signature': '5f330bb91c7ef305285fb1cdfe805420c308b4f3837bd0c24f443ab0dd5bf3e3'}]}, 'timestamp': '2026-02-20 09:33:18.287496', '_unique_id': '260b382542b84f1da57adf0dc37dff57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.289 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '93459998-0cb0-4ed8-8cca-d3fb5e0805a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.290129', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '3065d1b2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'a8af50ad4be9ca208485529fea8254215654d5c333a97e9ddcc515450fc23192'}]}, 'timestamp': '2026-02-20 09:33:18.290609', '_unique_id': '93433b16cd6f4be0934a1b401af68a7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3cb83d1e-c5c1-43c0-9ca6-f897377e97a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.292790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306638f0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'd7d6a6b4506a23d7cb3820b4d818a83b3bb0b5df26538b8c5d360cd900a43a68'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.292790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306649b2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '9e677835955acf553570d54394cfc17ce1cd577cdc75acfe3e15f3405cdd7e1e'}]}, 'timestamp': '2026-02-20 09:33:18.293678', '_unique_id': 'c56bad40d7894feeb6426da77475f8e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94509075-082f-4de0-b8b9-eebcd1f55eac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.295975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3066b578-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '767cc8ce06aa9e8150c94f99f814d074fbaad38a7ba8b30973466568c3225c89'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.295975', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3066c5f4-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '4f9da4db6dd42a012d3276ef51311edea6dae254d604089c153ceb4eaec784b1'}]}, 'timestamp': '2026-02-20 09:33:18.296860', '_unique_id': '4bf1b26eb3df465ba4af31a1d5fcd91b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b73d3d59-25de-4e78-a246-e701bccbfc11', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 446, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.299012', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '30672c38-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'bbf995a2d3adadbaba20923a961ba26bb3c55e2ab7764c97116e116503dc1d90'}]}, 'timestamp': '2026-02-20 09:33:18.299479', '_unique_id': '6e16ba6fbc42499ea8c07d86c4da093b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8ff73fba-3ca0-4de8-a06e-b54190abc7bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.301838', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '3069e8e2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '1e4951343a73009f8e713a7777fbd031e201d22d2297f7a6a50283873781676e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.301838', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '3069fc1a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '5c5ec7b5ac4b07e7abaa35d9357d77edbe79e5e9f9ae108e9f55a058c52ce0a3'}]}, 'timestamp': '2026-02-20 09:33:18.317895', '_unique_id': 'e95298657fe34a51902bdac39799e829'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c599811-144e-4842-badb-9df62c63433a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.320215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306a689e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '35f94e474f2504c3415de75f47f1a1889dcf0eedcbb14297fea8ab1ac007adc3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.320215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306a7abe-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': 'e284ae35b6fe3de4098a39353b19dea92fc177ab5b380ab6a065ba3cd67ed866'}]}, 'timestamp': '2026-02-20 09:33:18.321128', '_unique_id': '0b509875283a438da1a63079193ed2b2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.323 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '39a59869-206d-4ab7-b179-db2a8a802850', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.323347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306ae2ce-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '755283869f54fb5a096f8405c593ab53744fa1cdb9a523eca318f0404532ce35'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.323347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306af6c4-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '9ac27200c0a449951e20650aaa25fe76719d76f2615c747086c8e06cfb16fdbb'}]}, 'timestamp': '2026-02-20 09:33:18.324301', '_unique_id': 'c1d1c7f7eca3469c9fa17765dca573f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.325 12 ERROR oslo_messaging.notify.messaging 
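[annotation] The traceback above shows two linked exceptions separated by the banner "The above exception was the direct cause of the following exception": the low-level `ConnectionRefusedError` from `amqp/transport.py` is caught by kombu's `_reraise_as_library_errors` and re-raised as `kombu.exceptions.OperationalError` via `raise ... from exc`. A minimal sketch of that chaining pattern (hypothetical stand-in names, not the kombu source):

```python
class OperationalError(Exception):
    """Stand-in for kombu.exceptions.OperationalError."""

def connect_refused():
    # Simulates amqp's transport failing at self.sock.connect(sa).
    raise ConnectionRefusedError(111, "Connection refused")

def ensure_connection():
    try:
        connect_refused()
    except ConnectionRefusedError as exc:
        # Same shape as kombu's _reraise_as_library_errors: chaining with
        # `from exc` stores the socket error in __cause__, which is what
        # prints the "direct cause" banner between the two tracebacks.
        raise OperationalError(str(exc)) from exc

try:
    ensure_connection()
except OperationalError as err:
    # The original socket-level error survives on the chained exception.
    print(type(err.__cause__).__name__)
```

Because the original error is preserved in `__cause__`, the log shows both tracebacks even though only `OperationalError` propagates to oslo.messaging.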
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6b654ba-21ae-416f-9941-3110d8bbfb2c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.326670', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306b6596-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'c7ff80cb2ef4c0beca3687363bcb8f6c4fa15d4f8db2116bb0361e32b143416e'}]}, 'timestamp': '2026-02-20 09:33:18.327162', '_unique_id': 'f5582a73788045f4aa859464352b9c8c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.328 12 ERROR oslo_messaging.notify.messaging 
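[annotation] `[Errno 111] Connection refused` (ECONNREFUSED) means the TCP SYN reached the target host but nothing was listening on the port, i.e. the RabbitMQ broker the agent's transport_url points at is down or not bound there. A hedged diagnostic sketch; the host and port below are assumptions (5672 is the default AMQP port), substitute the actual transport_url from the agent's configuration:

```python
import errno
import socket

def probe(host: str, port: int, timeout: float = 3.0) -> int:
    """Return 0 if a TCP connect succeeds, else the socket errno."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        # connect_ex returns the errno instead of raising, which makes it
        # convenient for a quick reachability check.
        return s.connect_ex((host, port))

rc = probe("127.0.0.1", 5672)
if rc == 0:
    print("broker reachable")
elif rc == errno.ECONNREFUSED:
    print("connection refused (errno 111): nothing listening on that port")
else:
    print(f"connect failed with errno {rc}")
```

A result of errno 111 points at the broker process itself (stopped, crashed, or listening on a different interface) rather than at routing or firewall drops, which typically surface as timeouts instead.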
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '203b8679-49e7-4432-be39-8cf2bc7e6346', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.329339', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306bcce8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '467bb59919beb077614727815768d1d917ecfd6e2a60de7c1301d1b56699c377'}]}, 'timestamp': '2026-02-20 09:33:18.329844', '_unique_id': '0f390656fb4b41bd8feab2407ff4fa70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b4169477-f5e1-4abd-8ab7-51045d60d99a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.332036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306c36d8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': '1279598e86ebedd13f794bc43a63a36b99e56f71d9ffbcb5f50a9e20ded83549'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.332036', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306c48c6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.541047576, 'message_signature': 'a56d213f62dd778c114b189d76dc082e4e4c0b11bd66213ebee9b430934cb073'}]}, 'timestamp': '2026-02-20 09:33:18.332953', '_unique_id': '42d223afb20243c290e14576bd8107f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17ba3e1d-9732-495d-8461-7031c2abc42e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.335157', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306cb068-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': '5abfbe06d062a269960ba5ab481c09c75e5e889ac31c531208bcbfc15505fac1'}]}, 'timestamp': '2026-02-20 09:33:18.335664', '_unique_id': '160311d14912418da848f8428fa015ea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.337 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9a7b48f7-3cda-4516-9bb1-5d3c31dcfd20', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:33:18.337828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '306d185a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': '432b10e0016047247f95394e36fd3c6e4baba78bdd2a7807e8fbbbf0b1187f13'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:33:18.337828', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '306d293a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.450938037, 'message_signature': 'e8d805505b9bc88577a245421d7fe8b7b6d93f1771a5524f1fdf71acbda7c9b1'}]}, 'timestamp': '2026-02-20 09:33:18.338736', '_unique_id': '440f4b798b9545a694ef3ebbaf875bf3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.340 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 9437 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81204abe-78c0-43f6-993a-c4b24b9fae7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9437, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:33:18.340958', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '306d92f8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10557.443465995, 'message_signature': 'cf4a60eebe80ac8d8bff8da9c914ad9e72564810dd1b5385780bf23b6a623996'}]}, 'timestamp': '2026-02-20 09:33:18.341436', '_unique_id': 'f3250cd36c994224b01220f36e2862de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:33:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:33:18.342 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:33:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:18.869 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:18.870 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:18.870 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:19 np0005625204.localdomain python3.9[266359]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:33:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:33:19 np0005625204.localdomain network[266382]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:33:19 np0005625204.localdomain network[266383]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:33:19 np0005625204.localdomain network[266384]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:33:19 np0005625204.localdomain podman[266368]: 2026-02-20 09:33:19.157940157 +0000 UTC m=+0.086471248 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:33:19 np0005625204.localdomain podman[266368]: 2026-02-20 09:33:19.168982088 +0000 UTC m=+0.097513149 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:33:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:19.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:19 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.295 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:20 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.940 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.942 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.942 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.942 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.985 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:20.986 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:22.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:22.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.374 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.374 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.374 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:33:23 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:23.375 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:33:23 np0005625204.localdomain sudo[266630]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjiajnzrjkzgnmmyjsschuqpxepzlrcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580003.262013-347-87813340249663/AnsiballZ_dnf.py
Feb 20 09:33:23 np0005625204.localdomain sudo[266630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:23 np0005625204.localdomain python3.9[266632]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:33:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:24.072 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:33:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:24.092 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:33:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:24.092 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:33:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:25.988 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:25.990 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:25.991 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:25.991 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:25.992 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:25.993 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:33:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:33:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:33:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:33:27 np0005625204.localdomain sudo[266630]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:33:28 np0005625204.localdomain systemd[1]: tmp-crun.c7tFEi.mount: Deactivated successfully.
Feb 20 09:33:28 np0005625204.localdomain podman[266706]: 2026-02-20 09:33:28.166557879 +0000 UTC m=+0.099867333 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:33:28 np0005625204.localdomain podman[266706]: 2026-02-20 09:33:28.181012456 +0000 UTC m=+0.114321940 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:33:28 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:33:28 np0005625204.localdomain sudo[266764]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsxkqhgcbzkcdxgaodgydfsxsedwmzax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580007.788052-374-111261019277943/AnsiballZ_file.py
Feb 20 09:33:28 np0005625204.localdomain sudo[266764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:28 np0005625204.localdomain python3.9[266766]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 09:33:28 np0005625204.localdomain sudo[266764]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:29 np0005625204.localdomain sudo[266874]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjnsfdjclcjfvrrobgcpogxsfpzrtqwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580008.738559-399-37113431935900/AnsiballZ_modprobe.py
Feb 20 09:33:29 np0005625204.localdomain sudo[266874]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:29 np0005625204.localdomain python3.9[266876]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 20 09:33:29 np0005625204.localdomain sudo[266874]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:29 np0005625204.localdomain sudo[266984]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlpzaskmbscqtfakkggrcopgkghkqvvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580009.5219681-423-208306427800011/AnsiballZ_stat.py
Feb 20 09:33:29 np0005625204.localdomain sudo[266984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:33:29 np0005625204.localdomain podman[266987]: 2026-02-20 09:33:29.955105465 +0000 UTC m=+0.079126620 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 20 09:33:29 np0005625204.localdomain podman[266987]: 2026-02-20 09:33:29.972140162 +0000 UTC m=+0.096161357 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, name=ubi9/ubi-minimal, release=1770267347, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:33:29 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:33:30 np0005625204.localdomain python3.9[266986]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:33:30 np0005625204.localdomain sudo[266984]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:30 np0005625204.localdomain sudo[267059]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmvuhdgzuhqyuhuamhwduaitexkdpkfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580009.5219681-423-208306427800011/AnsiballZ_file.py
Feb 20 09:33:30 np0005625204.localdomain sudo[267059]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:30 np0005625204.localdomain sshd[267062]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:33:30 np0005625204.localdomain python3.9[267061]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:30 np0005625204.localdomain sudo[267059]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18810 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F66950000000001030307) 
Feb 20 09:33:30 np0005625204.localdomain sshd[267062]: Received disconnect from 54.36.99.29 port 37702:11: Bye Bye [preauth]
Feb 20 09:33:30 np0005625204.localdomain sshd[267062]: Disconnected from authenticating user root 54.36.99.29 port 37702 [preauth]
Feb 20 09:33:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:30.994 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:30.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:30.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:30 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:30.996 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:31.021 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:31.022 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:31 np0005625204.localdomain sudo[267171]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvugjrltipsehhftiqzwtictchpzvwya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580010.915089-461-249464242779364/AnsiballZ_lineinfile.py
Feb 20 09:33:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:33:31 np0005625204.localdomain sudo[267171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:33:31 np0005625204.localdomain podman[267175]: 2026-02-20 09:33:31.315709964 +0000 UTC m=+0.100695768 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 20 09:33:31 np0005625204.localdomain podman[267173]: 2026-02-20 09:33:31.342203214 +0000 UTC m=+0.127057544 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:33:31 np0005625204.localdomain podman[267175]: 2026-02-20 09:33:31.353170643 +0000 UTC m=+0.138156457 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:33:31 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:33:31 np0005625204.localdomain python3.9[267174]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:31 np0005625204.localdomain sudo[267171]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:31 np0005625204.localdomain podman[267173]: 2026-02-20 09:33:31.413110999 +0000 UTC m=+0.197965279 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Feb 20 09:33:31 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:33:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18811 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F6AA80000000001030307) 
Feb 20 09:33:31 np0005625204.localdomain sudo[267325]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwboxqsqchgjycrtgggcakblxmmbfgrx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580011.6618998-489-222496472653373/AnsiballZ_command.py
Feb 20 09:33:31 np0005625204.localdomain sudo[267325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:32 np0005625204.localdomain python3.9[267327]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:33:32 np0005625204.localdomain sudo[267325]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58676 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F6D690000000001030307) 
Feb 20 09:33:32 np0005625204.localdomain sudo[267436]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izvdqxmgkvmikqczmhkelbopogfyvlbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580012.3359897-513-98678847626996/AnsiballZ_command.py
Feb 20 09:33:32 np0005625204.localdomain sudo[267436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:32 np0005625204.localdomain python3.9[267438]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:33:32 np0005625204.localdomain sudo[267436]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:33 np0005625204.localdomain sudo[267547]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyabsjszuzlavkzbipikblpsdoxgliqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580013.1223047-540-88492971365743/AnsiballZ_stat.py
Feb 20 09:33:33 np0005625204.localdomain sudo[267547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:33 np0005625204.localdomain python3.9[267549]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:33:33 np0005625204.localdomain sudo[267547]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18812 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F72A90000000001030307) 
Feb 20 09:33:34 np0005625204.localdomain sudo[267659]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opokrqmbbptutpfubvtzeknnffnnmfbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580013.930674-570-20406878475604/AnsiballZ_command.py
Feb 20 09:33:34 np0005625204.localdomain sudo[267659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:34 np0005625204.localdomain python3.9[267661]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:33:34 np0005625204.localdomain sudo[267659]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8313 DF PROTO=TCP SPT=53598 DPT=9102 SEQ=1136214941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F75690000000001030307) 
Feb 20 09:33:35 np0005625204.localdomain sudo[267770]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isvyqazqgeuqkefaxbfkukkuboeukyuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580014.735051-600-211990262301839/AnsiballZ_replace.py
Feb 20 09:33:35 np0005625204.localdomain sudo[267770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:35 np0005625204.localdomain python3.9[267772]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:35 np0005625204.localdomain sudo[267770]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:35 np0005625204.localdomain sudo[267880]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ythivrxookjfnibobiwhodcendjhnzmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580015.5749176-627-266011446798197/AnsiballZ_lineinfile.py
Feb 20 09:33:35 np0005625204.localdomain sudo[267880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:36 np0005625204.localdomain python3.9[267882]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:36.023 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:36.024 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:36.025 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:36.025 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:36 np0005625204.localdomain sudo[267880]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:36.056 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:36.057 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:36 np0005625204.localdomain sudo[267990]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aclexmmhdynzgjmpdwjrexfethmkesgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580016.2361543-627-184734003308415/AnsiballZ_lineinfile.py
Feb 20 09:33:36 np0005625204.localdomain sudo[267990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:36 np0005625204.localdomain python3.9[267992]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:36 np0005625204.localdomain sudo[267990]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:37 np0005625204.localdomain sudo[268100]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyxvxrppyirxinrpkytvikwltkmduluh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580016.8331056-627-62693575121370/AnsiballZ_lineinfile.py
Feb 20 09:33:37 np0005625204.localdomain sudo[268100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:37 np0005625204.localdomain python3.9[268102]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:37 np0005625204.localdomain sudo[268100]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:37 np0005625204.localdomain sudo[268210]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfzpickkrusttnzbufphddmfpmgzytfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580017.428362-627-113329198016678/AnsiballZ_lineinfile.py
Feb 20 09:33:37 np0005625204.localdomain sudo[268210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18813 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599F82680000000001030307) 
Feb 20 09:33:37 np0005625204.localdomain python3.9[268212]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:37 np0005625204.localdomain sudo[268210]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:38 np0005625204.localdomain sudo[268320]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tabeyxcomkzbhvxtnickpjuuerypoqhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580018.10668-714-145483876129245/AnsiballZ_stat.py
Feb 20 09:33:38 np0005625204.localdomain sudo[268320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:38 np0005625204.localdomain python3.9[268322]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:33:38 np0005625204.localdomain sudo[268320]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:39 np0005625204.localdomain sudo[268432]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uawvuryjmrkuhuaopmakdjhvnaitnwpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580018.9705682-744-119338828610835/AnsiballZ_systemd_service.py
Feb 20 09:33:39 np0005625204.localdomain sudo[268432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:39 np0005625204.localdomain python3.9[268434]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:33:39 np0005625204.localdomain sudo[268432]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:40 np0005625204.localdomain sudo[268544]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbluphbsnxbszkttflcohhcomonvnfit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580019.8102899-768-215346726686452/AnsiballZ_systemd_service.py
Feb 20 09:33:40 np0005625204.localdomain sudo[268544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:40 np0005625204.localdomain python3.9[268546]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:33:40 np0005625204.localdomain sudo[268544]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:41 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:41.057 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:33:42 np0005625204.localdomain systemd[1]: tmp-crun.dKL4Yr.mount: Deactivated successfully.
Feb 20 09:33:42 np0005625204.localdomain podman[268612]: 2026-02-20 09:33:42.152627103 +0000 UTC m=+0.093856626 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:33:42 np0005625204.localdomain podman[268612]: 2026-02-20 09:33:42.166983298 +0000 UTC m=+0.108212831 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute)
Feb 20 09:33:42 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:33:42 np0005625204.localdomain sudo[268675]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymdhtkblnibvjjrdcmwbnxvqnjthvwhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580021.9552832-804-146023783550074/AnsiballZ_file.py
Feb 20 09:33:42 np0005625204.localdomain sudo[268675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:42 np0005625204.localdomain python3.9[268677]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 20 09:33:42 np0005625204.localdomain sudo[268675]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:42 np0005625204.localdomain sudo[268785]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wghknfdebzdbfehutsyioqijnrfseavk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580022.6382012-827-279727971663009/AnsiballZ_modprobe.py
Feb 20 09:33:42 np0005625204.localdomain sudo[268785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:43 np0005625204.localdomain python3.9[268787]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 20 09:33:43 np0005625204.localdomain sudo[268785]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:43 np0005625204.localdomain sudo[268895]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olktsssuzmncrlbjwxevpfirsttvjwwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580023.413093-852-89887713421439/AnsiballZ_stat.py
Feb 20 09:33:43 np0005625204.localdomain sudo[268895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:43 np0005625204.localdomain python3.9[268897]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:33:43 np0005625204.localdomain sudo[268895]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:44 np0005625204.localdomain sudo[268952]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frgqqawzigpjkddrsdmrolvixmwafmuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580023.413093-852-89887713421439/AnsiballZ_file.py
Feb 20 09:33:44 np0005625204.localdomain sudo[268952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:44 np0005625204.localdomain python3.9[268954]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:44 np0005625204.localdomain sudo[268952]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:45 np0005625204.localdomain sudo[269062]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwtahfifkapmtstkjmxvjkebiuwyqxou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580024.912477-890-101264874927695/AnsiballZ_lineinfile.py
Feb 20 09:33:45 np0005625204.localdomain sudo[269062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:45 np0005625204.localdomain python3.9[269064]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:45 np0005625204.localdomain sudo[269062]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:46.059 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:46.062 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:46.062 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:46.062 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:46.106 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:46.107 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18814 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FA3680000000001030307) 
Feb 20 09:33:46 np0005625204.localdomain sudo[269172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tognvjqscpxpnluewxkodkxhiuetmhnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580025.705978-918-119812943151748/AnsiballZ_dnf.py
Feb 20 09:33:46 np0005625204.localdomain sudo[269172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:46 np0005625204.localdomain python3.9[269174]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 20 09:33:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:33:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:33:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:33:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 09:33:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:33:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16794 "" "Go-http-client/1.1"
Feb 20 09:33:49 np0005625204.localdomain sudo[269172]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:33:50 np0005625204.localdomain systemd[1]: tmp-crun.GELx2G.mount: Deactivated successfully.
Feb 20 09:33:50 np0005625204.localdomain podman[269194]: 2026-02-20 09:33:50.154732127 +0000 UTC m=+0.091432111 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:33:50 np0005625204.localdomain podman[269194]: 2026-02-20 09:33:50.188761351 +0000 UTC m=+0.125461305 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:33:50 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:33:50 np0005625204.localdomain python3.9[269308]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 20 09:33:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:51.108 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:51.110 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:51.110 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:51.110 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:51.150 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:51.151 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:51 np0005625204.localdomain sudo[269420]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzyyoixcygazsejnrmrvjlxtihxgcumq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580031.2641227-970-3599544808212/AnsiballZ_file.py
Feb 20 09:33:51 np0005625204.localdomain sudo[269420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:51 np0005625204.localdomain python3.9[269422]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:33:51 np0005625204.localdomain sudo[269420]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:52 np0005625204.localdomain sudo[269530]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwtpqrisawmitjkgdvgeywvehsahxkmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580032.1845918-1003-53948160835705/AnsiballZ_systemd_service.py
Feb 20 09:33:52 np0005625204.localdomain sudo[269530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:52 np0005625204.localdomain python3.9[269532]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:33:52 np0005625204.localdomain systemd-rc-local-generator[269558]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:33:52 np0005625204.localdomain systemd-sysv-generator[269564]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:52 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:33:53 np0005625204.localdomain sudo[269530]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:53 np0005625204.localdomain python3.9[269676]: ansible-ansible.builtin.service_facts Invoked
Feb 20 09:33:54 np0005625204.localdomain network[269693]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 20 09:33:54 np0005625204.localdomain network[269694]: 'network-scripts' will be removed from distribution in near future.
Feb 20 09:33:54 np0005625204.localdomain network[269695]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 20 09:33:55 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:33:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:56.153 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:56.155 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:33:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:56.155 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:33:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:56.156 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:56.185 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:33:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:33:56.186 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:33:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:33:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:33:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:33:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:33:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:33:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:33:58 np0005625204.localdomain sshd[269893]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:33:58 np0005625204.localdomain sudo[269927]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iizsvkarvisjwyjxmymuokocyxqyhpqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580038.4018111-1060-226378563639877/AnsiballZ_systemd_service.py
Feb 20 09:33:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:33:58 np0005625204.localdomain sudo[269927]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:58 np0005625204.localdomain systemd[1]: tmp-crun.AaLob2.mount: Deactivated successfully.
Feb 20 09:33:58 np0005625204.localdomain podman[269929]: 2026-02-20 09:33:58.795234014 +0000 UTC m=+0.096814068 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:33:58 np0005625204.localdomain podman[269929]: 2026-02-20 09:33:58.812918831 +0000 UTC m=+0.114498845 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:33:58 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:33:58 np0005625204.localdomain sshd[269893]: Invalid user  from 64.62.197.211 port 38223
Feb 20 09:33:59 np0005625204.localdomain python3.9[269930]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:33:59 np0005625204.localdomain sudo[269927]: pam_unix(sudo:session): session closed for user root
Feb 20 09:33:59 np0005625204.localdomain sudo[270061]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcfbhupufpcopzaioifkaoxrbxpkcqoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580039.1551454-1060-29336454353751/AnsiballZ_systemd_service.py
Feb 20 09:33:59 np0005625204.localdomain sudo[270061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:33:59 np0005625204.localdomain python3.9[270063]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:33:59 np0005625204.localdomain sudo[270061]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:34:00 np0005625204.localdomain sudo[270172]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkyytcmhctakwfdsfqwfyoxtpskhcrlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580039.8388746-1060-40171157279024/AnsiballZ_systemd_service.py
Feb 20 09:34:00 np0005625204.localdomain sudo[270172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:00 np0005625204.localdomain systemd[1]: tmp-crun.BG06HK.mount: Deactivated successfully.
Feb 20 09:34:00 np0005625204.localdomain podman[270174]: 2026-02-20 09:34:00.158720433 +0000 UTC m=+0.091310038 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, 
com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, version=9.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z)
Feb 20 09:34:00 np0005625204.localdomain podman[270174]: 2026-02-20 09:34:00.175945746 +0000 UTC m=+0.108535361 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, 
container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git)
Feb 20 09:34:00 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:34:00 np0005625204.localdomain python3.9[270180]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:34:00 np0005625204.localdomain sudo[270172]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59202 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FDBC40000000001030307) 
Feb 20 09:34:00 np0005625204.localdomain sudo[270305]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tedgpeohgazvsbeppehmfvcvrtlgaxnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580040.5036516-1060-32416889521901/AnsiballZ_systemd_service.py
Feb 20 09:34:00 np0005625204.localdomain sudo[270305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:01 np0005625204.localdomain python3.9[270307]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:34:01 np0005625204.localdomain sudo[270305]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:01.187 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:01.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:01.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:01.189 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:01.223 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:01.224 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:01 np0005625204.localdomain sudo[270416]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnuynjtrfkkakzkggltkwovuuykfdqjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580041.2068095-1060-86980877168181/AnsiballZ_systemd_service.py
Feb 20 09:34:01 np0005625204.localdomain sudo[270416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:34:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:34:01 np0005625204.localdomain podman[270420]: 2026-02-20 09:34:01.562970033 +0000 UTC m=+0.080608997 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:34:01 np0005625204.localdomain podman[270420]: 2026-02-20 09:34:01.568973838 +0000 UTC m=+0.086612782 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 20 09:34:01 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:34:01 np0005625204.localdomain podman[270419]: 2026-02-20 09:34:01.613482886 +0000 UTC m=+0.132932696 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:34:01 np0005625204.localdomain podman[270419]: 2026-02-20 09:34:01.654079063 +0000 UTC m=+0.173528833 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:34:01 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:34:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59203 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FDFE80000000001030307) 
Feb 20 09:34:01 np0005625204.localdomain python3.9[270418]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:34:01 np0005625204.localdomain sudo[270416]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:02 np0005625204.localdomain sudo[270568]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvasolubhgshkrghtnlecgtrrhwkvxjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580041.928343-1060-201780762137809/AnsiballZ_systemd_service.py
Feb 20 09:34:02 np0005625204.localdomain sudo[270568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:02 np0005625204.localdomain python3.9[270570]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:34:02 np0005625204.localdomain sshd[269893]: Connection closed by invalid user  64.62.197.211 port 38223 [preauth]
Feb 20 09:34:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18815 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FE3680000000001030307) 
Feb 20 09:34:03 np0005625204.localdomain sudo[270568]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59204 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FE7E80000000001030307) 
Feb 20 09:34:03 np0005625204.localdomain sudo[270679]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkzajgdhqzfycyifzdkwkgxkduaturwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580043.6759467-1060-109873861856254/AnsiballZ_systemd_service.py
Feb 20 09:34:03 np0005625204.localdomain sudo[270679]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:04 np0005625204.localdomain python3.9[270681]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:34:04 np0005625204.localdomain sudo[270679]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58677 DF PROTO=TCP SPT=42326 DPT=9102 SEQ=1501729732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FEB690000000001030307) 
Feb 20 09:34:04 np0005625204.localdomain sudo[270790]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlhpbkfenwcdhfbitdvpuhhwksjednvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580044.4164326-1060-191383793014907/AnsiballZ_systemd_service.py
Feb 20 09:34:04 np0005625204.localdomain sudo[270790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:04 np0005625204.localdomain python3.9[270792]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:34:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:34:05.996 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:34:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:34:05.998 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:34:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:34:06.000 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:34:06 np0005625204.localdomain sudo[270790]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:06.224 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:06.226 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:06.227 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:06.227 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:06.256 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:06.256 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:07 np0005625204.localdomain sudo[270901]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvgbqqmwazdnnskhbttbronqjlrqlsoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580046.5449593-1237-159243107158146/AnsiballZ_file.py
Feb 20 09:34:07 np0005625204.localdomain sudo[270901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:07 np0005625204.localdomain python3.9[270903]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:07 np0005625204.localdomain sudo[270901]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59205 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A599FF7A80000000001030307) 
Feb 20 09:34:08 np0005625204.localdomain sudo[271011]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sfxnmjrobvwtlbijyxvixqpbsbkazthn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580047.8057728-1237-66264912993740/AnsiballZ_file.py
Feb 20 09:34:08 np0005625204.localdomain sudo[271011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:08 np0005625204.localdomain python3.9[271013]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:08 np0005625204.localdomain sudo[271011]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:08 np0005625204.localdomain sudo[271121]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oaobabyjvezothmgnvgnzbuvvyseuytc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580048.4061542-1237-136215960804671/AnsiballZ_file.py
Feb 20 09:34:08 np0005625204.localdomain sudo[271121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:08 np0005625204.localdomain python3.9[271123]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:09 np0005625204.localdomain sudo[271121]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:09 np0005625204.localdomain sudo[271231]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkcobruczwmqeumlwohiwweezevwtnfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580049.3171835-1237-243226452534842/AnsiballZ_file.py
Feb 20 09:34:09 np0005625204.localdomain sudo[271231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:09 np0005625204.localdomain python3.9[271233]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:09 np0005625204.localdomain sudo[271231]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:10 np0005625204.localdomain sudo[271341]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yloercczbbcgvgufvlluilftvocstsiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580049.8358047-1237-160979560707114/AnsiballZ_file.py
Feb 20 09:34:10 np0005625204.localdomain sudo[271341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:10 np0005625204.localdomain python3.9[271343]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:10 np0005625204.localdomain sudo[271341]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:10 np0005625204.localdomain sudo[271451]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqdfaxjbphptjxntxygechismlvdroju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580050.410036-1237-105010806171495/AnsiballZ_file.py
Feb 20 09:34:10 np0005625204.localdomain sudo[271451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:10 np0005625204.localdomain python3.9[271453]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:10 np0005625204.localdomain sudo[271451]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:11 np0005625204.localdomain sudo[271561]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yaxgnltydzwztzenhoowrykwydupjcwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580050.951743-1237-160567452366520/AnsiballZ_file.py
Feb 20 09:34:11 np0005625204.localdomain sudo[271561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:11.257 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:11.259 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:11.259 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:11.259 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:11.260 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:11.263 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:11 np0005625204.localdomain python3.9[271563]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:11 np0005625204.localdomain sudo[271561]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:11 np0005625204.localdomain sudo[271671]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttaysbjrsvpqnzftykewzhcldtvhdbqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580051.5664124-1237-135971759304360/AnsiballZ_file.py
Feb 20 09:34:11 np0005625204.localdomain sudo[271671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:12 np0005625204.localdomain python3.9[271673]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:12 np0005625204.localdomain sudo[271671]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:12 np0005625204.localdomain sudo[271781]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhewxdbfbtmbkjhsumozlkakqadsnveu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580052.2857285-1407-154893023928634/AnsiballZ_file.py
Feb 20 09:34:12 np0005625204.localdomain sudo[271781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:34:12 np0005625204.localdomain systemd[1]: tmp-crun.u7w7SS.mount: Deactivated successfully.
Feb 20 09:34:12 np0005625204.localdomain podman[271784]: 2026-02-20 09:34:12.633227387 +0000 UTC m=+0.095372874 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 20 09:34:12 np0005625204.localdomain podman[271784]: 2026-02-20 09:34:12.643930387 +0000 UTC m=+0.106075894 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:34:12 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:34:12 np0005625204.localdomain python3.9[271783]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:12 np0005625204.localdomain sudo[271781]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:13 np0005625204.localdomain sudo[271910]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqwvtnbhdhcxjilhplbdwwheqrnqsrva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580052.8558173-1407-31148404682148/AnsiballZ_file.py
Feb 20 09:34:13 np0005625204.localdomain sudo[271910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:13 np0005625204.localdomain python3.9[271912]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:13 np0005625204.localdomain sudo[271910]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:13 np0005625204.localdomain sudo[272020]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qplhnjycjvwwlltkhzxqcqhsbpdrzcat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580053.5414937-1407-142097232754715/AnsiballZ_file.py
Feb 20 09:34:13 np0005625204.localdomain sudo[272020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:14 np0005625204.localdomain python3.9[272022]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:14 np0005625204.localdomain sudo[272020]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:14.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:14 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:14.300 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 09:34:14 np0005625204.localdomain sudo[272130]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exofeljagklmsliytkgxmlkibarcoeua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580054.4017346-1407-107224181514176/AnsiballZ_file.py
Feb 20 09:34:14 np0005625204.localdomain sudo[272130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:14 np0005625204.localdomain python3.9[272132]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:14 np0005625204.localdomain sudo[272130]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:15 np0005625204.localdomain sudo[272240]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syzpytdfbziwgxyvgycyawvmpicwlgwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580054.994983-1407-261094500193095/AnsiballZ_file.py
Feb 20 09:34:15 np0005625204.localdomain sudo[272240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:15 np0005625204.localdomain python3.9[272242]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:15 np0005625204.localdomain sudo[272240]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59206 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A017680000000001030307) 
Feb 20 09:34:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:16.296 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:16.298 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:16.299 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:16.299 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:16.300 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:16.301 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:16 np0005625204.localdomain sudo[272350]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bvuhgnsmpqselcuyvpkawdnoywssaqat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580055.635769-1407-9469667908944/AnsiballZ_file.py
Feb 20 09:34:16 np0005625204.localdomain sudo[272350]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:16 np0005625204.localdomain python3.9[272352]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:16 np0005625204.localdomain sudo[272350]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:16 np0005625204.localdomain sudo[272460]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-biakwzzouzaqzqbcwxsghqlfhmaskkcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580056.6518538-1407-198520308394837/AnsiballZ_file.py
Feb 20 09:34:16 np0005625204.localdomain sudo[272460]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:17 np0005625204.localdomain python3.9[272462]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:17 np0005625204.localdomain sudo[272460]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:17 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:17.327 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:17 np0005625204.localdomain sudo[272534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:34:17 np0005625204.localdomain sudo[272534]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:34:17 np0005625204.localdomain sudo[272534]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:17 np0005625204.localdomain sudo[272569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:34:17 np0005625204.localdomain sudo[272569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:34:17 np0005625204.localdomain sudo[272604]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryicttaapcggtqlrpragqpjqxeyusinp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580057.2271278-1407-99230041054621/AnsiballZ_file.py
Feb 20 09:34:17 np0005625204.localdomain sudo[272604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:34:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:34:17 np0005625204.localdomain python3.9[272608]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:17 np0005625204.localdomain sudo[272604]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:34:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 09:34:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:34:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16797 "" "Go-http-client/1.1"
Feb 20 09:34:18 np0005625204.localdomain sudo[272569]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.320 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.321 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.321 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.321 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.322 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.786 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.844 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:34:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:18.845 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:34:18 np0005625204.localdomain sudo[272695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:34:18 np0005625204.localdomain sudo[272695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:34:18 np0005625204.localdomain sudo[272695]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.062 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.064 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12183MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.064 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.065 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:34:19 np0005625204.localdomain sudo[272787]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynnkqtbxjozupzclgludjoqeuiuwytjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580058.8988948-1582-30658730776466/AnsiballZ_command.py
Feb 20 09:34:19 np0005625204.localdomain sudo[272787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.256 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.258 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.258 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.332 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:34:19 np0005625204.localdomain python3.9[272789]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:19 np0005625204.localdomain sudo[272787]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.399 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.400 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.415 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.435 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_IDE,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE4A,COMPUTE_RESCUE_BFV,HW_CPU_X86_AVX2,HW_CPU_X86_MMX,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_F16C,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_NET_VIF_MODEL_VMXNET3,HW_CPU_X86_SSE41,HW_CPU_X86_SVM,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AVX,COMPUTE_NODE,HW_CPU_X86_ABM,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AMD_SVM,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_ACCELERATORS,COMPUTE_NET_VIF_MODEL_VIRTIO,HW_CPU_X86_BMI,HW_CPU_X86_SSE42,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_CLMUL,HW_CPU_X86_AESNI,HW_CPU_X86_SSE2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSSE3,HW_CPU_X86_SHA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_FMA3,COMPUTE_VOLUME_EXTEND,COMPUTE_SECURITY_TPM_2_0,COMPUTE_STORAGE_BUS_FDC,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_IMAGE_TYPE_QCOW2 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.470 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.923 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.930 230556 DEBUG nova.compute.provider_tree [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.946 230556 DEBUG nova.scheduler.client.report [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.949 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.949 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:34:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:19.950 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:20 np0005625204.localdomain python3.9[272921]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 20 09:34:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:20.957 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:20 np0005625204.localdomain sudo[273029]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jstadolibxqhxdmklrybgwqjrdffhjtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580060.7185538-1636-167868117139355/AnsiballZ_systemd_service.py
Feb 20 09:34:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:20.976 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:20 np0005625204.localdomain sudo[273029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:34:21 np0005625204.localdomain podman[273032]: 2026-02-20 09:34:21.092577075 +0000 UTC m=+0.087924853 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:34:21 np0005625204.localdomain podman[273032]: 2026-02-20 09:34:21.105540717 +0000 UTC m=+0.100888485 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.302 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.303 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.303 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.304 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.305 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:21 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:21.307 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:21 np0005625204.localdomain python3.9[273031]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:34:21 np0005625204.localdomain systemd-rc-local-generator[273077]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:34:21 np0005625204.localdomain systemd-sysv-generator[273085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:34:21 np0005625204.localdomain sudo[273029]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:22 np0005625204.localdomain sudo[273199]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlsogvweqgletsrcyhlbjvrowfeobvvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580061.9045272-1660-270246464835836/AnsiballZ_command.py
Feb 20 09:34:22 np0005625204.localdomain sudo[273199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:22.300 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:22.302 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:34:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:22.302 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:22.303 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 09:34:22 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:22.315 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 09:34:22 np0005625204.localdomain python3.9[273201]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:22 np0005625204.localdomain sudo[273199]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:22 np0005625204.localdomain sudo[273310]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhcucgcxcglwvenxfetohhxswywwcwce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580062.528298-1660-110286225950027/AnsiballZ_command.py
Feb 20 09:34:22 np0005625204.localdomain sudo[273310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:22 np0005625204.localdomain python3.9[273312]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:22 np0005625204.localdomain sudo[273310]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:23 np0005625204.localdomain sudo[273421]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xppqhahsangcetvldwyycvvfotvxvxjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580063.131807-1660-145414936190482/AnsiballZ_command.py
Feb 20 09:34:23 np0005625204.localdomain sudo[273421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:23 np0005625204.localdomain python3.9[273423]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:23 np0005625204.localdomain sudo[273421]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:23 np0005625204.localdomain sudo[273532]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oqpodotkwvzjcbxlaillciaskbojhwus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580063.6980565-1660-139734087968706/AnsiballZ_command.py
Feb 20 09:34:23 np0005625204.localdomain sudo[273532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:24 np0005625204.localdomain python3.9[273534]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:24 np0005625204.localdomain sudo[273532]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.315 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.316 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.316 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:34:24 np0005625204.localdomain sudo[273643]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qforsfcbzqjepwkwpgjuskqynpiqfgsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580064.3041449-1660-122633804551460/AnsiballZ_command.py
Feb 20 09:34:24 np0005625204.localdomain sudo[273643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.741 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.742 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.742 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:34:24 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:24.742 230556 DEBUG nova.objects.instance [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:34:24 np0005625204.localdomain python3.9[273645]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:24 np0005625204.localdomain sudo[273643]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:25.099 230556 DEBUG nova.network.neutron [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:34:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:25.116 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:34:25 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:25.116 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:34:25 np0005625204.localdomain sudo[273754]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uteeorppdwjmqnykkvjcgoesnwdimbvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580064.9237566-1660-237086331282901/AnsiballZ_command.py
Feb 20 09:34:25 np0005625204.localdomain sudo[273754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:25 np0005625204.localdomain python3.9[273756]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:25 np0005625204.localdomain sudo[273754]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:26 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:26.307 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:26 np0005625204.localdomain sudo[273865]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acmeejvpbzqifienghogpaeuxhhjkeua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580066.1893215-1660-193400167391168/AnsiballZ_command.py
Feb 20 09:34:26 np0005625204.localdomain sudo[273865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:34:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:34:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:34:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:34:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:34:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:34:26 np0005625204.localdomain python3.9[273867]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:26 np0005625204.localdomain sudo[273865]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:27 np0005625204.localdomain sudo[273976]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opwmyqdubkatxouyzkgrbcggpugmuexj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580066.8092842-1660-90749899208314/AnsiballZ_command.py
Feb 20 09:34:27 np0005625204.localdomain sudo[273976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:27 np0005625204.localdomain python3.9[273978]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:34:27 np0005625204.localdomain sudo[273976]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:34:29 np0005625204.localdomain podman[273997]: 2026-02-20 09:34:29.137828095 +0000 UTC m=+0.078329756 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:34:29 np0005625204.localdomain podman[273997]: 2026-02-20 09:34:29.155202462 +0000 UTC m=+0.095704143 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:34:29 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:34:30 np0005625204.localdomain sshd[274019]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:34:30 np0005625204.localdomain sshd[274019]: Invalid user steam from 18.221.252.160 port 44548
Feb 20 09:34:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:34:30 np0005625204.localdomain sshd[274019]: Received disconnect from 18.221.252.160 port 44548:11: Bye Bye [preauth]
Feb 20 09:34:30 np0005625204.localdomain sshd[274019]: Disconnected from invalid user steam 18.221.252.160 port 44548 [preauth]
Feb 20 09:34:30 np0005625204.localdomain podman[274021]: 2026-02-20 09:34:30.372763944 +0000 UTC m=+0.079503742 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 20 09:34:30 np0005625204.localdomain podman[274021]: 2026-02-20 09:34:30.413557327 +0000 UTC m=+0.120297085 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Feb 20 09:34:30 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:34:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32280 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A050F40000000001030307) 
Feb 20 09:34:31 np0005625204.localdomain sudo[274131]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fijndpblnclaecfwnwhbfelyxakvnpon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580070.7710025-1867-272027564469561/AnsiballZ_file.py
Feb 20 09:34:31 np0005625204.localdomain sudo[274131]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:31 np0005625204.localdomain python3.9[274133]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:31.310 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:31.311 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:31 np0005625204.localdomain sudo[274131]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:31.312 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:31.312 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:31.312 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:31 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:31.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32281 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A054E80000000001030307) 
Feb 20 09:34:31 np0005625204.localdomain sudo[274241]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbbbqhldprxhnyqablakyspryibkpvvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580071.4242692-1867-224982587336893/AnsiballZ_file.py
Feb 20 09:34:31 np0005625204.localdomain sudo[274241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:34:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:34:31 np0005625204.localdomain podman[274245]: 2026-02-20 09:34:31.785593119 +0000 UTC m=+0.077356395 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:34:31 np0005625204.localdomain podman[274245]: 2026-02-20 09:34:31.792418831 +0000 UTC m=+0.084182127 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:34:31 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:34:31 np0005625204.localdomain podman[274244]: 2026-02-20 09:34:31.840725136 +0000 UTC m=+0.132361568 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 09:34:31 np0005625204.localdomain podman[274244]: 2026-02-20 09:34:31.901221849 +0000 UTC m=+0.192858241 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:34:31 np0005625204.localdomain python3.9[274243]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:31 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:34:31 np0005625204.localdomain sudo[274241]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59207 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A057680000000001030307) 
Feb 20 09:34:32 np0005625204.localdomain sudo[274393]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tsfyamkkreqsughjmujcraucprjcgsaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580072.1981263-1912-172434310063641/AnsiballZ_file.py
Feb 20 09:34:32 np0005625204.localdomain sudo[274393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:32 np0005625204.localdomain python3.9[274395]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:32 np0005625204.localdomain sudo[274393]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:33 np0005625204.localdomain sudo[274503]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcynnznjxnehtmfiyknfuwlcnzadnzpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580072.946512-1912-154792482074877/AnsiballZ_file.py
Feb 20 09:34:33 np0005625204.localdomain sudo[274503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:33 np0005625204.localdomain python3.9[274505]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:33 np0005625204.localdomain sudo[274503]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32282 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A05CE80000000001030307) 
Feb 20 09:34:33 np0005625204.localdomain sudo[274613]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hslkzqlkbxaysypzqbotkzqjiipjepml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580073.5061448-1912-238555582654926/AnsiballZ_file.py
Feb 20 09:34:33 np0005625204.localdomain sudo[274613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:34 np0005625204.localdomain python3.9[274615]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:34 np0005625204.localdomain sudo[274613]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:34 np0005625204.localdomain sudo[274723]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpqdccfumpyljovqiwukdgbpghncjovv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580074.1716866-1912-227151149096363/AnsiballZ_file.py
Feb 20 09:34:34 np0005625204.localdomain sudo[274723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:34 np0005625204.localdomain python3.9[274725]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:34 np0005625204.localdomain sudo[274723]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18816 DF PROTO=TCP SPT=54286 DPT=9102 SEQ=2611316665 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A061680000000001030307) 
Feb 20 09:34:35 np0005625204.localdomain sudo[274833]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbuurevxtctjqqptqhzrokywhmjxwhcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580074.7614815-1912-36446162811111/AnsiballZ_file.py
Feb 20 09:34:35 np0005625204.localdomain sudo[274833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:35 np0005625204.localdomain python3.9[274835]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:35 np0005625204.localdomain sudo[274833]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:35 np0005625204.localdomain sudo[274943]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqvdatlxtpsrwvzdptqofbwqdoreobep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580075.404372-1912-36281531049319/AnsiballZ_file.py
Feb 20 09:34:35 np0005625204.localdomain sudo[274943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:35 np0005625204.localdomain python3.9[274945]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:35 np0005625204.localdomain sudo[274943]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:36 np0005625204.localdomain sudo[275053]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uneleccxkllylhjompvgsdltuipiwuiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580075.9856436-1912-260430783827899/AnsiballZ_file.py
Feb 20 09:34:36 np0005625204.localdomain sudo[275053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:36 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:36.314 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:36 np0005625204.localdomain python3.9[275055]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:36 np0005625204.localdomain sudo[275053]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32283 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A06CA80000000001030307) 
Feb 20 09:34:38 np0005625204.localdomain sshd[275073]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:34:39 np0005625204.localdomain sshd[275073]: Invalid user funded from 45.148.10.240 port 57718
Feb 20 09:34:39 np0005625204.localdomain sshd[275073]: Connection closed by invalid user funded 45.148.10.240 port 57718 [preauth]
Feb 20 09:34:41 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:41.316 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:41 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:41.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:41 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:41.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:41 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:41.318 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:41 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:41.319 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:34:43 np0005625204.localdomain podman[275075]: 2026-02-20 09:34:43.364210409 +0000 UTC m=+0.301516854 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 20 09:34:43 np0005625204.localdomain podman[275075]: 2026-02-20 09:34:43.377176251 +0000 UTC m=+0.314482656 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:34:43 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:34:43 np0005625204.localdomain sudo[275184]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bralrxixxijsaxcemzwkmkxweljtcerd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580083.1046574-2277-204037362246895/AnsiballZ_getent.py
Feb 20 09:34:43 np0005625204.localdomain sudo[275184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:43 np0005625204.localdomain python3.9[275186]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 20 09:34:43 np0005625204.localdomain sudo[275184]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:44 np0005625204.localdomain sshd[275205]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:34:45 np0005625204.localdomain sshd[275205]: Accepted publickey for zuul from 192.168.122.30 port 41620 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:34:45 np0005625204.localdomain systemd-logind[759]: New session 61 of user zuul.
Feb 20 09:34:45 np0005625204.localdomain systemd[1]: Started Session 61 of User zuul.
Feb 20 09:34:45 np0005625204.localdomain sshd[275205]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:34:45 np0005625204.localdomain sshd[275208]: Received disconnect from 192.168.122.30 port 41620:11: disconnected by user
Feb 20 09:34:45 np0005625204.localdomain sshd[275208]: Disconnected from user zuul 192.168.122.30 port 41620
Feb 20 09:34:45 np0005625204.localdomain sshd[275205]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:34:45 np0005625204.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Feb 20 09:34:45 np0005625204.localdomain systemd-logind[759]: Session 61 logged out. Waiting for processes to exit.
Feb 20 09:34:45 np0005625204.localdomain systemd-logind[759]: Removed session 61.
Feb 20 09:34:45 np0005625204.localdomain python3.9[275316]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:34:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32284 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A08D690000000001030307) 
Feb 20 09:34:46 np0005625204.localdomain python3.9[275371]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:46 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:46.320 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:46 np0005625204.localdomain python3.9[275479]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:34:47 np0005625204.localdomain python3.9[275565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580086.314077-2358-206837844614125/.source _original_basename=ssh-config follow=False checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:34:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:34:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:34:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149682 "" "Go-http-client/1.1"
Feb 20 09:34:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:34:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16786 "" "Go-http-client/1.1"
Feb 20 09:34:47 np0005625204.localdomain python3.9[275673]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:34:48 np0005625204.localdomain python3.9[275759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580087.419933-2358-125558617568500/.source.py _original_basename=nova_statedir_ownership.py follow=False checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:48 np0005625204.localdomain python3.9[275867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:34:49 np0005625204.localdomain python3.9[275953]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580088.4403605-2358-212203156006371/.source _original_basename=run-on-host follow=False checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:50 np0005625204.localdomain python3.9[276061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:34:51 np0005625204.localdomain python3.9[276147]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1771580090.1671562-2520-205150396176672/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=50598ea057afd85a1f5b995974d61e2c257c9737 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:51 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:51.322 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:34:52 np0005625204.localdomain sudo[276255]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-instchnyvwiblfiaucfxjeoorjvfoztu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580091.8268323-2565-211510712313723/AnsiballZ_file.py
Feb 20 09:34:52 np0005625204.localdomain sudo[276255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:52 np0005625204.localdomain systemd[1]: tmp-crun.eoincw.mount: Deactivated successfully.
Feb 20 09:34:52 np0005625204.localdomain podman[276257]: 2026-02-20 09:34:52.203951865 +0000 UTC m=+0.138520430 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:34:52 np0005625204.localdomain podman[276257]: 2026-02-20 09:34:52.213941883 +0000 UTC m=+0.148510458 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:34:52 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:34:52 np0005625204.localdomain python3.9[276263]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:52 np0005625204.localdomain sudo[276255]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:52 np0005625204.localdomain sudo[276386]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctogkxhdabdgfzjhbrbuahfxhwprghuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580092.496523-2589-115537077809058/AnsiballZ_copy.py
Feb 20 09:34:52 np0005625204.localdomain sudo[276386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:52 np0005625204.localdomain python3.9[276388]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:52 np0005625204.localdomain sudo[276386]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:53 np0005625204.localdomain sudo[276496]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqimcoeezpijhrgmrgjehccnpjaoarfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580093.162168-2613-117071294284043/AnsiballZ_stat.py
Feb 20 09:34:53 np0005625204.localdomain sudo[276496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:53 np0005625204.localdomain python3.9[276498]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:34:53 np0005625204.localdomain sudo[276496]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:54 np0005625204.localdomain sudo[276608]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prdfztwdcguayeuhuffkrlapjgkfgatv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580093.9494948-2641-3940944282493/AnsiballZ_file.py
Feb 20 09:34:54 np0005625204.localdomain sudo[276608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:54 np0005625204.localdomain python3.9[276610]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:54 np0005625204.localdomain sudo[276608]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:55 np0005625204.localdomain python3.9[276718]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:34:56 np0005625204.localdomain sudo[276828]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyccptaldxcvhzeiuxiepsajrfofkkvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580095.71665-2697-97490027711719/AnsiballZ_file.py
Feb 20 09:34:56 np0005625204.localdomain sudo[276828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:56 np0005625204.localdomain python3.9[276830]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:56 np0005625204.localdomain sudo[276828]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:56.327 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:56.329 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:34:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:56.329 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:34:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:56.329 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:56.370 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:34:56 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:34:56.371 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:34:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:34:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:34:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:34:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:34:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:34:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:34:56 np0005625204.localdomain sudo[276938]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqgfpljoctyidabstljgaegjkortvspx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580096.4789555-2721-84387846716273/AnsiballZ_file.py
Feb 20 09:34:56 np0005625204.localdomain sudo[276938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:56 np0005625204.localdomain python3.9[276940]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:34:56 np0005625204.localdomain sudo[276938]: pam_unix(sudo:session): session closed for user root
Feb 20 09:34:57 np0005625204.localdomain python3.9[277048]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute_init state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:34:59 np0005625204.localdomain sudo[277352]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyhekttnmvzwicabobwlqmvpttqdqpib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580099.521993-2822-130480995781308/AnsiballZ_container_config_data.py
Feb 20 09:34:59 np0005625204.localdomain sudo[277352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:34:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:35:00 np0005625204.localdomain systemd[1]: tmp-crun.FZtexQ.mount: Deactivated successfully.
Feb 20 09:35:00 np0005625204.localdomain podman[277355]: 2026-02-20 09:35:00.010063741 +0000 UTC m=+0.100336347 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:35:00 np0005625204.localdomain podman[277355]: 2026-02-20 09:35:00.017703287 +0000 UTC m=+0.107975883 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:35:00 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:35:00 np0005625204.localdomain python3.9[277354]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute_init config_pattern=*.json debug=False
Feb 20 09:35:00 np0005625204.localdomain sudo[277352]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14677 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0C6260000000001030307) 
Feb 20 09:35:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:35:00 np0005625204.localdomain podman[277433]: 2026-02-20 09:35:00.874222713 +0000 UTC m=+0.087452148 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, release=1770267347, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:35:00 np0005625204.localdomain podman[277433]: 2026-02-20 09:35:00.889417883 +0000 UTC m=+0.102647298 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, architecture=x86_64, io.openshift.expose-services=, 
vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:35:00 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:35:00 np0005625204.localdomain sudo[277504]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muxxcgjuxlvvvofmlcvxdsctokhoxgqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580100.6040487-2855-185735570219526/AnsiballZ_container_config_hash.py
Feb 20 09:35:00 np0005625204.localdomain sudo[277504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:01 np0005625204.localdomain python3.9[277506]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:35:01 np0005625204.localdomain sudo[277504]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.306 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.324 230556 DEBUG nova.compute.manager [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.325 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.325 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.368 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.371 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.374 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.374 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.374 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.394 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:01 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:01.395 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14678 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0CA280000000001030307) 
Feb 20 09:35:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:35:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:35:02 np0005625204.localdomain systemd[1]: tmp-crun.BFztpR.mount: Deactivated successfully.
Feb 20 09:35:02 np0005625204.localdomain podman[277525]: 2026-02-20 09:35:02.155742174 +0000 UTC m=+0.084337782 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:35:02 np0005625204.localdomain podman[277525]: 2026-02-20 09:35:02.164084612 +0000 UTC m=+0.092680250 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 20 09:35:02 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:35:02 np0005625204.localdomain podman[277524]: 2026-02-20 09:35:02.261765956 +0000 UTC m=+0.193803800 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:35:02 np0005625204.localdomain podman[277524]: 2026-02-20 09:35:02.302219759 +0000 UTC m=+0.234257583 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:35:02 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:35:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32285 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0CD680000000001030307) 
Feb 20 09:35:02 np0005625204.localdomain sudo[277655]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icpxoeprorcbxuefxwjziaitzombyyzt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771580102.3829174-2885-204299333515725/AnsiballZ_edpm_container_manage.py
Feb 20 09:35:02 np0005625204.localdomain sudo[277655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:03 np0005625204.localdomain python3[277657]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute_init config_id=nova_compute_init config_overrides={} config_patterns=*.json containers=['nova_compute_init'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:35:03 np0005625204.localdomain python3[277657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",
                                                                    "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:31:38.534497001Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1214548351,
                                                                    "VirtualSize": 1214548351,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",
                                                                              "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:39.234075496Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.686286019Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.133364958Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:10.283411186Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:19.407054412Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:42.656365894Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:37.451289936Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.151652427Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532191009Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532298572Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:44.609081717Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:35:03 np0005625204.localdomain podman[277710]: 2026-02-20 09:35:03.441358191 +0000 UTC m=+0.081374900 container remove 8898fbe269462bce4b6d58449ba1759f0da98926b61b93f89f015a2633a6e6c4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': 'b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=nova_compute_init, container_name=nova_compute_init)
Feb 20 09:35:03 np0005625204.localdomain python3[277657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute_init
Feb 20 09:35:03 np0005625204.localdomain podman[277723]: 
Feb 20 09:35:03 np0005625204.localdomain podman[277723]: 2026-02-20 09:35:03.555918558 +0000 UTC m=+0.093047971 container create d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:35:03 np0005625204.localdomain podman[277723]: 2026-02-20 09:35:03.512566606 +0000 UTC m=+0.049696049 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:35:03 np0005625204.localdomain python3[277657]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --env EDPM_CONFIG_HASH=9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b --label config_id=nova_compute_init --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 20 09:35:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14679 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0D2280000000001030307) 
Feb 20 09:35:03 np0005625204.localdomain sudo[277655]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59208 DF PROTO=TCP SPT=35380 DPT=9102 SEQ=2629619504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0D5680000000001030307) 
Feb 20 09:35:04 np0005625204.localdomain sudo[277868]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsvbldhggmrehwzahnvusauuypzdcziq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580104.3843052-2910-162083870490124/AnsiballZ_stat.py
Feb 20 09:35:04 np0005625204.localdomain sudo[277868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:04 np0005625204.localdomain python3.9[277870]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:35:04 np0005625204.localdomain sudo[277868]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:05 np0005625204.localdomain python3.9[277980]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:35:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:35:05.997 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:35:05.998 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:35:06.000 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:06.395 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:06.397 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:06.397 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:35:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:06.398 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:06.437 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:06 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:06.438 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:06 np0005625204.localdomain sudo[278088]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ysmquzhvbkmpwosjrwoywkojuiinkwdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580106.6821008-2991-231221965954345/AnsiballZ_stat.py
Feb 20 09:35:06 np0005625204.localdomain sudo[278088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:07 np0005625204.localdomain python3.9[278090]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:35:07 np0005625204.localdomain sudo[278088]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:07 np0005625204.localdomain sudo[278178]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzamosfauwxuzrzvumrijhlwmwskcvyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580106.6821008-2991-231221965954345/AnsiballZ_copy.py
Feb 20 09:35:07 np0005625204.localdomain sudo[278178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14680 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A0E1E80000000001030307) 
Feb 20 09:35:07 np0005625204.localdomain python3.9[278180]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771580106.6821008-2991-231221965954345/.source.yaml _original_basename=.n49b82ef follow=False checksum=4d557a266f0e30e386f17a3d7c6078d564f9be8b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:07 np0005625204.localdomain sudo[278178]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:08 np0005625204.localdomain sudo[278288]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqnvairsiwnnlbettvarimfegrtsdfwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580108.4258273-3043-122505390104170/AnsiballZ_file.py
Feb 20 09:35:08 np0005625204.localdomain sudo[278288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:08 np0005625204.localdomain python3.9[278290]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:08 np0005625204.localdomain sudo[278288]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:09 np0005625204.localdomain sudo[278398]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrxsdecltidikdjeeuvnuqabdtxbqzzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580109.4509368-3067-137607552665524/AnsiballZ_file.py
Feb 20 09:35:09 np0005625204.localdomain sudo[278398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:09 np0005625204.localdomain python3.9[278400]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 20 09:35:10 np0005625204.localdomain sudo[278398]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:10 np0005625204.localdomain sshd[278434]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:35:10 np0005625204.localdomain sudo[278510]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enoncarqalaaqfelbtefdkeibdjcpfaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580110.2402372-3089-42582711647597/AnsiballZ_stat.py
Feb 20 09:35:10 np0005625204.localdomain sudo[278510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:10 np0005625204.localdomain python3.9[278512]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:35:10 np0005625204.localdomain sudo[278510]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:10 np0005625204.localdomain sudo[278567]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agglubldhbonbuavivqeqizgwmlncwsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580110.2402372-3089-42582711647597/AnsiballZ_file.py
Feb 20 09:35:10 np0005625204.localdomain sudo[278567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:11 np0005625204.localdomain python3.9[278569]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/nova_compute.json _original_basename=.co0ftjja recurse=False state=file path=/var/lib/kolla/config_files/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:11 np0005625204.localdomain sudo[278567]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:11.439 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:11.441 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:11.442 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:35:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:11.442 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:11 np0005625204.localdomain sshd[278434]: Invalid user nexus from 188.166.218.64 port 38512
Feb 20 09:35:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:11.479 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:11 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:11.481 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:11 np0005625204.localdomain sshd[278434]: Received disconnect from 188.166.218.64 port 38512:11: Bye Bye [preauth]
Feb 20 09:35:11 np0005625204.localdomain sshd[278434]: Disconnected from invalid user nexus 188.166.218.64 port 38512 [preauth]
Feb 20 09:35:11 np0005625204.localdomain python3.9[278677]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/nova_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:13 np0005625204.localdomain sudo[278981]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryjyipyajsuskatraijjgzcibtzotacp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580113.6126304-3201-7258047943801/AnsiballZ_container_config_data.py
Feb 20 09:35:13 np0005625204.localdomain sudo[278981]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:35:13 np0005625204.localdomain podman[278984]: 2026-02-20 09:35:13.99299988 +0000 UTC m=+0.088254894 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 20 09:35:14 np0005625204.localdomain podman[278984]: 2026-02-20 09:35:14.032171792 +0000 UTC m=+0.127426836 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 20 09:35:14 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:35:14 np0005625204.localdomain python3.9[278983]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/nova_compute config_pattern=*.json debug=False
Feb 20 09:35:14 np0005625204.localdomain sudo[278981]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:14 np0005625204.localdomain sudo[279108]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcdbyxaouardxkbvfitqjhsevpicmkqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580114.5779185-3234-258449195467718/AnsiballZ_container_config_hash.py
Feb 20 09:35:14 np0005625204.localdomain sudo[279108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:15 np0005625204.localdomain python3.9[279110]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 20 09:35:15 np0005625204.localdomain sudo[279108]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:15 np0005625204.localdomain sudo[279218]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhmxecbqorhvekfzmnarjdbiwfvuxkyh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1771580115.49314-3263-76025273598503/AnsiballZ_edpm_container_manage.py
Feb 20 09:35:15 np0005625204.localdomain sudo[279218]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14681 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A101680000000001030307) 
Feb 20 09:35:16 np0005625204.localdomain python3[279220]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/nova_compute config_id=nova_compute config_overrides={} config_patterns=*.json containers=['nova_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 20 09:35:16 np0005625204.localdomain python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",
                                                                    "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:31:38.534497001Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1214548351,
                                                                    "VirtualSize": 1214548351,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",
                                                                              "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:39.234075496Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.686286019Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.133364958Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:10.283411186Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:19.407054412Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:42.656365894Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:37.451289936Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.151652427Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532191009Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532298572Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:44.609081717Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.435 230556 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.530 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.533 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.533 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5052 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.533 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:16 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:16.535 230556 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:35:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:35:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:35:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149688 "" "Go-http-client/1.1"
Feb 20 09:35:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:35:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16661 "" "Go-http-client/1.1"
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.204 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.209 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 130 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '117c1219-0ba1-4428-ad43-4deaa8ea6aa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 130, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.204937', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77e01b9c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'eb557e1d85b064cd8efc0375bbd9c69b892493731233198ac408b3dcd755d132'}]}, 'timestamp': '2026-02-20 09:35:18.210355', '_unique_id': '54fb405872cd4847a2ba0e5ed9de29aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.213 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bcf3083f-728a-491e-95b7-589e5f47273a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.213364', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77e0a6ca-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'dcdfbddae3d80a1452fd5673725190965b96a3d489ef78e0ba06c9bef95de973'}]}, 'timestamp': '2026-02-20 09:35:18.213883', '_unique_id': 'dfb771435e5b4afcbea0744223f7d934'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.214 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.216 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8931ba44-6206-4888-b684-b0b3ee2e4ba1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.216024', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77e10ea8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'a4c0cc1012cb8cf8c64464f5ca93ab6be6b6aae5656ab6ac4cc0dffb00ee311e'}]}, 'timestamp': '2026-02-20 09:35:18.216495', '_unique_id': 'c3eca172041c4e6392754a8fa727d56c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.217 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.218 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 214846202 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 31640964 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3db1e5a3-3b74-4676-b99b-e5c5cdba6b28', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 214846202, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.218790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77e6befc-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '9a056927dafaf180cc009b64a3827d6410523d541e6fa66abe151456b5a4c4c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 31640964, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.218790', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77e6d59a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '27bf41283f6adcf56dd09ab37f98a59a69e7a1b41f2dcb1b79f03d45c7f1ee40'}]}, 'timestamp': '2026-02-20 09:35:18.254346', '_unique_id': 'fb18afe3c3ca4d09870275748d765997'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b05d704-4195-4c8d-a73c-eb251d4e15ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.257037', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77e97f48-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'f5cd4f0d8eca889a63ea38352ecd204e37be7d85e9c5d63cae70089a7e18b743'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.257037', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77e99212-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'cc17747136debeffb05ece6d8dc4d49c4984c3141d7bd3edace0fe9abdbcfafe'}]}, 'timestamp': '2026-02-20 09:35:18.272263', '_unique_id': 'ddc17cc48f89433999407c5605ca1290'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.273 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.274 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.274 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1064 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.275 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 222 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d15c49-a8d1-4719-ae49-6fee83158766', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1064, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.274937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77ea0b5c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '9b20040ee81058a50824a3f83b08614ef0232617b656c80df8d41b1d5e7d15a3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 222, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.274937', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77ea1bce-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'b432806ef93ac4936ddd495c6d9b1a8797a6b3d56da2eb1d61bb5b2ad4d1743d'}]}, 'timestamp': '2026-02-20 09:35:18.275817', '_unique_id': 'b26b29b560ed4910b2b85bb8276c9b8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.278 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.297 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 62150000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bdfd1c40-3a68-4739-859b-68fc1e2eddb2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 62150000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:35:18.278178', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '77ed7fd0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.536396081, 'message_signature': 'd78580b6c94b57a82c943ab67b7df3729bb1daf004cc25057eb6364c2380068b'}]}, 'timestamp': '2026-02-20 09:35:18.298087', '_unique_id': 'e74492e769c84e6787da2e2572c085a3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.299 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.300 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 1363170250 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 199987534 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95be735f-df03-45b6-a940-192350157131', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1363170250, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.300606', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77edf758-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '305f987621cf9f0db859f4eb600deed25ec8aede171c47d3a2d787af10338e37'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199987534, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.300606', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77ee0806-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'd7a699ddaa9e1f1a1a01b0c9ae176c1973354ea2e8c1a58c6bb9dca6dd0ff762'}]}, 'timestamp': '2026-02-20 09:35:18.301490', '_unique_id': 'a15f7b89224b45d5aed74c37c2e31841'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.303 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '58b9c59e-26a3-441f-a002-4907607ec8ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.303891', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77ee7688-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': '63f512d4be2348f8d5e8c0c95eb16baf1cdb31106e1d1dc1cf7c87bd86ed9539'}]}, 'timestamp': '2026-02-20 09:35:18.304349', '_unique_id': '2cea341da9e945b39fcac3b20d2760f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29130240 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 4300800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '381b02b6-c2bd-455f-88d3-7205103941b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29130240, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.306411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77eed880-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'c4d5319bbad592b32bd670762947fa30d70007779b0cdfd93f88ddc4edd36608'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4300800, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.306411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77eee9b0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '888cde16890c966f4acebf19247d72725d7e725bef4996ed3b4adcf962d74edc'}]}, 'timestamp': '2026-02-20 09:35:18.307271', '_unique_id': 'b3b7b1cbd5ef416abb060db7fcd5ca5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa80a170-b318-4f40-97ec-1e7c35ad3db2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.309517', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77ef535a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'ad4efc634abecb7dddb6b662944e203473f4ed8ff6f1039c08b01fa5be458338'}]}, 'timestamp': '2026-02-20 09:35:18.310003', '_unique_id': '8c264a18bd1648f18b2229eb1b78c7ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.312 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae917ae7-973f-49b1-87dc-c84ba0f4dacc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.312036', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77efb462-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'd7998f5001ab99e7a706ecf54a5e3bcd99126d3a9d3a6328cee672608daed54c'}]}, 'timestamp': '2026-02-20 09:35:18.312485', '_unique_id': 'cb78c4f9e1e6407bbe72a2b2dba6cd04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.313 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f70f3cf3-adcd-4fb6-aa5b-ac1f6c0fbda5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.314536', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f0172c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': '94b9b8b7dc6bb3d1ec5e36f3f492de0315e8d88f9c9a7790698f8f37d9a3249c'}]}, 'timestamp': '2026-02-20 09:35:18.315014', '_unique_id': '30803500d497459fa431e91809f598e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 11314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1732ae1f-4241-49cc-95e4-d57f8613204d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 11314, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.317063', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f078de-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': '5add133ed3b0b0f903e128323923d3d762253f43dbf98069d4277613ac6abc05'}]}, 'timestamp': '2026-02-20 09:35:18.317512', '_unique_id': '65b2655cda5d4caf8c16357d81b1e56d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:18.319 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94679d5a-2470-4a9e-93aa-16fca343d3b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.319516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f0d9be-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'b56181c53f9b03e79dc2611e26c93284bb5c81fff005af32ff2743071aa52bdf'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.319516', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f0e97c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': 'baf022f792a6022b137439267b4568624ed5967cd93a0211c95d72b61f986915'}]}, 'timestamp': '2026-02-20 09:35:18.320367', '_unique_id': 'f357daabc311462e9dda56f7df1cf3c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 73912320 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceb88bdf-3c1b-4b66-b820-fd7740e08509', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73912320, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.322446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f14bf6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '555c569d78587584c3f34c6f9e36713a0e84c748981b89aa256fa576a709e581'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.322446', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f15bbe-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '5b5b954600680d718a16f79ec28cbd83696bf402a1caaa3f6d4b8345649f1841'}]}, 'timestamp': '2026-02-20 09:35:18.323292', '_unique_id': 'd3f363064e9945ecbfbe725b005b8313'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 52.328125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '775f73df-0865-4619-ba11-752b2be86046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 52.328125, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:35:18.325510', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '77f1c3d8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.536396081, 'message_signature': '3b6b4b1cc4f0ebb8befcaaf21b3115f6cc404fc8632ebe3001beaf4fa41b423e'}]}, 'timestamp': '2026-02-20 09:35:18.325971', '_unique_id': '7ce709db5b014e86b45069216a1f3ec8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '336d8d12-4fdf-4bb8-869f-8cdeb595bd65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.327985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f22346-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': '91eb5eddd7a2a488311413aa8cb0731dccae0ec3f66238146904282ebec5a47a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.327985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f23304-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.49627156, 'message_signature': '090faad3b4274dbaaf963f5b0fa6ae7ca441d85e9a23b94072ad6190c43eaae6'}]}, 'timestamp': '2026-02-20 09:35:18.328855', '_unique_id': 'fa2aca54c2004e178a367c079a70115d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 9437 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25741b18-3767-40bb-b9a4-0804124bfe1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9437, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.330961', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f297b8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'd62f403fd6e9ec16f2bc4a0bbd7759ce870acf4741ed1a2aa3bdb2275bf852f6'}]}, 'timestamp': '2026-02-20 09:35:18.331412', '_unique_id': 'fef8d67b0e3744f59649c9fca5b73afe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.333 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 93 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1728fd36-207a-45fb-9e98-8dc4ac5e68e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 93, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:35:18.333329', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '77f2f14a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.444136095, 'message_signature': 'd363b1d494a53ab334bed6e36750a811e8a21c6715664caffd8d3ecab0f55593'}]}, 'timestamp': '2026-02-20 09:35:18.333627', '_unique_id': 'ad708ea6982b4d179f4424c85a023214'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 523 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.335 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84272e9a-844e-458b-964a-5fa86e9a0d2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 523, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:35:18.335049', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '77f334ac-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': '4d0b2f2fb1841bcd32ed3397ad71d1febe28e64ebf4195bf1d1003fcc67f96f6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:35:18.335049', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '77f33f24-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10677.457989614, 'message_signature': 'cdcf2b4d089afb286366176f125b39b2d16d3eca95b4e34ee159a49b1d9d4124'}]}, 'timestamp': '2026-02-20 09:35:18.335604', '_unique_id': '7b85ec4dbfaf445ab126fc0dfa8f41a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:35:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:35:18.336 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:35:19 np0005625204.localdomain sudo[279281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:35:19 np0005625204.localdomain sudo[279281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:35:19 np0005625204.localdomain sudo[279281]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:19 np0005625204.localdomain sudo[279299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:35:19 np0005625204.localdomain sudo[279299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.299 230556 DEBUG oslo_service.periodic_task [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.323 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.324 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.324 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.324 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.325 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.774 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.829 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:35:19 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:19.830 230556 DEBUG nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.018 230556 WARNING nova.virt.libvirt.driver [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.020 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12147MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.021 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.022 230556 DEBUG oslo_concurrency.lockutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:20 np0005625204.localdomain podman[279412]: 2026-02-20 09:35:20.063709715 +0000 UTC m=+0.093927009 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, CEPH_POINT_RELEASE=)
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.092 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.093 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.094 230556 DEBUG nova.compute.resource_tracker [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:35:20 np0005625204.localdomain podman[279412]: 2026-02-20 09:35:20.125069635 +0000 UTC m=+0.155286909 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_BRANCH=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, RELEASE=main, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.expose-services=)
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.152 230556 DEBUG oslo_concurrency.processutils [None req-ec218237-3f88-4f98-a98a-96f834732acf - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.312 230556 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.315 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.315 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:35:20 np0005625204.localdomain nova_compute[230552]: 2026-02-20 09:35:20.316 230556 DEBUG oslo_concurrency.lockutils [None req-38f1b164-cb78-47e7-941a-8e7362f8c3e1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:35:20 np0005625204.localdomain sudo[279299]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:20 np0005625204.localdomain sudo[279498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:35:20 np0005625204.localdomain sudo[279498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:35:20 np0005625204.localdomain sudo[279498]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:20 np0005625204.localdomain sudo[279516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:35:20 np0005625204.localdomain sudo[279516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:35:20 np0005625204.localdomain systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Deactivated successfully.
Feb 20 09:35:20 np0005625204.localdomain virtqemud[206495]: End of file while reading data: Input/output error
Feb 20 09:35:20 np0005625204.localdomain systemd[1]: libpod-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782.scope: Consumed 19.442s CPU time.
Feb 20 09:35:20 np0005625204.localdomain podman[279270]: 2026-02-20 09:35:20.742998833 +0000 UTC m=+4.380518095 container died 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=nova_compute, org.label-schema.schema-version=1.0)
Feb 20 09:35:20 np0005625204.localdomain systemd[1]: tmp-crun.kJAYWo.mount: Deactivated successfully.
Feb 20 09:35:20 np0005625204.localdomain podman[279270]: 2026-02-20 09:35:20.879709515 +0000 UTC m=+4.517228807 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:35:20 np0005625204.localdomain python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman stop nova_compute
Feb 20 09:35:20 np0005625204.localdomain podman[279536]: 2026-02-20 09:35:20.894879995 +0000 UTC m=+0.147115775 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=nova_compute, container_name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:35:20 np0005625204.localdomain podman[279590]: error opening file `/run/crun/299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782/status`: No such file or directory
Feb 20 09:35:21 np0005625204.localdomain podman[279562]: 2026-02-20 09:35:21.010072021 +0000 UTC m=+0.092183675 container cleanup 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:35:21 np0005625204.localdomain podman[279562]: nova_compute
Feb 20 09:35:21 np0005625204.localdomain podman[279555]: 2026-02-20 09:35:21.035308832 +0000 UTC m=+0.138271192 container remove 299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-b8ba4dc25157d88622fd7931c0da23ab126113e1d8333c6e27a9572e79d3a69a'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=nova_compute, container_name=nova_compute)
Feb 20 09:35:21 np0005625204.localdomain python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2c4c13f30a5f59752e0d06bbc9e2966e63f759ecf4dffcdf0aa522fe6def4ac1-merged.mount: Deactivated successfully.
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-299f3f9b6f0069a9da447daf5f06ccc3009be73d16ee714cbbfd4548506c1782-userdata-shm.mount: Deactivated successfully.
Feb 20 09:35:21 np0005625204.localdomain podman[279594]: Error: no container with name or ID "nova_compute" found: no such container
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: edpm_nova_compute.service: Control process exited, code=exited, status=125/n/a
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: edpm_nova_compute.service: Failed with result 'exit-code'.
Feb 20 09:35:21 np0005625204.localdomain podman[279602]: 
Feb 20 09:35:21 np0005625204.localdomain podman[279602]: 2026-02-20 09:35:21.147273218 +0000 UTC m=+0.092017249 container create 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=nova_compute, tcib_managed=true, container_name=nova_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 20 09:35:21 np0005625204.localdomain podman[279602]: 2026-02-20 09:35:21.115013629 +0000 UTC m=+0.059757620 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 20 09:35:21 np0005625204.localdomain python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b --label config_id=nova_compute --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro 
--volume /var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: edpm_nova_compute.service: Scheduled restart job, restart counter is at 1.
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: Stopped nova_compute container.
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: Starting nova_compute container...
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: Started libpod-conmon-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope.
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:35:21 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:21 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:21 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:21 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:21 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:21 np0005625204.localdomain podman[279624]: 2026-02-20 09:35:21.291956366 +0000 UTC m=+0.127493987 container init 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:35:21 np0005625204.localdomain podman[279624]: 2026-02-20 09:35:21.303483633 +0000 UTC m=+0.139021244 container start 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + sudo -E kolla_set_configs
Feb 20 09:35:21 np0005625204.localdomain python3[279220]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman start nova_compute
Feb 20 09:35:21 np0005625204.localdomain systemd[1]: Started nova_compute container.
Feb 20 09:35:21 np0005625204.localdomain sudo[279516]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Validating config file
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying service configuration files
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Deleting /etc/ceph
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Creating directory /etc/ceph
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Writing out command to execute
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: ++ cat /run_command
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + CMD=nova-compute
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + ARGS=
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + sudo kolla_copy_cacerts
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + [[ ! -n '' ]]
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + . kolla_extend_start
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: Running command: 'nova-compute'
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + echo 'Running command: '\''nova-compute'\'''
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + umask 0022
Feb 20 09:35:21 np0005625204.localdomain nova_compute[279644]: + exec nova-compute
Feb 20 09:35:21 np0005625204.localdomain sudo[279218]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:21 np0005625204.localdomain sudo[279718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:35:21 np0005625204.localdomain sudo[279718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:35:21 np0005625204.localdomain sudo[279718]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:22 np0005625204.localdomain sudo[279827]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzigjdkfarmhekpfkycugetzxvsvpkpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580122.6198468-3288-58012695756627/AnsiballZ_stat.py
Feb 20 09:35:22 np0005625204.localdomain sudo[279827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:35:22 np0005625204.localdomain podman[279831]: 2026-02-20 09:35:22.980134636 +0000 UTC m=+0.073340041 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:35:22 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:22.998 279667 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:35:22 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:22.999 279667 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:35:22 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:22.999 279667 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:35:22 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:22.999 279667 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 20 09:35:23 np0005625204.localdomain podman[279831]: 2026-02-20 09:35:23.015812451 +0000 UTC m=+0.109017826 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:35:23 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:35:23 np0005625204.localdomain python3.9[279830]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:35:23 np0005625204.localdomain sudo[279827]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.115 279667 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.143 279667 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.143 279667 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.556 279667 INFO nova.virt.driver [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.668 279667 INFO nova.compute.provider_config [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.682 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.682 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.683 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.683 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.683 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.684 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.685 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.686 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console_host                   = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.687 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.688 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.688 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.688 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.689 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.690 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.691 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.692 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.693 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.694 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.695 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.696 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.697 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.698 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.699 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.700 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.701 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.702 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.703 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.704 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.705 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.706 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.707 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.708 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.709 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.710 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.711 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.712 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.713 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.714 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.715 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.716 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.717 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.718 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.719 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.720 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.721 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.722 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.723 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.724 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.725 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.726 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.727 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.728 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.729 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.730 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.731 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.732 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.733 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.734 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.735 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.736 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.737 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.738 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.739 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.740 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.741 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.742 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.743 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.744 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.745 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.746 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.747 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.748 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.749 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.750 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.751 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.752 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.753 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.754 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.755 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.756 279667 WARNING oslo_config.cfg [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: and ``live_migration_inbound_addr`` respectively.
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: ).  Its value may be silently ignored in the future.
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.756 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.757 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.758 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_secret_uuid        = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.759 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.760 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.761 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.762 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.763 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.764 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.765 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.766 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.767 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.768 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.769 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.770 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.771 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.772 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.773 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.774 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.775 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.776 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.777 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.778 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.779 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.780 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.781 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.782 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain sudo[279966]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sowocrkndypvxbyjcneyyfblxcdipkfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580123.4979415-3314-224604561936569/AnsiballZ_file.py
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.783 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.784 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.785 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.786 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain sudo[279966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.787 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.788 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.789 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.790 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.791 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.792 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.793 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.794 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.795 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.796 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.797 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.798 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.799 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.800 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.801 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.802 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.803 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.804 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.805 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.806 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.807 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.808 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.809 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.810 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.811 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.812 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.813 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.814 279667 DEBUG oslo_service.service [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.815 279667 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.826 279667 INFO nova.virt.node [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.827 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.839 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fc2c2716520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.841 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fc2c2716520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.842 279667 INFO nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Connection event '1' reason 'None'
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.846 279667 INFO nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host capabilities <capabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <host>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <uuid>f44a30b3-674b-4e65-a07d-fb3d71d4ae11</uuid>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <arch>x86_64</arch>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model>EPYC-Rome-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <vendor>AMD</vendor>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <microcode version='16777317'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <signature family='23' model='49' stepping='0'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='x2apic'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='tsc-deadline'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='osxsave'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='hypervisor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='tsc_adjust'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='spec-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='stibp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='arch-capabilities'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='cmp_legacy'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='topoext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='virt-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='lbrv'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='tsc-scale'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='vmcb-clean'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='pause-filter'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='pfthreshold'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='svme-addr-chk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='rdctl-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='skip-l1dfl-vmentry'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='mds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature name='pschange-mc-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <pages unit='KiB' size='4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <pages unit='KiB' size='2048'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <pages unit='KiB' size='1048576'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <power_management>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <suspend_mem/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <suspend_disk/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <suspend_hybrid/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </power_management>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <iommu support='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <migration_features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <live/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <uri_transports>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <uri_transport>tcp</uri_transport>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <uri_transport>rdma</uri_transport>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </uri_transports>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </migration_features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <topology>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <cells num='1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <cell id='0'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           <memory unit='KiB'>16116612</memory>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           <pages unit='KiB' size='2048'>0</pages>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           <distances>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <sibling id='0' value='10'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           </distances>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           <cpus num='8'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:           </cpus>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         </cell>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </cells>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </topology>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <cache>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </cache>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <secmodel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model>selinux</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <doi>0</doi>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </secmodel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <secmodel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model>dac</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <doi>0</doi>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </secmodel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </host>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <guest>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <os_type>hvm</os_type>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <arch name='i686'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <wordsize>32</wordsize>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <domain type='qemu'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <domain type='kvm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </arch>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <pae/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <nonpae/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <acpi default='on' toggle='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <apic default='on' toggle='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <cpuselection/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <deviceboot/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <externalSnapshot/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </guest>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <guest>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <os_type>hvm</os_type>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <arch name='x86_64'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <wordsize>64</wordsize>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <domain type='qemu'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <domain type='kvm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </arch>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <acpi default='on' toggle='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <apic default='on' toggle='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <cpuselection/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <deviceboot/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <externalSnapshot/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </guest>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: </capabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.852 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.854 279667 DEBUG nova.virt.libvirt.volume.mount [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.856 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: <domainCapabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <domain>kvm</domain>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <arch>i686</arch>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <vcpu max='1024'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <iothreads supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <os supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <enum name='firmware'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <loader supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>rom</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pflash</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='readonly'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>yes</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='secure'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </loader>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </os>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='maximumMigratable'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <vendor>AMD</vendor>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='succor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='custom' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <memoryBacking supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <enum name='sourceType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>file</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>anonymous</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>memfd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </memoryBacking>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <devices>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <disk supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='diskDevice'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>disk</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>cdrom</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>floppy</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>lun</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>fdc</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>sata</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </disk>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <graphics supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vnc</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>egl-headless</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </graphics>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <video supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='modelType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vga</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>cirrus</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>none</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>bochs</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>ramfb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </video>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <hostdev supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='mode'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>subsystem</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='startupPolicy'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>mandatory</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>requisite</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>optional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='subsysType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pci</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='capsType'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='pciBackend'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </hostdev>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <rng supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>random</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>egd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </rng>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <filesystem supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='driverType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>path</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>handle</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtiofs</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </filesystem>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <tpm supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tpm-tis</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tpm-crb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>emulator</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>external</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendVersion'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>2.0</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </tpm>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <redirdev supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </redirdev>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <channel supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </channel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <crypto supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>qemu</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </crypto>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <interface supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>passt</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </interface>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <panic supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>isa</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>hyperv</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </panic>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <console supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>null</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vc</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>dev</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>file</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pipe</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>stdio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>udp</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tcp</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>qemu-vdagent</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </console>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </devices>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <gic supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <genid supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <backup supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <async-teardown supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <s390-pv supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <ps2 supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <tdx supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <sev supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <sgx supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <hyperv supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='features'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>relaxed</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vapic</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>spinlocks</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vpindex</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>runtime</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>synic</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>stimer</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>reset</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vendor_id</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>frequencies</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>reenlightenment</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tlbflush</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>ipi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>avic</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>emsr_bitmap</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>xmm_input</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <defaults>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </defaults>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </hyperv>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <launchSecurity supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: </domainCapabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.876 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: <domainCapabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <domain>kvm</domain>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <arch>i686</arch>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <vcpu max='240'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <iothreads supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <os supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <enum name='firmware'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <loader supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>rom</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pflash</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='readonly'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>yes</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='secure'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </loader>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </os>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='maximumMigratable'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <vendor>AMD</vendor>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='succor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='custom' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <memoryBacking supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <enum name='sourceType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>file</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>anonymous</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>memfd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </memoryBacking>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <devices>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <disk supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='diskDevice'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>disk</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>cdrom</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>floppy</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>lun</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>ide</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>fdc</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>sata</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </disk>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <graphics supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vnc</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>egl-headless</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </graphics>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <video supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='modelType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vga</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>cirrus</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>none</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>bochs</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>ramfb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </video>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <hostdev supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='mode'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>subsystem</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='startupPolicy'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>mandatory</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>requisite</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>optional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='subsysType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pci</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='capsType'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='pciBackend'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </hostdev>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <rng supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>random</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>egd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </rng>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <filesystem supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='driverType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>path</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>handle</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>virtiofs</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </filesystem>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <tpm supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tpm-tis</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tpm-crb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>emulator</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>external</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendVersion'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>2.0</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </tpm>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <redirdev supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </redirdev>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <channel supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </channel>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <crypto supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>qemu</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </crypto>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <interface supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='backendType'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>passt</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </interface>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <panic supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>isa</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>hyperv</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </panic>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <console supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>null</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vc</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>dev</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>file</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pipe</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>stdio</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>udp</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tcp</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>qemu-vdagent</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </console>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </devices>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <gic supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <genid supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <backup supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <async-teardown supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <s390-pv supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <ps2 supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <tdx supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <sev supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <sgx supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <hyperv supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='features'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>relaxed</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vapic</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>spinlocks</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vpindex</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>runtime</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>synic</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>stimer</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>reset</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>vendor_id</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>frequencies</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>reenlightenment</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>tlbflush</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>ipi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>avic</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>emsr_bitmap</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>xmm_input</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <defaults>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </defaults>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </hyperv>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <launchSecurity supported='no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </features>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: </domainCapabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.914 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.918 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]: <domainCapabilities>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <domain>kvm</domain>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <arch>x86_64</arch>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <vcpu max='1024'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <iothreads supported='yes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <os supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <enum name='firmware'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>efi</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <loader supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>rom</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>pflash</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='readonly'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>yes</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='secure'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>yes</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </loader>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   </os>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:   <cpu>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <enum name='maximumMigratable'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <vendor>AMD</vendor>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='succor'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:     <mode name='custom' supported='yes'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain sudo[279966]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain python3.9[279968]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:23 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </cpu>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <memoryBacking supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <enum name='sourceType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>file</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>anonymous</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>memfd</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </memoryBacking>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <devices>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <disk supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='diskDevice'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>disk</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>cdrom</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>floppy</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>lun</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>fdc</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>sata</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </disk>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <graphics supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vnc</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>egl-headless</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </graphics>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <video supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='modelType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vga</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>cirrus</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>none</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>bochs</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>ramfb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </video>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <hostdev supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='mode'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>subsystem</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='startupPolicy'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>mandatory</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>requisite</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>optional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='subsysType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pci</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='capsType'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='pciBackend'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </hostdev>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <rng supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>random</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>egd</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </rng>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <filesystem supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='driverType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>path</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>handle</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtiofs</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </filesystem>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <tpm supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tpm-tis</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tpm-crb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>emulator</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>external</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendVersion'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>2.0</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </tpm>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <redirdev supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </redirdev>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <channel supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </channel>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <crypto supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>qemu</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </crypto>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <interface supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>passt</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </interface>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <panic supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>isa</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>hyperv</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </panic>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <console supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>null</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vc</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>dev</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>file</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pipe</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>stdio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>udp</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tcp</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>qemu-vdagent</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </console>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </devices>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <features>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <gic supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <genid supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <backup supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <async-teardown supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <s390-pv supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <ps2 supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <tdx supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <sev supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <sgx supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <hyperv supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='features'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>relaxed</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vapic</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>spinlocks</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vpindex</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>runtime</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>synic</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>stimer</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>reset</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vendor_id</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>frequencies</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>reenlightenment</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tlbflush</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>ipi</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>avic</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>emsr_bitmap</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>xmm_input</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <defaults>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </defaults>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </hyperv>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <launchSecurity supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </features>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: </domainCapabilities>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:23.988 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: <domainCapabilities>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <domain>kvm</domain>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <arch>x86_64</arch>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <vcpu max='240'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <iothreads supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <os supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <enum name='firmware'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <loader supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>rom</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pflash</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='readonly'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>yes</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='secure'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>no</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </loader>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </os>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <cpu>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='maximumMigratable'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>on</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>off</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <vendor>AMD</vendor>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='succor'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <mode name='custom' supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ddpd-u'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sha512'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sm3'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sm4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Denverton-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amd-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='auto-ibrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='perfmon-v2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbpb'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='stibp-always-on'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='EPYC-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-128'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-256'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx10-512'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='prefetchiti'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Haswell-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512er'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512pf'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fma4'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tbm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xop'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='amx-tile'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-bf16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-fp16'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bitalg'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrc'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fzrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='la57'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='taa-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ifma'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cmpccxadd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fbsdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='fsrs'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ibrs-all'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='intel-psfd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='lam'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mcdt-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pbrsb-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='psdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rfds-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='serialize'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vaes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='hle'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='rtm'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512bw'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512cd'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512dq'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512f'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='avx512vl'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='invpcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pcid'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='pku'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='mpx'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='core-capability'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='split-lock-detect'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='cldemote'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='erms'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='gfni'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdir64b'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='movdiri'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='xsaves'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='athlon-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='core2duo-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='coreduo-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='n270-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='ss'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <blockers model='phenom-v1'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnow'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <feature name='3dnowext'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </blockers>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </mode>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </cpu>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <memoryBacking supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <enum name='sourceType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>file</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>anonymous</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <value>memfd</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </memoryBacking>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <devices>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <disk supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='diskDevice'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>disk</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>cdrom</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>floppy</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>lun</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>ide</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>fdc</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>sata</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </disk>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <graphics supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vnc</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>egl-headless</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </graphics>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <video supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='modelType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vga</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>cirrus</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>none</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>bochs</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>ramfb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </video>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <hostdev supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='mode'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>subsystem</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='startupPolicy'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>mandatory</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>requisite</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>optional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='subsysType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pci</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>scsi</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='capsType'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='pciBackend'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </hostdev>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <rng supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtio-non-transitional</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>random</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>egd</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </rng>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <filesystem supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='driverType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>path</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>handle</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>virtiofs</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </filesystem>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <tpm supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tpm-tis</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tpm-crb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>emulator</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>external</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendVersion'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>2.0</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </tpm>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <redirdev supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='bus'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>usb</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </redirdev>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <channel supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </channel>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <crypto supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>qemu</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendModel'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>builtin</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </crypto>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <interface supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='backendType'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>default</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>passt</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </interface>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <panic supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='model'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>isa</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>hyperv</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </panic>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <console supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='type'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>null</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vc</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pty</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>dev</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>file</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>pipe</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>stdio</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>udp</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tcp</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>unix</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>qemu-vdagent</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>dbus</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </console>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </devices>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   <features>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <gic supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <genid supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <backup supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <async-teardown supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <s390-pv supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <ps2 supported='yes'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <tdx supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <sev supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <sgx supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <hyperv supported='yes'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <enum name='features'>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>relaxed</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vapic</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>spinlocks</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vpindex</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>runtime</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>synic</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>stimer</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>reset</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>vendor_id</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>frequencies</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>reenlightenment</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>tlbflush</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>ipi</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>avic</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>emsr_bitmap</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <value>xmm_input</value>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </enum>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       <defaults>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:       </defaults>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     </hyperv>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:     <launchSecurity supported='no'/>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:   </features>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: </domainCapabilities>
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.055 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.056 279667 INFO nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Secure Boot support detected
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.059 279667 INFO nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.059 279667 INFO nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.073 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.095 279667 INFO nova.virt.node [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.111 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Verified node 41976f9f-3656-482f-8ad0-c81e454a3952 matches my host np0005625204.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.139 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.143 279667 DEBUG nova.virt.libvirt.vif [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005625204.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T08:23:36Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.144 279667 DEBUG nova.network.os_vif_util [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.145 279667 DEBUG nova.network.os_vif_util [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.145 279667 DEBUG os_vif [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 20 09:35:24 np0005625204.localdomain sudo[280042]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfpwncbiaquedouwfkvilearhexugvft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580123.4979415-3314-224604561936569/AnsiballZ_stat.py
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.189 279667 DEBUG ovsdbapp.backend.ovs_idl [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.189 279667 DEBUG ovsdbapp.backend.ovs_idl [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.189 279667 DEBUG ovsdbapp.backend.ovs_idl [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.190 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.191 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:24 np0005625204.localdomain sudo[280042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.191 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.192 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.193 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.198 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.214 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.214 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.214 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.215 279667 INFO oslo.privsep.daemon [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp7ui3496_/privsep.sock']
Feb 20 09:35:24 np0005625204.localdomain python3.9[280044]: ansible-stat Invoked with path=/etc/systemd/system/edpm_nova_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:35:24 np0005625204.localdomain sudo[280042]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:24 np0005625204.localdomain sudo[280156]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhqlxqsjbqwtzlativpvqixawuphipgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580124.4658573-3314-64964355351845/AnsiballZ_copy.py
Feb 20 09:35:24 np0005625204.localdomain sudo[280156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.822 279667 INFO oslo.privsep.daemon [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.701 280119 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.705 280119 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.709 280119 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 20 09:35:24 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:24.709 280119 INFO oslo.privsep.daemon [-] privsep daemon running as pid 280119
Feb 20 09:35:24 np0005625204.localdomain python3.9[280158]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1771580124.4658573-3314-64964355351845/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:25 np0005625204.localdomain sudo[280156]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.106 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.107 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.109 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.110 279667 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.110 279667 INFO os_vif [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.111 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.114 279667 DEBUG nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.114 279667 INFO nova.compute.manager [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.222 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.222 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.222 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.223 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.224 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:25 np0005625204.localdomain sudo[280234]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdxwwcgqcrcanceeouuklnjxhbflzirx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580124.4658573-3314-64964355351845/AnsiballZ_systemd.py
Feb 20 09:35:25 np0005625204.localdomain sudo[280234]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.684 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.754 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.755 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:35:25 np0005625204.localdomain python3.9[280236]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:35:25 np0005625204.localdomain sudo[280234]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.987 279667 WARNING nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.988 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12188MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.989 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:25 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:25.989 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.134 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.134 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.136 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.196 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.213 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.214 279667 DEBUG nova.compute.provider_tree [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.230 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.248 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_FDC,COMPUTE_VIOMMU_MODEL_VIRTIO,HW_CPU_X86_SVM,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_AMD_SVM,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE4A,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_AVX,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_SHA,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_CLMUL,COMPUTE_NODE,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_FMA3,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_BMI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.288 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.538 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:35:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:35:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:35:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:35:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:35:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.793 279667 DEBUG oslo_concurrency.processutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.799 279667 DEBUG nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.800 279667 INFO nova.virt.libvirt.host [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] kernel doesn't support AMD SEV
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.801 279667 DEBUG nova.compute.provider_tree [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.802 279667 DEBUG nova.virt.libvirt.driver [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.824 279667 DEBUG nova.scheduler.client.report [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.849 279667 DEBUG nova.compute.resource_tracker [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.849 279667 DEBUG oslo_concurrency.lockutils [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.849 279667 DEBUG nova.service [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.880 279667 DEBUG nova.service [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 20 09:35:26 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:26.881 279667 DEBUG nova.servicegroup.drivers.db [None req-0ca34fcc-2e44-48db-af24-3155375e9b17 - - - - - -] DB_Driver: join new ServiceGroup member np0005625204.localdomain to the compute group, service = <Service: host=np0005625204.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 20 09:35:27 np0005625204.localdomain python3.9[280370]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 20 09:35:28 np0005625204.localdomain sudo[280478]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkckkkhvcgmzeyenvojrqoyylmzfqlbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580128.5151887-3437-222993006237039/AnsiballZ_stat.py
Feb 20 09:35:28 np0005625204.localdomain sudo[280478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:28 np0005625204.localdomain python3.9[280480]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 20 09:35:29 np0005625204.localdomain sudo[280478]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:29 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:29.231 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:29 np0005625204.localdomain sudo[280568]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygdrficdfbhiijgmvttieefwvexeqxfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580128.5151887-3437-222993006237039/AnsiballZ_copy.py
Feb 20 09:35:29 np0005625204.localdomain sudo[280568]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:29 np0005625204.localdomain python3.9[280570]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1771580128.5151887-3437-222993006237039/.source.yaml _original_basename=.teucteh_ follow=False checksum=1398ce19331de48b62372cc81e1a3aaab78c97b5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:35:29 np0005625204.localdomain sudo[280568]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:35:30 np0005625204.localdomain podman[280588]: 2026-02-20 09:35:30.154557934 +0000 UTC m=+0.089263357 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:35:30 np0005625204.localdomain podman[280588]: 2026-02-20 09:35:30.164272103 +0000 UTC m=+0.098977586 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:35:30 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:35:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16664 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A13B540000000001030307) 
Feb 20 09:35:30 np0005625204.localdomain python3.9[280701]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:35:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:35:31 np0005625204.localdomain systemd[1]: tmp-crun.kS4XhE.mount: Deactivated successfully.
Feb 20 09:35:31 np0005625204.localdomain podman[280719]: 2026-02-20 09:35:31.158493854 +0000 UTC m=+0.094770437 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, architecture=x86_64, build-date=2026-02-05T04:57:10Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9)
Feb 20 09:35:31 np0005625204.localdomain podman[280719]: 2026-02-20 09:35:31.200313095 +0000 UTC m=+0.136589668 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.7, build-date=2026-02-05T04:57:10Z, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:35:31 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:35:31 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:31.541 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16665 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A13F680000000001030307) 
Feb 20 09:35:31 np0005625204.localdomain python3.9[280830]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:35:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14682 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A141680000000001030307) 
Feb 20 09:35:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:35:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:35:33 np0005625204.localdomain systemd[1]: tmp-crun.zg9zzQ.mount: Deactivated successfully.
Feb 20 09:35:33 np0005625204.localdomain podman[280939]: 2026-02-20 09:35:33.161968798 +0000 UTC m=+0.097586963 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:35:33 np0005625204.localdomain python3.9[280938]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 20 09:35:33 np0005625204.localdomain podman[280939]: 2026-02-20 09:35:33.247383425 +0000 UTC m=+0.183001630 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 20 09:35:33 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:35:33 np0005625204.localdomain podman[280940]: 2026-02-20 09:35:33.248808189 +0000 UTC m=+0.181439361 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 20 09:35:33 np0005625204.localdomain podman[280940]: 2026-02-20 09:35:33.33208623 +0000 UTC m=+0.264717452 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:35:33 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:35:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16666 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A147680000000001030307) 
Feb 20 09:35:34 np0005625204.localdomain sudo[281089]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fsbhlzgpjrymfkzwiuzhkpfnsanogpyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580133.6660664-3588-257959137932253/AnsiballZ_podman_container.py
Feb 20 09:35:34 np0005625204.localdomain sudo[281089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:34 np0005625204.localdomain systemd[1]: tmp-crun.pPg1UP.mount: Deactivated successfully.
Feb 20 09:35:34 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:34.234 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:34 np0005625204.localdomain python3.9[281091]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 20 09:35:34 np0005625204.localdomain sudo[281089]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:34 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 120.4 (401 of 333 items), suggesting rotation.
Feb 20 09:35:34 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:35:34 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:35:34 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:35:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32286 DF PROTO=TCP SPT=54824 DPT=9102 SEQ=4058402939 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A14B680000000001030307) 
Feb 20 09:35:35 np0005625204.localdomain sshd[281146]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:35:35 np0005625204.localdomain sshd[281146]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 09:35:35 np0005625204.localdomain sshd[281146]: Connection closed by 183.220.237.28 port 34926
Feb 20 09:35:35 np0005625204.localdomain sudo[281223]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcomxnetkmudeeasqrdgcpqeplcqjnvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580135.303225-3611-195009955163750/AnsiballZ_systemd.py
Feb 20 09:35:35 np0005625204.localdomain sudo[281223]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:35 np0005625204.localdomain python3.9[281225]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 20 09:35:35 np0005625204.localdomain systemd[1]: Stopping nova_compute container...
Feb 20 09:35:35 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:35.988 279667 DEBUG oslo_privsep.comm [-] EOF on privsep read channel _reader_main /usr/lib/python3.9/site-packages/oslo_privsep/comm.py:170
Feb 20 09:35:36 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:36.544 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16667 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A157280000000001030307) 
Feb 20 09:35:38 np0005625204.localdomain sshd[281243]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:35:39 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:39.237 279667 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:40 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:40.100 279667 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 20 09:35:40 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:40.102 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:35:40 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:40.103 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:35:40 np0005625204.localdomain nova_compute[279644]: 2026-02-20 09:35:40.103 279667 DEBUG oslo_concurrency.lockutils [None req-4c1370ec-39dc-4726-a903-18b2ed3e09da - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:35:40 np0005625204.localdomain virtqemud[206495]: End of file while reading data: Input/output error
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: libpod-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope: Deactivated successfully.
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: libpod-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope: Consumed 4.819s CPU time.
Feb 20 09:35:40 np0005625204.localdomain podman[281229]: 2026-02-20 09:35:40.51155153 +0000 UTC m=+4.604321140 container died 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute)
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3-userdata-shm.mount: Deactivated successfully.
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace-merged.mount: Deactivated successfully.
Feb 20 09:35:40 np0005625204.localdomain podman[281229]: 2026-02-20 09:35:40.616693246 +0000 UTC m=+4.709462826 container cleanup 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:35:40 np0005625204.localdomain podman[281229]: nova_compute
Feb 20 09:35:40 np0005625204.localdomain podman[281244]: 2026-02-20 09:35:40.629766639 +0000 UTC m=+0.104917489 container cleanup 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: libpod-conmon-4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3.scope: Deactivated successfully.
Feb 20 09:35:40 np0005625204.localdomain podman[281270]: error opening file `/run/crun/4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3/status`: No such file or directory
Feb 20 09:35:40 np0005625204.localdomain podman[281259]: 2026-02-20 09:35:40.720007014 +0000 UTC m=+0.062772538 container cleanup 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:35:40 np0005625204.localdomain podman[281259]: nova_compute
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: Stopped nova_compute container.
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: Starting nova_compute container...
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:35:40 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:40 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:40 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:40 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:40 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dac7f9ccbc01f082df2e3108011e8d0ee352a0c18cad03ce9519f50d9549eace/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:40 np0005625204.localdomain podman[281274]: 2026-02-20 09:35:40.868209629 +0000 UTC m=+0.111630886 container init 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:35:40 np0005625204.localdomain podman[281274]: 2026-02-20 09:35:40.87793502 +0000 UTC m=+0.121356287 container start 4f3d0e73dfc59be06a09c49985273678211602e34830c212e2fe9c4ba8994fb3 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-30a66d126d7b377a2bbf9bc0d57e51bdf55bb85eee6d379d83e3d3aa2e3d7293-9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'nova', 'volumes': ['/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/nova:/var/lib/kolla/config_files/src:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/src/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:35:40 np0005625204.localdomain podman[281274]: nova_compute
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: + sudo -E kolla_set_configs
Feb 20 09:35:40 np0005625204.localdomain systemd[1]: Started nova_compute container.
Feb 20 09:35:40 np0005625204.localdomain sudo[281223]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Validating config file
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying service configuration files
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /etc/ceph
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Creating directory /etc/ceph
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ssh-config to /var/lib/nova/.ssh/config
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Copying /var/lib/kolla/config_files/src/run-on-host to /usr/sbin/iscsiadm
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Writing out command to execute
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: ++ cat /run_command
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: + CMD=nova-compute
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: + ARGS=
Feb 20 09:35:40 np0005625204.localdomain nova_compute[281288]: + sudo kolla_copy_cacerts
Feb 20 09:35:41 np0005625204.localdomain nova_compute[281288]: + [[ ! -n '' ]]
Feb 20 09:35:41 np0005625204.localdomain nova_compute[281288]: + . kolla_extend_start
Feb 20 09:35:41 np0005625204.localdomain nova_compute[281288]: Running command: 'nova-compute'
Feb 20 09:35:41 np0005625204.localdomain nova_compute[281288]: + echo 'Running command: '\''nova-compute'\'''
Feb 20 09:35:41 np0005625204.localdomain nova_compute[281288]: + umask 0022
Feb 20 09:35:41 np0005625204.localdomain nova_compute[281288]: + exec nova-compute
Feb 20 09:35:41 np0005625204.localdomain sudo[281407]:     zuul : TTY=pts/1 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aokevafosetdasypeizjhpqhlcuahsri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1771580141.2445927-3639-681343369303/AnsiballZ_podman_container.py
Feb 20 09:35:41 np0005625204.localdomain sudo[281407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 20 09:35:41 np0005625204.localdomain python3.9[281409]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 20 09:35:41 np0005625204.localdomain systemd[1]: Started libpod-conmon-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4.scope.
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:35:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 20 09:35:42 np0005625204.localdomain podman[281433]: 2026-02-20 09:35:42.018082855 +0000 UTC m=+0.131884772 container init d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 20 09:35:42 np0005625204.localdomain podman[281433]: 2026-02-20 09:35:42.027208176 +0000 UTC m=+0.141010093 container start d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 20 09:35:42 np0005625204.localdomain python3.9[281409]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Applying nova statedir ownership
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3 already 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3 to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/console.log
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/_base already 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/_base to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ccf3906461ed5c78e2a6f963756ac32b4b049bce
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/_base/ephemeral_1_0706d66
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/instances/locks already 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances/locks to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ccf3906461ed5c78e2a6f963756ac32b4b049bce
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/instances/locks/nova-ephemeral_1_0706d66
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/d301d14069645d8c23fee2987984776b3e88a570e1aa96d6cf3e31fa880385fd
Feb 20 09:35:42 np0005625204.localdomain nova_compute_init[281454]: INFO:nova_statedir:Nova statedir ownership complete
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: libpod-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4.scope: Deactivated successfully.
Feb 20 09:35:42 np0005625204.localdomain podman[281455]: 2026-02-20 09:35:42.104790511 +0000 UTC m=+0.059223759 container died d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, managed_by=edpm_ansible, config_id=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 09:35:42 np0005625204.localdomain podman[281466]: 2026-02-20 09:35:42.179579029 +0000 UTC m=+0.077430730 container cleanup d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=nova_compute_init, container_name=nova_compute_init, config_data={'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False, 'EDPM_CONFIG_HASH': '9df7a35deb9ef647f5642f91b977ea1d47ad0919d3ab9d3f5127875cfc62e74b'}, 'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'net': 'none', 'privileged': False, 'restart': 'never', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: libpod-conmon-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4.scope: Deactivated successfully.
Feb 20 09:35:42 np0005625204.localdomain sudo[281407]: pam_unix(sudo:session): session closed for user root
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d870576fe32644ef4ff787e84b8ce7e75b29857bb1a6c6466c1660fea99b7567-merged.mount: Deactivated successfully.
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d59f255e6fe38573122d2a25e6b609c9536810041dc53b301b8d3fc40522c1a4-userdata-shm.mount: Deactivated successfully.
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.594 281292 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.594 281292 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.595 281292 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.595 281292 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.703 281292 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.713 281292 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:42.713 281292 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 20 09:35:42 np0005625204.localdomain sshd[264753]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Feb 20 09:35:42 np0005625204.localdomain systemd[1]: session-60.scope: Consumed 1min 25.257s CPU time.
Feb 20 09:35:42 np0005625204.localdomain systemd-logind[759]: Session 60 logged out. Waiting for processes to exit.
Feb 20 09:35:42 np0005625204.localdomain systemd-logind[759]: Removed session 60.
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.167 281292 INFO nova.virt.driver [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.282 281292 INFO nova.compute.provider_config [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.302 281292 DEBUG oslo_concurrency.lockutils [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.302 281292 DEBUG oslo_concurrency.lockutils [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.302 281292 DEBUG oslo_concurrency.lockutils [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.303 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.304 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.305 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.306 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console_host                   = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.307 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.308 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.309 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.309 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.309 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.310 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.311 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.312 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.312 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] host                           = np0005625204.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.312 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.313 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.314 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.315 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.316 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.317 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.318 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.319 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.320 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.321 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.322 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.323 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.324 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.325 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.326 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.327 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.328 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.329 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.330 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.330 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.330 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.331 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.332 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.333 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.334 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.335 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.336 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.337 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.338 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.339 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.340 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.341 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.342 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.343 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.344 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.345 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.346 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.347 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.os_region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.348 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.349 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.350 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.351 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.352 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.353 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.354 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.355 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.356 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.357 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.358 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.359 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.360 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.361 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.362 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.363 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.364 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.365 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.365 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.365 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.365 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.365 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.365 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.366 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.367 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.368 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.369 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.barbican_region_name  = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.370 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.371 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.372 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.373 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.374 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.375 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.376 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.377 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.378 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.379 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.380 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.381 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.382 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.383 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.383 281292 WARNING oslo_config.cfg [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: and ``live_migration_inbound_addr`` respectively.
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: ).  Its value may be silently ignored in the future.
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.383 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.383 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.383 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.383 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.384 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.385 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rbd_secret_uuid        = a8557ee9-b55d-5519-942c-cf8f6172f1d8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.386 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.387 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.387 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.387 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.387 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.388 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.388 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.388 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.388 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.388 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.389 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.390 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.391 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.392 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.393 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.394 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.395 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.396 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.397 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.398 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.399 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.400 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.401 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.402 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.403 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.404 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.405 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.406 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.407 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.408 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.409 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.410 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.411 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.412 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.413 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.414 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.415 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.416 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.417 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.417 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.417 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.417 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.417 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.417 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.418 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.419 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.420 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.421 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.422 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.423 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.424 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.425 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.426 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.427 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.428 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.429 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.429 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.429 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.429 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.429 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.429 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.430 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.431 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.432 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.433 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.434 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.435 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.436 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.437 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.438 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.439 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.440 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.440 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.440 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.440 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.440 281292 DEBUG oslo_service.service [None req-af63a5c2-b91d-442c-8511-98334aea9e32 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.441 281292 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.459 281292 INFO nova.virt.node [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.459 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.460 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.460 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.460 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.472 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4b18762190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.474 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4b18762190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.475 281292 INFO nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Connection event '1' reason 'None'
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.481 281292 INFO nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Libvirt host capabilities <capabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <host>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <uuid>f44a30b3-674b-4e65-a07d-fb3d71d4ae11</uuid>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <arch>x86_64</arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model>EPYC-Rome-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <vendor>AMD</vendor>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <microcode version='16777317'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <signature family='23' model='49' stepping='0'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='x2apic'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='tsc-deadline'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='osxsave'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='hypervisor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='tsc_adjust'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='spec-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='stibp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='arch-capabilities'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='cmp_legacy'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='topoext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='virt-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='lbrv'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='tsc-scale'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='vmcb-clean'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='pause-filter'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='pfthreshold'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='svme-addr-chk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='rdctl-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='skip-l1dfl-vmentry'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='mds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature name='pschange-mc-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <pages unit='KiB' size='4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <pages unit='KiB' size='2048'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <pages unit='KiB' size='1048576'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <power_management>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <suspend_mem/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <suspend_disk/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <suspend_hybrid/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </power_management>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <iommu support='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <migration_features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <live/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <uri_transports>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <uri_transport>tcp</uri_transport>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <uri_transport>rdma</uri_transport>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </uri_transports>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </migration_features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <topology>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <cells num='1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <cell id='0'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           <memory unit='KiB'>16116612</memory>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           <pages unit='KiB' size='4'>4029153</pages>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           <pages unit='KiB' size='2048'>0</pages>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           <distances>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <sibling id='0' value='10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           </distances>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           <cpus num='8'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 20 09:35:43 np0005625204.localdomain rsyslogd[758]: imjournal from <localhost:nova_compute>: begin to drop messages due to rate-limiting
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:           </cpus>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         </cell>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </cells>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </topology>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <cache>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </cache>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <secmodel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model>selinux</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <doi>0</doi>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </secmodel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <secmodel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model>dac</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <doi>0</doi>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </secmodel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </host>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <guest>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <os_type>hvm</os_type>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <arch name='i686'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <wordsize>32</wordsize>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <domain type='qemu'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <domain type='kvm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <pae/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <nonpae/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <acpi default='on' toggle='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <apic default='on' toggle='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <cpuselection/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <deviceboot/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <externalSnapshot/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </guest>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <guest>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <os_type>hvm</os_type>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <arch name='x86_64'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <wordsize>64</wordsize>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <domain type='qemu'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <domain type='kvm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <acpi default='on' toggle='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <apic default='on' toggle='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <cpuselection/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <deviceboot/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <disksnapshot default='on' toggle='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <externalSnapshot/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </guest>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: </capabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.489 281292 DEBUG nova.virt.libvirt.volume.mount [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.490 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.495 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: <domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <domain>kvm</domain>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <arch>i686</arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <vcpu max='240'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <iothreads supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <os supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='firmware'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <loader supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>rom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pflash</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='readonly'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>yes</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='secure'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </loader>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </os>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='maximumMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <vendor>AMD</vendor>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='succor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='custom' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <memoryBacking supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='sourceType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>anonymous</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>memfd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </memoryBacking>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <disk supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='diskDevice'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>disk</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cdrom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>floppy</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>lun</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ide</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>fdc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>sata</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <graphics supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vnc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egl-headless</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </graphics>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <video supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='modelType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vga</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cirrus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>none</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>bochs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ramfb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </video>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hostdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='mode'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>subsystem</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='startupPolicy'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>mandatory</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>requisite</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>optional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='subsysType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pci</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='capsType'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='pciBackend'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hostdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <rng supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>random</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </rng>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <filesystem supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='driverType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>path</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>handle</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtiofs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </filesystem>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tpm supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-tis</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-crb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emulator</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>external</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendVersion'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>2.0</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </tpm>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <redirdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </redirdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <channel supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </channel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <crypto supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </crypto>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <interface supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>passt</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </interface>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <panic supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>isa</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>hyperv</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </panic>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <console supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>null</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dev</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pipe</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stdio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>udp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tcp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu-vdagent</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </console>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <gic supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <genid supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backup supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <async-teardown supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <s390-pv supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <ps2 supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tdx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sev supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sgx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hyperv supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='features'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>relaxed</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vapic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>spinlocks</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vpindex</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>runtime</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>synic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stimer</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reset</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vendor_id</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>frequencies</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reenlightenment</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tlbflush</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ipi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>avic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emsr_bitmap</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>xmm_input</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hyperv>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <launchSecurity supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: </domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.502 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: <domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <domain>kvm</domain>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <arch>i686</arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <vcpu max='1024'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <iothreads supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <os supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='firmware'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <loader supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>rom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pflash</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='readonly'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>yes</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='secure'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </loader>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </os>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='maximumMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <vendor>AMD</vendor>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='succor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='custom' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <memoryBacking supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='sourceType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>anonymous</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>memfd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </memoryBacking>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <disk supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='diskDevice'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>disk</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cdrom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>floppy</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>lun</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>fdc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>sata</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <graphics supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vnc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egl-headless</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </graphics>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <video supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='modelType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vga</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cirrus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>none</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>bochs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ramfb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </video>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hostdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='mode'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>subsystem</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='startupPolicy'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>mandatory</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>requisite</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>optional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='subsysType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pci</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='capsType'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='pciBackend'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hostdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <rng supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>random</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </rng>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <filesystem supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='driverType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>path</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>handle</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtiofs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </filesystem>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tpm supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-tis</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-crb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emulator</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>external</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendVersion'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>2.0</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </tpm>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <redirdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </redirdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <channel supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </channel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <crypto supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </crypto>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <interface supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>passt</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </interface>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <panic supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>isa</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>hyperv</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </panic>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <console supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>null</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dev</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pipe</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stdio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>udp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tcp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu-vdagent</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </console>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <gic supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <genid supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backup supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <async-teardown supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <s390-pv supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <ps2 supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tdx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sev supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sgx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hyperv supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='features'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>relaxed</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vapic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>spinlocks</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vpindex</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>runtime</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>synic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stimer</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reset</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vendor_id</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>frequencies</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reenlightenment</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tlbflush</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ipi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>avic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emsr_bitmap</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>xmm_input</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hyperv>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <launchSecurity supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: </domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.557 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.563 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: <domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <domain>kvm</domain>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <arch>x86_64</arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <vcpu max='240'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <iothreads supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <os supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='firmware'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <loader supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>rom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pflash</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='readonly'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>yes</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='secure'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </loader>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </os>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='maximumMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <vendor>AMD</vendor>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='succor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='custom' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <memoryBacking supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='sourceType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>anonymous</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>memfd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </memoryBacking>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <disk supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='diskDevice'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>disk</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cdrom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>floppy</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>lun</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ide</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>fdc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>sata</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <graphics supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vnc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egl-headless</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </graphics>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <video supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='modelType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vga</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cirrus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>none</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>bochs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ramfb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </video>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hostdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='mode'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>subsystem</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='startupPolicy'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>mandatory</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>requisite</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>optional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='subsysType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pci</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='capsType'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='pciBackend'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hostdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <rng supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>random</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </rng>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <filesystem supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='driverType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>path</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>handle</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtiofs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </filesystem>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tpm supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-tis</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-crb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emulator</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>external</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendVersion'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>2.0</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </tpm>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <redirdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </redirdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <channel supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </channel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <crypto supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </crypto>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <interface supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>passt</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </interface>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <panic supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>isa</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>hyperv</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </panic>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <console supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>null</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dev</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pipe</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stdio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>udp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tcp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu-vdagent</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </console>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <gic supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <genid supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backup supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <async-teardown supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <s390-pv supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <ps2 supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tdx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sev supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sgx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hyperv supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='features'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>relaxed</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vapic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>spinlocks</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vpindex</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>runtime</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>synic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stimer</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reset</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vendor_id</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>frequencies</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reenlightenment</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tlbflush</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ipi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>avic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emsr_bitmap</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>xmm_input</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hyperv>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <launchSecurity supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: </domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.621 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: <domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <path>/usr/libexec/qemu-kvm</path>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <domain>kvm</domain>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <arch>x86_64</arch>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <vcpu max='1024'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <iothreads supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <os supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='firmware'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>efi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <loader supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>rom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pflash</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='readonly'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>yes</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='secure'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>yes</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>no</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </loader>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </os>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-passthrough' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='hostPassthroughMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='maximum' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='maximumMigratable'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>on</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>off</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='host-model' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <vendor>AMD</vendor>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='x2apic'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-deadline'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='hypervisor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc_adjust'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='spec-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='stibp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='cmp_legacy'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='overflow-recov'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='succor'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='amd-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='virt-ssbd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lbrv'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='tsc-scale'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='vmcb-clean'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pause-filter'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='pfthreshold'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='svme-addr-chk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <feature policy='disable' name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <mode name='custom' supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Broadwell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cascadelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='ClearwaterForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ddpd-u'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sha512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm3'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sm4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Cooperlake-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Denverton-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Dhyana-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Genoa-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Milan-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Rome-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-Turin-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amd-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='auto-ibrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vp2intersect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fs-gs-base-ns'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibpb-brtype'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='no-nested-data-bp'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='null-sel-clr-base'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='perfmon-v2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbpb'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='srso-user-kernel-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='stibp-always-on'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='EPYC-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='GraniteRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-128'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-256'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx10-512'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='prefetchiti'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Haswell-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-noTSX'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v6'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Icelake-Server-v7'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='IvyBridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='KnightsMill-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4fmaps'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-4vnniw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512er'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512pf'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G4-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Opteron_G5-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fma4'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tbm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xop'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SapphireRapids-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='amx-tile'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-bf16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-fp16'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512-vpopcntdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bitalg'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vbmi2'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrc'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fzrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='la57'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='taa-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='tsx-ldtrk'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='SierraForest-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ifma'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-ne-convert'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx-vnni-int8'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bhi-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='bus-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cmpccxadd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fbsdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='fsrs'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ibrs-all'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='intel-psfd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ipred-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='lam'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mcdt-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pbrsb-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='psdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rfds-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rrsba-ctrl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='sbdr-ssdp-no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='serialize'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vaes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='vpclmulqdq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Client-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='hle'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='rtm'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Skylake-Server-v5'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512bw'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512cd'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512dq'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512f'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='avx512vl'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='invpcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pcid'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='pku'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='mpx'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v2'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v3'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='core-capability'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='split-lock-detect'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='Snowridge-v4'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='cldemote'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='erms'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='gfni'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdir64b'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='movdiri'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='xsaves'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='athlon-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='core2duo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='coreduo-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='n270-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='ss'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <blockers model='phenom-v1'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnow'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <feature name='3dnowext'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </blockers>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </mode>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </cpu>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <memoryBacking supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <enum name='sourceType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>anonymous</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <value>memfd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </memoryBacking>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <disk supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='diskDevice'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>disk</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cdrom</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>floppy</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>lun</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>fdc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>sata</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <graphics supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vnc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egl-headless</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </graphics>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <video supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='modelType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vga</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>cirrus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>none</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>bochs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ramfb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </video>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hostdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='mode'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>subsystem</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='startupPolicy'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>mandatory</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>requisite</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>optional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='subsysType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pci</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>scsi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='capsType'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='pciBackend'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hostdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <rng supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtio-non-transitional</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>random</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>egd</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </rng>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <filesystem supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='driverType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>path</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>handle</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>virtiofs</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </filesystem>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tpm supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-tis</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tpm-crb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emulator</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>external</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendVersion'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>2.0</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </tpm>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <redirdev supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='bus'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>usb</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </redirdev>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <channel supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </channel>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <crypto supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendModel'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>builtin</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </crypto>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <interface supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='backendType'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>default</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>passt</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </interface>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <panic supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='model'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>isa</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>hyperv</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </panic>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <console supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='type'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>null</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vc</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pty</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dev</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>file</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>pipe</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stdio</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>udp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tcp</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>unix</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>qemu-vdagent</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>dbus</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </console>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </devices>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   <features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <gic supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <vmcoreinfo supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <genid supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backingStoreInput supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <backup supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <async-teardown supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <s390-pv supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <ps2 supported='yes'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <tdx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sev supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <sgx supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <hyperv supported='yes'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <enum name='features'>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>relaxed</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vapic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>spinlocks</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vpindex</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>runtime</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>synic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>stimer</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reset</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>vendor_id</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>frequencies</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>reenlightenment</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>tlbflush</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>ipi</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>avic</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>emsr_bitmap</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <value>xmm_input</value>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </enum>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       <defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <spinlocks>4095</spinlocks>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <stimer_direct>on</stimer_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_direct>off</tlbflush_direct>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <tlbflush_extended>off</tlbflush_extended>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:       </defaults>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     </hyperv>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:     <launchSecurity supported='no'/>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:   </features>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: </domainCapabilities>
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.684 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.685 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.688 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.688 281292 INFO nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Secure Boot support detected
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.690 281292 INFO nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.690 281292 INFO nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.703 281292 DEBUG nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.729 281292 INFO nova.virt.node [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Determined node identity 41976f9f-3656-482f-8ad0-c81e454a3952 from /var/lib/nova/compute_id
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.744 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Verified node 41976f9f-3656-482f-8ad0-c81e454a3952 matches my host np0005625204.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.788 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.793 281292 DEBUG nova.virt.libvirt.vif [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=<?>,hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=<?>,node='np0005625204.localdomain',numa_topology=None,old_flavor=<?>,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata=<?>,tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T08:23:36Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.793 281292 DEBUG nova.network.os_vif_util [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.794 281292 DEBUG nova.network.os_vif_util [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.795 281292 DEBUG os_vif [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.876 281292 DEBUG ovsdbapp.backend.ovs_idl [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.876 281292 DEBUG ovsdbapp.backend.ovs_idl [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.877 281292 DEBUG ovsdbapp.backend.ovs_idl [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.877 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.878 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [POLLOUT] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.878 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.879 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.881 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.884 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.907 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.908 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.908 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:35:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:43.910 281292 INFO oslo.privsep.daemon [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmplig99m30/privsep.sock']
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.705 281292 INFO oslo.privsep.daemon [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.600 281538 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.606 281538 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.609 281538 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.609 281538 INFO oslo.privsep.daemon [-] privsep daemon running as pid 281538
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.974 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.975 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.975 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.976 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.977 281292 INFO os_vif [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.978 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.982 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Current state is 1, state in DB is 1. _init_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:1304
Feb 20 09:35:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:44.983 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 20 09:35:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.078 281292 DEBUG oslo_concurrency.lockutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.080 281292 DEBUG oslo_concurrency.lockutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.080 281292 DEBUG oslo_concurrency.lockutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.080 281292 DEBUG nova.compute.resource_tracker [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.081 281292 DEBUG oslo_concurrency.processutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:45 np0005625204.localdomain podman[281542]: 2026-02-20 09:35:45.149858459 +0000 UTC m=+0.087768291 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:35:45 np0005625204.localdomain podman[281542]: 2026-02-20 09:35:45.162123007 +0000 UTC m=+0.100032819 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:35:45 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.572 281292 DEBUG oslo_concurrency.processutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.648 281292 DEBUG nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.648 281292 DEBUG nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.850 281292 WARNING nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.851 281292 DEBUG nova.compute.resource_tracker [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12199MB free_disk=41.83720779418945GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.852 281292 DEBUG oslo_concurrency.lockutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:35:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:45.852 281292 DEBUG oslo_concurrency.lockutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.012 281292 DEBUG nova.compute.resource_tracker [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.013 281292 DEBUG nova.compute.resource_tracker [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.013 281292 DEBUG nova.compute.resource_tracker [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:35:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16668 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A177680000000001030307) 
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.072 281292 DEBUG nova.scheduler.client.report [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.096 281292 DEBUG nova.scheduler.client.report [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.097 281292 DEBUG nova.compute.provider_tree [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.115 281292 DEBUG nova.scheduler.client.report [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.141 281292 DEBUG nova.scheduler.client.report [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.188 281292 DEBUG oslo_concurrency.processutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.563 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.664 281292 DEBUG oslo_concurrency.processutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.670 281292 DEBUG nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.670 281292 INFO nova.virt.libvirt.host [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] kernel doesn't support AMD SEV
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.672 281292 DEBUG nova.compute.provider_tree [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.672 281292 DEBUG nova.virt.libvirt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.692 281292 DEBUG nova.scheduler.client.report [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.712 281292 DEBUG nova.compute.resource_tracker [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.713 281292 DEBUG oslo_concurrency.lockutils [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.861s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.713 281292 DEBUG nova.service [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.737 281292 DEBUG nova.service [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 20 09:35:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:46.738 281292 DEBUG nova.servicegroup.drivers.db [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] DB_Driver: join new ServiceGroup member np0005625204.localdomain to the compute group, service = <Service: host=np0005625204.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 20 09:35:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:35:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:35:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:35:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149681 "" "Go-http-client/1.1"
Feb 20 09:35:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:35:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16791 "" "Go-http-client/1.1"
Feb 20 09:35:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:48.882 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:49 np0005625204.localdomain sshd[281243]: error: kex_exchange_identification: read: Connection timed out
Feb 20 09:35:49 np0005625204.localdomain sshd[281243]: banner exchange: Connection from 183.220.237.28 port 34930: Connection timed out
Feb 20 09:35:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:51.566 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:35:53 np0005625204.localdomain podman[281605]: 2026-02-20 09:35:53.140079846 +0000 UTC m=+0.076800501 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:35:53 np0005625204.localdomain podman[281605]: 2026-02-20 09:35:53.149371763 +0000 UTC m=+0.086092468 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:35:53 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:35:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:53.915 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:35:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:35:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:35:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:35:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:35:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:35:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:56.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:35:58.964 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:35:59 np0005625204.localdomain rsyslogd[758]: imjournal: 8932 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 20 09:36:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51779 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1B0840000000001030307) 
Feb 20 09:36:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:36:00 np0005625204.localdomain systemd[1]: tmp-crun.ff18gd.mount: Deactivated successfully.
Feb 20 09:36:00 np0005625204.localdomain podman[281627]: 2026-02-20 09:36:00.873072112 +0000 UTC m=+0.089030029 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:36:00 np0005625204.localdomain podman[281627]: 2026-02-20 09:36:00.88404006 +0000 UTC m=+0.099997967 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:36:00 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:36:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:01.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51780 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1B4A90000000001030307) 
Feb 20 09:36:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:36:02 np0005625204.localdomain podman[281649]: 2026-02-20 09:36:02.128419413 +0000 UTC m=+0.071277791 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:36:02 np0005625204.localdomain podman[281649]: 2026-02-20 09:36:02.146264255 +0000 UTC m=+0.089122633 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, build-date=2026-02-05T04:57:10Z, 
com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:36:02 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:36:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16669 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1B7680000000001030307) 
Feb 20 09:36:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:02.573 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:36:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:02.577 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:36:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:02.579 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:02.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51781 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1BCA90000000001030307) 
Feb 20 09:36:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:04.006 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:36:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:36:04 np0005625204.localdomain systemd[1]: tmp-crun.5iB1O5.mount: Deactivated successfully.
Feb 20 09:36:04 np0005625204.localdomain podman[281669]: 2026-02-20 09:36:04.150472651 +0000 UTC m=+0.089743711 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:36:04 np0005625204.localdomain systemd[1]: tmp-crun.i7teXq.mount: Deactivated successfully.
Feb 20 09:36:04 np0005625204.localdomain podman[281670]: 2026-02-20 09:36:04.195614015 +0000 UTC m=+0.130292753 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:36:04 np0005625204.localdomain podman[281669]: 2026-02-20 09:36:04.204480999 +0000 UTC m=+0.143752009 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 20 09:36:04 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:36:04 np0005625204.localdomain podman[281670]: 2026-02-20 09:36:04.226281051 +0000 UTC m=+0.160959829 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 20 09:36:04 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:36:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14683 DF PROTO=TCP SPT=59520 DPT=9102 SEQ=762740849 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1BF690000000001030307) 
Feb 20 09:36:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:05.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:05.764 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 20 09:36:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:05.765 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:05.765 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:05.766 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:05.822 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.057s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:05.999 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:05.999 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:06.001 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:06.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51782 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1CC690000000001030307) 
Feb 20 09:36:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:09.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:11.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:14.111 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:36:16 np0005625204.localdomain podman[281712]: 2026-02-20 09:36:16.151680952 +0000 UTC m=+0.084478038 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:36:16 np0005625204.localdomain podman[281712]: 2026-02-20 09:36:16.163963582 +0000 UTC m=+0.096760658 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:36:16 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:36:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51783 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A1ED680000000001030307) 
Feb 20 09:36:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:16.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:36:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:36:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:17.702 281292 DEBUG nova.compute.manager [None req-d8ad929c-a450-4219-9ce8-2dd486b6c045 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:17.713 281292 INFO nova.compute.manager [None req-d8ad929c-a450-4219-9ce8-2dd486b6c045 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Retrieving diagnostics
Feb 20 09:36:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:36:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149681 "" "Go-http-client/1.1"
Feb 20 09:36:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:36:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16792 "" "Go-http-client/1.1"
Feb 20 09:36:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:19.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:21.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:22 np0005625204.localdomain sudo[281734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:36:22 np0005625204.localdomain sudo[281734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:36:22 np0005625204.localdomain sudo[281734]: pam_unix(sudo:session): session closed for user root
Feb 20 09:36:22 np0005625204.localdomain sudo[281752]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:36:22 np0005625204.localdomain sudo[281752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:36:22 np0005625204.localdomain sudo[281752]: pam_unix(sudo:session): session closed for user root
Feb 20 09:36:23 np0005625204.localdomain sudo[281801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:36:23 np0005625204.localdomain sudo[281801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:36:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:36:23 np0005625204.localdomain sudo[281801]: pam_unix(sudo:session): session closed for user root
Feb 20 09:36:23 np0005625204.localdomain podman[281819]: 2026-02-20 09:36:23.984937985 +0000 UTC m=+0.079600878 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:36:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:23.996 281292 DEBUG oslo_concurrency.lockutils [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:23.996 281292 DEBUG oslo_concurrency.lockutils [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:23 np0005625204.localdomain podman[281819]: 2026-02-20 09:36:23.997383029 +0000 UTC m=+0.092045972 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:36:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:23.997 281292 DEBUG nova.compute.manager [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:24.002 281292 DEBUG nova.compute.manager [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338
Feb 20 09:36:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:24.006 281292 DEBUG nova.objects.instance [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'flavor' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:24 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:36:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:24.047 281292 DEBUG nova.virt.libvirt.driver [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Feb 20 09:36:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:24.174 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:24 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 20 09:36:25 np0005625204.localdomain sshd[281842]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:36:26 np0005625204.localdomain sshd[281842]: Received disconnect from 54.36.99.29 port 47826:11: Bye Bye [preauth]
Feb 20 09:36:26 np0005625204.localdomain sshd[281842]: Disconnected from authenticating user root 54.36.99.29 port 47826 [preauth]
Feb 20 09:36:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:36:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:36:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:36:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:36:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:36:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:26 np0005625204.localdomain kernel: device tape7aa8e2a-27 left promiscuous mode
Feb 20 09:36:26 np0005625204.localdomain NetworkManager[5988]: <info>  [1771580186.8682] device (tape7aa8e2a-27): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00048|binding|INFO|Releasing lport e7aa8e2a-27a6-452b-906c-21cea166b882 from this chassis (sb_readonly=0)
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00049|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 down in Southbound
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.877 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00050|binding|INFO|Removing iface tape7aa8e2a-27 ovn-installed in OVS
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.880 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:26.888 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:ed:d2 192.168.0.140'], port_security=['fa:16:3e:b0:ed:d2 192.168.0.140'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.140/24', 'neutron:device_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'neutron:device_owner': 'compute:nova', 'neutron:host_id': 'np0005625204.localdomain', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de929a91-c460-4398-96e0-15a80685a485', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '91bce661d685472eb3e7cacab17bf52a', 'neutron:revision_number': '7', 'neutron:security_group_ids': '571bc6f6-22b1-4aad-9b70-3481475089c6 dd806cfc-5243-4295-bd9f-cfd9f58a9f1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee1d7cd7-5f4f-4b75-a06c-f37c0ef97c77, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e7aa8e2a-27a6-452b-906c-21cea166b882) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:36:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:26.889 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e7aa8e2a-27a6-452b-906c-21cea166b882 in datapath de929a91-c460-4398-96e0-15a80685a485 unbound from our chassis
Feb 20 09:36:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:26.891 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network de929a91-c460-4398-96e0-15a80685a485, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:36:26 np0005625204.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Feb 20 09:36:26 np0005625204.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 3min 55.907s CPU time.
Feb 20 09:36:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:26.896 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4877190f-adc9-49ce-82ab-04a36db91dc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:26.897 162652 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-de929a91-c460-4398-96e0-15a80685a485 namespace which is not needed anymore
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00051|ovn_bfd|INFO|Disabled BFD on interface ovn-2df8cc-0
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00052|ovn_bfd|INFO|Disabled BFD on interface ovn-0c414b-0
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00053|ovn_bfd|INFO|Disabled BFD on interface ovn-2275c3-0
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.899 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:26 np0005625204.localdomain systemd-machined[85698]: Machine qemu-1-instance-00000002 terminated.
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00054|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:26Z|00055|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:36:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:26.944 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:27 np0005625204.localdomain systemd[1]: tmp-crun.D5MwRK.mount: Deactivated successfully.
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.088 281292 DEBUG nova.compute.manager [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-unplugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.089 281292 DEBUG oslo_concurrency.lockutils [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.089 281292 DEBUG oslo_concurrency.lockutils [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.090 281292 DEBUG oslo_concurrency.lockutils [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.091 281292 DEBUG nova.compute.manager [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-unplugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.091 281292 WARNING nova.compute.manager [req-58f91363-1952-4518-9dcc-329089ecba24 req-c07e1e4b-bffb-4355-a1a7-91bedd272c77 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-unplugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state active and task_state powering-off.
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.110 281292 INFO nova.virt.libvirt.driver [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance shutdown successfully after 3 seconds.
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.117 281292 INFO nova.virt.libvirt.driver [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance destroyed successfully.
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.118 281292 DEBUG nova.objects.instance [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'numa_topology' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.140 281292 DEBUG nova.compute.manager [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:27.230 281292 DEBUG oslo_concurrency.lockutils [None req-f7d1d0ef-97e6-4a49-9107-bbeabe78b1ff 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.121 281292 DEBUG nova.compute.manager [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.122 281292 DEBUG oslo_concurrency.lockutils [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.122 281292 DEBUG oslo_concurrency.lockutils [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.122 281292 DEBUG oslo_concurrency.lockutils [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.123 281292 DEBUG nova.compute.manager [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.123 281292 WARNING nova.compute.manager [req-0c3cc10b-b945-4f16-ba1a-b9dcb7f33dc9 req-8c6acf4c-6392-445f-8389-ff2d13491da2 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state stopped and task_state None.
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.176 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.527 281292 DEBUG nova.compute.manager [None req-021827a0-cd14-4b02-9789-58184dab09a3 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server [None req-021827a0-cd14-4b02-9789-58184dab09a3 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     raise self.value
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     raise self.value
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 20 09:36:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:29.563 281292 ERROR oslo_messaging.rpc.server 
Feb 20 09:36:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24909 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A225B40000000001030307) 
Feb 20 09:36:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:36:31 np0005625204.localdomain podman[281894]: 2026-02-20 09:36:31.139214178 +0000 UTC m=+0.077775541 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:36:31 np0005625204.localdomain podman[281894]: 2026-02-20 09:36:31.153042035 +0000 UTC m=+0.091603418 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:36:31 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:36:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:31.613 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24910 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A229A80000000001030307) 
Feb 20 09:36:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51784 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A22D680000000001030307) 
Feb 20 09:36:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:36:33 np0005625204.localdomain podman[281917]: 2026-02-20 09:36:33.139536895 +0000 UTC m=+0.079813354 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, release=1770267347, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:36:33 np0005625204.localdomain podman[281917]: 2026-02-20 09:36:33.151060241 +0000 UTC m=+0.091336690 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64)
Feb 20 09:36:33 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:36:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24911 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A231A90000000001030307) 
Feb 20 09:36:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:34.211 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16670 DF PROTO=TCP SPT=50332 DPT=9102 SEQ=4108763558 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A235680000000001030307) 
Feb 20 09:36:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:36:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:36:35 np0005625204.localdomain podman[281937]: 2026-02-20 09:36:35.142591856 +0000 UTC m=+0.074097077 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true)
Feb 20 09:36:35 np0005625204.localdomain podman[281938]: 2026-02-20 09:36:35.203077124 +0000 UTC m=+0.130917643 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 20 09:36:35 np0005625204.localdomain podman[281938]: 2026-02-20 09:36:35.212127573 +0000 UTC m=+0.139968062 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 20 09:36:35 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:36:35 np0005625204.localdomain podman[281937]: 2026-02-20 09:36:35.228027884 +0000 UTC m=+0.159533115 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:36:35 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:36:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:36.615 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:37 np0005625204.localdomain podman[281871]: 2026-02-20 09:36:37.072146909 +0000 UTC m=+10.063554309 container stop 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com)
Feb 20 09:36:37 np0005625204.localdomain systemd[1]: libpod-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba.scope: Deactivated successfully.
Feb 20 09:36:37 np0005625204.localdomain podman[281871]: 2026-02-20 09:36:37.107929714 +0000 UTC m=+10.099337114 container died 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 20 09:36:37 np0005625204.localdomain systemd[1]: tmp-crun.7Ya6an.mount: Deactivated successfully.
Feb 20 09:36:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba-userdata-shm.mount: Deactivated successfully.
Feb 20 09:36:37 np0005625204.localdomain podman[281871]: 2026-02-20 09:36:37.226184825 +0000 UTC m=+10.217592175 container cleanup 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z)
Feb 20 09:36:37 np0005625204.localdomain podman[281981]: 2026-02-20 09:36:37.243104017 +0000 UTC m=+0.155573354 container cleanup 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 20 09:36:37 np0005625204.localdomain systemd[1]: libpod-conmon-57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba.scope: Deactivated successfully.
Feb 20 09:36:37 np0005625204.localdomain podman[281998]: 2026-02-20 09:36:37.314533312 +0000 UTC m=+0.069792015 container remove 57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.319 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6782c0-8516-43e4-a1b7-501455cb25e4]: (4, ('Fri Feb 20 09:36:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485 (57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba)\n57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba\nFri Feb 20 09:36:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485 (57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba)\n57dfc6ee18f9d356a004ff36db1acf34863c149c744b8c7aa64c5a771fa0d1ba\n', 'time="2026-02-20T09:36:37Z" level=warning msg="StopSignal SIGTERM failed to stop container neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485 in 10 seconds, resorting to SIGKILL"\n', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.320 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7fef79dc-5d5b-42ea-961b-bac4ffd2cb6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.321 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde929a91-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:37 np0005625204.localdomain kernel: device tapde929a91-c0 left promiscuous mode
Feb 20 09:36:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:37.323 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:37.333 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.335 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ff17f338-5811-4353-8c51-dbc18da58c3c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.352 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1567e5cd-e0ce-4d69-8a05-0b8b4d446fcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.353 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a19b93-1b75-40c4-b29d-31b53aace2ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.365 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5e93af60-4cec-47fe-bcc4-c4ae05ad6ab2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 637497, 'reachable_time': 16205, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282021, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.377 163070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-de929a91-c460-4398-96e0-15a80685a485 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 20 09:36:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:37.378 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2bd3bdc-ca23-4af5-8518-79225000d70e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24912 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A241690000000001030307) 
Feb 20 09:36:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-7ea5b54d1da71d972d7e8dd243987640d185da35de896817d599cfae85808380-merged.mount: Deactivated successfully.
Feb 20 09:36:38 np0005625204.localdomain systemd[1]: run-netns-ovnmeta\x2dde929a91\x2dc460\x2d4398\x2d96e0\x2d15a80685a485.mount: Deactivated successfully.
Feb 20 09:36:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:39.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:41.618 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.110 281292 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771580187.1088905, f9924957-6cff-426e-9f03-c739820f4ff3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.111 281292 INFO nova.compute.manager [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] VM Stopped (Lifecycle Event)
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.140 281292 DEBUG nova.compute.manager [None req-11efd309-e505-42a7-ae98-feffabb72bc2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.144 281292 DEBUG nova.compute.manager [None req-11efd309-e505-42a7-ae98-feffabb72bc2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: None, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.774 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.776 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:36:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:42.777 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:36:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:43.817 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:36:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:43.818 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:36:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:43.818 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:36:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:43.819 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.272 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.308 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.308 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.309 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.310 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.310 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.311 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.311 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.312 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.312 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.313 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.337 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.337 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.338 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.338 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.339 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.797 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.874 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:36:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:44.875 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.075 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.077 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12615MB free_disk=41.8370475769043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.077 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.078 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.179 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.180 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.180 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.244 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.754 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.761 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.777 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.793 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:36:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:45.794 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24913 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A261680000000001030307) 
Feb 20 09:36:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:46.641 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:36:47 np0005625204.localdomain podman[282067]: 2026-02-20 09:36:47.14028233 +0000 UTC m=+0.078580477 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:36:47 np0005625204.localdomain podman[282067]: 2026-02-20 09:36:47.173135474 +0000 UTC m=+0.111433651 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:36:47 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:36:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:36:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:36:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:36:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 147347 "" "Go-http-client/1.1"
Feb 20 09:36:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:36:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16317 "" "Go-http-client/1.1"
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.063 281292 DEBUG nova.compute.manager [None req-841530b3-780a-4b09-b7a8-bfdd0314cf97 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server [None req-841530b3-780a-4b09-b7a8-bfdd0314cf97 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Exception during message handling: nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     raise self.value
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 214, in decorated_function
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     compute_utils.add_instance_fault_from_exc(context,
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     self.force_reraise()
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     raise self.value
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 6739, in get_instance_diagnostics
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server     raise exception.InstanceInvalidState(
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server nova.exception.InstanceInvalidState: Instance f9924957-6cff-426e-9f03-c739820f4ff3 in power state shutdown. Cannot get_diagnostics while the instance is in this state.
Feb 20 09:36:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:48.081 281292 ERROR oslo_messaging.rpc.server 
Feb 20 09:36:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:49.339 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:50 np0005625204.localdomain sshd[282084]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:36:50 np0005625204.localdomain sshd[282084]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:36:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:51.691 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:36:54 np0005625204.localdomain podman[282086]: 2026-02-20 09:36:54.147840695 +0000 UTC m=+0.083313074 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:36:54 np0005625204.localdomain podman[282086]: 2026-02-20 09:36:54.156080199 +0000 UTC m=+0.091552578 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:36:54 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:36:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:54.385 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:55.931 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'flavor' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:55.950 281292 DEBUG oslo_concurrency.lockutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:36:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:55.951 281292 DEBUG oslo_concurrency.lockutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:36:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:55.951 281292 DEBUG nova.network.neutron [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 20 09:36:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:55.952 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:36:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:36:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:36:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:36:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:36:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:36:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:56.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:56Z|00056|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 20 09:36:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:57.952 281292 DEBUG nova.network.neutron [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:36:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:57.970 281292 DEBUG oslo_concurrency.lockutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.004 281292 INFO nova.virt.libvirt.driver [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance destroyed successfully.
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.005 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'numa_topology' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.020 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'resources' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.032 281292 DEBUG nova.virt.libvirt.vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T09:36:27Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.032 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.034 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.034 281292 DEBUG os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.038 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.039 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7aa8e2a-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.043 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.047 281292 INFO os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.050 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.051 281292 INFO nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] UEFI support detected
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.060 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Start _get_guest_xml network_info=[{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.eph0': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=43eca6d8-1b99-4300-a417-76015fcc59e1,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_options': None, 'image_id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}], 'ephemerals': [{'size': 1, 'device_name': '/dev/vdb', 'device_type': 'disk', 'encryption_format': None, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'encrypted': False, 'guest_format': None, 'encryption_options': None}], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.065 281292 WARNING nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.067 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.068 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.071 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.071 281292 DEBUG nova.virt.libvirt.host [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.072 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.073 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-20T08:22:31Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=1,extra_specs={},flavorid='739ef37c-e459-414b-b65a-355581d54c7c',id=2,is_public=True,memory_mb=512,name='m1.small',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=43eca6d8-1b99-4300-a417-76015fcc59e1,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.074 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.074 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.075 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.075 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.076 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.076 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.076 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.077 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.077 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.078 281292 DEBUG nova.virt.hardware [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.078 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'vcpu_model' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.098 281292 DEBUG nova.privsep.utils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.098 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.582 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:36:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:58.584 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.041 281292 DEBUG oslo_concurrency.processutils [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.043 281292 DEBUG nova.virt.libvirt.vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openstack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T09:36:27Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.044 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.045 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.048 281292 DEBUG nova.objects.instance [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Lazy-loading 'pci_devices' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.062 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] End _get_guest_xml xml=<domain type="kvm">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <uuid>f9924957-6cff-426e-9f03-c739820f4ff3</uuid>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <name>instance-00000002</name>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <memory>524288</memory>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <vcpu>1</vcpu>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <metadata>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:name>test</nova:name>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:creationTime>2026-02-20 09:36:58</nova:creationTime>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:flavor name="m1.small">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:memory>512</nova:memory>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:disk>1</nova:disk>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:swap>0</nova:swap>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:ephemeral>1</nova:ephemeral>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:vcpus>1</nova:vcpus>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </nova:flavor>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:owner>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:user uuid="141ec720081546bb92f7e9338deb8445">admin</nova:user>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:project uuid="91bce661d685472eb3e7cacab17bf52a">admin</nova:project>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </nova:owner>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:root type="image" uuid="43eca6d8-1b99-4300-a417-76015fcc59e1"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <nova:ports>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <nova:port uuid="e7aa8e2a-27a6-452b-906c-21cea166b882">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:           <nova:ip type="fixed" address="192.168.0.140" ipVersion="4"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         </nova:port>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </nova:ports>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </nova:instance>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </metadata>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <sysinfo type="smbios">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <system>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <entry name="manufacturer">RDO</entry>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <entry name="product">OpenStack Compute</entry>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <entry name="serial">f9924957-6cff-426e-9f03-c739820f4ff3</entry>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <entry name="uuid">f9924957-6cff-426e-9f03-c739820f4ff3</entry>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <entry name="family">Virtual Machine</entry>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </system>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </sysinfo>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <os>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <type arch="x86_64" machine="pc-q35-rhel9.0.0">hvm</type>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <boot dev="hd"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <smbios mode="sysinfo"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </os>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <features>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <acpi/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <apic/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <vmcoreinfo/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </features>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <clock offset="utc">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <timer name="pit" tickpolicy="delay"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <timer name="hpet" present="no"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </clock>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <cpu mode="host-model" match="exact">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <topology sockets="1" cores="1" threads="1"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </cpu>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   <devices>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <disk type="network" device="disk">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <driver type="raw" cache="none"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <source protocol="rbd" name="vms/f9924957-6cff-426e-9f03-c739820f4ff3_disk">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.103" port="6789"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.105" port="6789"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.104" port="6789"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </source>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <auth username="openstack">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <secret type="ceph" uuid="a8557ee9-b55d-5519-942c-cf8f6172f1d8"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </auth>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <target dev="vda" bus="virtio"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <disk type="network" device="disk">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <driver type="raw" cache="none"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <source protocol="rbd" name="vms/f9924957-6cff-426e-9f03-c739820f4ff3_disk.eph0">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.103" port="6789"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.105" port="6789"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.104" port="6789"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </source>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <auth username="openstack">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:         <secret type="ceph" uuid="a8557ee9-b55d-5519-942c-cf8f6172f1d8"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       </auth>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <target dev="vdb" bus="virtio"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <interface type="ethernet">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <mac address="fa:16:3e:b0:ed:d2"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <model type="virtio"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <driver name="vhost" rx_queue_size="512"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <mtu size="1292"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <target dev="tape7aa8e2a-27"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </interface>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <serial type="pty">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <log file="/var/lib/nova/instances/f9924957-6cff-426e-9f03-c739820f4ff3/console.log" append="off"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </serial>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <video>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <model type="virtio"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </video>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <input type="tablet" bus="usb"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <input type="keyboard" bus="usb"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <rng model="virtio">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <backend model="random">/dev/urandom</backend>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </rng>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <controller type="usb" index="0"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     <memballoon model="virtio">
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:       <stats period="10"/>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:     </memballoon>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:   </devices>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: </domain>
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.065 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.065 281292 DEBUG nova.virt.libvirt.driver [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.066 281292 DEBUG nova.virt.libvirt.vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T08:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='test',display_name='test',ec2_ids=<?>,ephemeral_gb=1,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='np0005625204.localdomain',hostname='test',id=2,image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T08:23:36Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=512,metadata={},migration_context=<?>,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='91bce661d685472eb3e7cacab17bf52a',ramdisk_id='',reservation_id='r-fmzjk66w',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member,admin',image_base_image_ref='43eca6d8-1b99-4300-a417-76015fcc59e1',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='pc-q35-rhel9.0.0',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_owner_specified.openstack.md5='',image_owner_specified.openstack.object='images/cirros',image_owner_specified.openst
ack.sha256='',owner_project_name='admin',owner_user_name='admin'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T09:36:27Z,user_data=None,user_id='141ec720081546bb92f7e9338deb8445',uuid=f9924957-6cff-426e-9f03-c739820f4ff3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.067 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converting VIF {"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.068 281292 DEBUG nova.network.os_vif_util [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.068 281292 DEBUG os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.069 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.070 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.070 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.073 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.074 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7aa8e2a-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.075 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape7aa8e2a-27, col_values=(('external_ids', {'iface-id': 'e7aa8e2a-27a6-452b-906c-21cea166b882', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:ed:d2', 'vm-uuid': 'f9924957-6cff-426e-9f03-c739820f4ff3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.106 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.112 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.113 281292 INFO os_vif [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:ed:d2,bridge_name='br-int',has_traffic_filtering=True,id=e7aa8e2a-27a6-452b-906c-21cea166b882,network=Network(de929a91-c460-4398-96e0-15a80685a485),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape7aa8e2a-27')
Feb 20 09:36:59 np0005625204.localdomain systemd[1]: Started libvirt secret daemon.
Feb 20 09:36:59 np0005625204.localdomain kernel: device tape7aa8e2a-27 entered promiscuous mode
Feb 20 09:36:59 np0005625204.localdomain NetworkManager[5988]: <info>  [1771580219.2357] manager: (tape7aa8e2a-27): new Tun device (/org/freedesktop/NetworkManager/Devices/15)
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00057|binding|INFO|Claiming lport e7aa8e2a-27a6-452b-906c-21cea166b882 for this chassis.
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00058|binding|INFO|e7aa8e2a-27a6-452b-906c-21cea166b882: Claiming fa:16:3e:b0:ed:d2 192.168.0.140
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.242 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain systemd-udevd[282184]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00059|ovn_bfd|INFO|Enabled BFD on interface ovn-2df8cc-0
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00060|ovn_bfd|INFO|Enabled BFD on interface ovn-0c414b-0
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00061|ovn_bfd|INFO|Enabled BFD on interface ovn-2275c3-0
Feb 20 09:36:59 np0005625204.localdomain NetworkManager[5988]: <info>  [1771580219.2613] device (tape7aa8e2a-27): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 09:36:59 np0005625204.localdomain NetworkManager[5988]: <info>  [1771580219.2628] device (tape7aa8e2a-27): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.257 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:ed:d2 192.168.0.140'], port_security=['fa:16:3e:b0:ed:d2 192.168.0.140'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.0.140/24', 'neutron:device_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-de929a91-c460-4398-96e0-15a80685a485', 'neutron:port_capabilities': '', 'neutron:port_fip': '192.168.122.20', 'neutron:port_name': '', 'neutron:project_id': '91bce661d685472eb3e7cacab17bf52a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '571bc6f6-22b1-4aad-9b70-3481475089c6 dd806cfc-5243-4295-bd9f-cfd9f58a9f1e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee1d7cd7-5f4f-4b75-a06c-f37c0ef97c77, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e7aa8e2a-27a6-452b-906c-21cea166b882) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.260 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e7aa8e2a-27a6-452b-906c-21cea166b882 in datapath de929a91-c460-4398-96e0-15a80685a485 bound to our chassis
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.262 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network de929a91-c460-4398-96e0-15a80685a485
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.272 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[302c4fce-e22d-4f11-a468-70933805a63c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.274 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapde929a91-c1 in ovnmeta-de929a91-c460-4398-96e0-15a80685a485 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.277 162782 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapde929a91-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.277 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d95a06ea-e086-4143-affd-4f1b67cb81de]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.278 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fb255f4c-c6f4-4210-bbca-bc94e43ffad3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.291 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[34d42aa8-f5da-4b86-902d-bd24cd518e6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.307 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.312 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7b1737f3-7107-49f6-8a3d-0e5fa543f8f3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain systemd-machined[85698]: New machine qemu-2-instance-00000002.
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00062|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 ovn-installed in OVS
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00063|binding|INFO|Setting lport e7aa8e2a-27a6-452b-906c-21cea166b882 up in Southbound
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.321 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.322 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.347 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[9f6fb9f0-ad58-4a7c-bc25-2a7057745176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain NetworkManager[5988]: <info>  [1771580219.3544] manager: (tapde929a91-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/16)
Feb 20 09:36:59 np0005625204.localdomain systemd-udevd[282186]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.353 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c97b44ec-9548-4ed3-8a36-db4ffb2ad3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.380 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[c0557cf3-56fb-49fa-92c7-743cc9a4bd81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.383 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac74dfc-540f-4f62-8425-d467f194d280]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c1: link becomes ready
Feb 20 09:36:59 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapde929a91-c0: link becomes ready
Feb 20 09:36:59 np0005625204.localdomain NetworkManager[5988]: <info>  [1771580219.3988] device (tapde929a91-c0): carrier: link connected
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.404 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[36b9c4ce-9a72-4628-8238-b5dbcf7c9d23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.420 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[921bc494-a999-43ed-b33c-41aa47474ebb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde929a91-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:09:c2:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077856, 'reachable_time': 22503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282222, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.434 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5f107aa7-dc2e-46c1-a8fb-3ff869717ed0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:c288'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1077856, 'tstamp': 1077856}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 282223, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.450 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c83e9b92-6474-4604-a752-467a8221d44c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapde929a91-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:09:c2:88'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077856, 'reachable_time': 22503, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 282224, 'error': None, 'target': 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.474 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[96c2b696-0977-4f05-9b38-904cc808f002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.530 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[51cfdfdd-717c-402f-bbc5-8f31f57b34d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.532 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapde929a91-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.533 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.534 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapde929a91-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.536 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain kernel: device tapde929a91-c0 entered promiscuous mode
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.541 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapde929a91-c0, col_values=(('external_ids', {'iface-id': '3323e11d-576a-42f3-bcca-e10425268e61'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.542 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:36:59Z|00064|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.553 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.555 162652 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.556 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[54c62514-27bf-4273-a5bb-8712ece7b139]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.558 162652 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: global
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     log         /dev/log local0 debug
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     log-tag     haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     user        root
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     group       root
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     maxconn     1024
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     pidfile     /var/lib/neutron/external/pids/de929a91-c460-4398-96e0-15a80685a485.pid.haproxy
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     daemon
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: defaults
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     log global
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     mode http
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     option httplog
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     option dontlognull
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     option http-server-close
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     option forwardfor
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     retries                 3
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout http-request    30s
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout connect         30s
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout client          32s
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout server          32s
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout http-keep-alive 30s
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: listen listener
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     bind 169.254.169.254:80
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     server metadata /var/lib/neutron/metadata_proxy
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:     http-request add-header X-OVN-Network-ID de929a91-c460-4398-96e0-15a80685a485
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 20 09:36:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:36:59.560 162652 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-de929a91-c460-4398-96e0-15a80685a485', 'env', 'PROCESS_TAG=haproxy-de929a91-c460-4398-96e0-15a80685a485', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/de929a91-c460-4398-96e0-15a80685a485.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.678 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event <LifecycleEvent: 1771580219.6785743, f9924957-6cff-426e-9f03-c739820f4ff3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.679 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] VM Resumed (Lifecycle Event)
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.683 281292 DEBUG nova.compute.manager [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.686 281292 INFO nova.virt.libvirt.driver [-] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Instance rebooted successfully.
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.687 281292 DEBUG nova.compute.manager [None req-f847f2fb-2e9b-4ce9-bfa0-ee4a5157191d 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.701 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.710 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.733 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] During sync_power_state the instance has a pending task (powering-on). Skip.
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.734 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event <LifecycleEvent: 1771580219.6804693, f9924957-6cff-426e-9f03-c739820f4ff3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.734 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] VM Started (Lifecycle Event)
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.773 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.778 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.975 281292 DEBUG nova.compute.manager [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:36:59 np0005625204.localdomain podman[282299]: 
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.977 281292 DEBUG oslo_concurrency.lockutils [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.978 281292 DEBUG oslo_concurrency.lockutils [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.979 281292 DEBUG oslo_concurrency.lockutils [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.979 281292 DEBUG nova.compute.manager [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:36:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:36:59.980 281292 WARNING nova.compute.manager [req-ed323233-0f0b-4974-ba5f-d1e19f488fe7 req-f72d312b-6409-4cdb-bc1a-6f8b59e45cfb d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state active and task_state None.
Feb 20 09:36:59 np0005625204.localdomain podman[282299]: 2026-02-20 09:36:59.986531948 +0000 UTC m=+0.091907548 container create 9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:37:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683.scope.
Feb 20 09:37:00 np0005625204.localdomain podman[282299]: 2026-02-20 09:36:59.940320011 +0000 UTC m=+0.045695621 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:37:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:37:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/daf1a97849e0b2cfedd7ec5e1af6ebe76cd044cbaef292fa2ede4328d64f268d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:37:00 np0005625204.localdomain podman[282299]: 2026-02-20 09:37:00.094691176 +0000 UTC m=+0.200066776 container init 9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:37:00 np0005625204.localdomain podman[282299]: 2026-02-20 09:37:00.104341564 +0000 UTC m=+0.209717184 container start 9903bc3cf3f19e861e096f5de614cbdd9eb528d04ba83f01013fe4247cd76683 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 09:37:00 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:37:00Z|00065|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:37:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:00.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:00 np0005625204.localdomain neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485[282313]: [NOTICE]   (282317) : New worker (282319) forked
Feb 20 09:37:00 np0005625204.localdomain neutron-haproxy-ovnmeta-de929a91-c460-4398-96e0-15a80685a485[282313]: [NOTICE]   (282317) : Loading success.
Feb 20 09:37:00 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:37:00Z|00066|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:37:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:00.201 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:00 np0005625204.localdomain systemd[1]: tmp-crun.1FGOnR.mount: Deactivated successfully.
Feb 20 09:37:00 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:37:00Z|00067|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:37:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:00.309 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42373 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A29AE50000000001030307) 
Feb 20 09:37:00 np0005625204.localdomain snmpd[68593]: IfIndex of an interface changed. Such interfaces will appear multiple times in IF-MIB.
Feb 20 09:37:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42374 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A29EE80000000001030307) 
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.139 281292 DEBUG nova.compute.manager [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG oslo_concurrency.lockutils [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG oslo_concurrency.lockutils [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG oslo_concurrency.lockutils [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "f9924957-6cff-426e-9f03-c739820f4ff3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.140 281292 DEBUG nova.compute.manager [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] No waiting events found dispatching network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.141 281292 WARNING nova.compute.manager [req-e0aa0257-cb0e-41aa-aaa8-d1db6974ff13 req-e855b575-0bad-4a5e-b1d0-9bb1b8cb931b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Received unexpected event network-vif-plugged-e7aa8e2a-27a6-452b-906c-21cea166b882 for instance with vm_state active and task_state None.
Feb 20 09:37:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:02.177 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:37:02 np0005625204.localdomain systemd[1]: tmp-crun.T9TTvY.mount: Deactivated successfully.
Feb 20 09:37:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24914 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2A1690000000001030307) 
Feb 20 09:37:02 np0005625204.localdomain podman[282328]: 2026-02-20 09:37:02.309901247 +0000 UTC m=+0.105689544 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:37:02 np0005625204.localdomain podman[282328]: 2026-02-20 09:37:02.34338379 +0000 UTC m=+0.139172117 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:37:02 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:37:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42375 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2A6E80000000001030307) 
Feb 20 09:37:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:37:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:04.106 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:04 np0005625204.localdomain podman[282350]: 2026-02-20 09:37:04.144986603 +0000 UTC m=+0.076756750 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Feb 20 09:37:04 np0005625204.localdomain podman[282350]: 2026-02-20 09:37:04.158963975 +0000 UTC m=+0.090734172 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, 
config_id=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:37:04 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:37:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51785 DF PROTO=TCP SPT=40598 DPT=9102 SEQ=3689421858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2AB690000000001030307) 
Feb 20 09:37:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:05.999 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:37:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:06.000 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:37:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:06.001 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:37:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:37:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:37:06 np0005625204.localdomain podman[282371]: 2026-02-20 09:37:06.136274322 +0000 UTC m=+0.072034215 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:37:06 np0005625204.localdomain podman[282371]: 2026-02-20 09:37:06.204727884 +0000 UTC m=+0.140487797 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:37:06 np0005625204.localdomain systemd[1]: tmp-crun.zi4tqc.mount: Deactivated successfully.
Feb 20 09:37:06 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:37:06 np0005625204.localdomain podman[282372]: 2026-02-20 09:37:06.21495627 +0000 UTC m=+0.148754742 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:37:06 np0005625204.localdomain podman[282372]: 2026-02-20 09:37:06.298401636 +0000 UTC m=+0.232200078 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:37:06 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:37:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:07.212 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42376 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2B6A90000000001030307) 
Feb 20 09:37:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:09.111 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:12.253 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:37:13Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:ed:d2 192.168.0.140
Feb 20 09:37:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:14.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42377 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A2D7680000000001030307) 
Feb 20 09:37:16 np0005625204.localdomain sshd[282413]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:37:16 np0005625204.localdomain sshd[282413]: Invalid user sol from 45.148.10.240 port 34238
Feb 20 09:37:17 np0005625204.localdomain sshd[282413]: Connection closed by invalid user sol 45.148.10.240 port 34238 [preauth]
Feb 20 09:37:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:37:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:17.305 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:17 np0005625204.localdomain podman[282415]: 2026-02-20 09:37:17.382828818 +0000 UTC m=+0.135618798 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:37:17 np0005625204.localdomain podman[282415]: 2026-02-20 09:37:17.398245684 +0000 UTC m=+0.151035664 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
container_name=ceilometer_agent_compute)
Feb 20 09:37:17 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:37:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:37:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:37:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:37:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1"
Feb 20 09:37:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:37:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16790 "" "Go-http-client/1.1"
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.204 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.209 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b09f68d2-2b3e-4ea2-9010-af6c08789365', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.205357', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf66b2d2-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '69ae577dc37732c16d83dd9a45628a6b5d6db824b4235bf3f5d5960409113f5c'}]}, 'timestamp': '2026-02-20 09:37:18.210614', '_unique_id': 'eac3e8f193f34cec88e3f49ac9bdddc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.212 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.213 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.214 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 892382541 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd134d19f-a84a-4724-8a40-b6662f82b50f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 892382541, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.214347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6bc11e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '1d3a6c1d7cd98b614f10d2b3436988bd425b9783a99dabea9646323a036c12d9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.214347', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6bd87a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '13b831b5c5fb8addfbc3e29122081e6957c3ea77772953d537747ce43f6c0e22'}]}, 'timestamp': '2026-02-20 09:37:18.244289', '_unique_id': '48f7822ecf06487b844a096ad45e0eb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.247 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73c6df0e-69a4-482e-ac49-06b4830e50fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.246976', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6c578c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': 'e5b3fafaf0121dbf61a367e6309d8fbcf49b544f8109d31b5771da09489db7d7'}]}, 'timestamp': '2026-02-20 09:37:18.247555', '_unique_id': '78856f5f3b9b4fd392ec53aa0ad5c589'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.250 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cacb790b-d7f1-4e61-b630-f990bb39a5f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.250050', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6ccf32-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '2eb2049e7ae9163cb2eb310538c42d1e6b60a4c1221ae7a8233cbd2f61eb4690'}]}, 'timestamp': '2026-02-20 09:37:18.250625', '_unique_id': '05a1c8659e144531aa1a3a623e3eb5e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6bf73473-caa9-4c25-844c-3e533e305fe4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.253006', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6d3f12-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': 'b890c86dc786fcbd11396f4ab51d030b2b04dfd3eb153768c757c9e30e10f7fb'}]}, 'timestamp': '2026-02-20 09:37:18.253565', '_unique_id': '5c0bfaed8a154d268e2ea7c3a596b4cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 221184 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e01391c4-49dd-4375-bcd2-2e5d3b7e5b97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 221184, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.255910', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6db082-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '587777d95d0bd9b1fe6bcd4bb7e067826e501ffc01ab1f5af56c265e6dc280b4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.255910', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6dc0e0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'eacce2afa0a8d90bbcb9fd5cb723134584bb84440c803c3bf02c0fbf3a686ba7'}]}, 'timestamp': '2026-02-20 09:37:18.256880', '_unique_id': '25c3e1ad10094914bdcdfb88ed34773a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.259 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.259 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2e314a59-d88e-4cc4-b31a-048ef362bb81', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.259436', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf6e3b42-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '8d435aba753e7e2d7a3de7ed343d483d7470531841b75b9b21855216eade92ee'}]}, 'timestamp': '2026-02-20 09:37:18.260009', '_unique_id': 'afbabf09586b4ad7a173ea8484c1a3d8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 29305856 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4b8250b-cf7b-456d-8814-08ef7136fdf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29305856, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.262315', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6eab7c-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'c4c82bb217d1cb4f3d785f52887d138f543d45bd80145e708b2eecaf1fbbda11'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.262315', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6ec062-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '8b83831f1693b8b9ad2664ecc2a519280c59d983d7908361e003ac5e292db3c5'}]}, 'timestamp': '2026-02-20 09:37:18.263318', '_unique_id': '9acf0cd8dea94e468b866fdf1c1d1a1c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.264 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 27 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54c51c36-bdec-47df-b6cb-b881d8c439db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 27, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.265985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf6f3d76-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '674e5eceb0b8dd8765a19d294d493a7f1ad8a35b405534caed54ac561b59f5e0'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.265985', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf6f5004-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '940afae317d3228b63f57c0486ae7aa21263d9ba4c156ae7b6a0f25bc29b4cbd'}]}, 'timestamp': '2026-02-20 09:37:18.266982', '_unique_id': '65b70e81365f4fe79cdb613696e0afa8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.269 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.269 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aed8ec76-5654-4980-839e-c9b7166cfd42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.269745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf717550-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '46e4ad3327abd9660ae5fac8a90888e743d515559d926bdb5145ddb92f9c6fd0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.269745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf718a68-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': 'e3e84a9b141e6f43792747d905f8816cc81cfd07872473046a36541742332064'}]}, 'timestamp': '2026-02-20 09:37:18.281591', '_unique_id': 'aa62bf2bbe0145ca867b4c6c03686733'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1076 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c9c9dba-6467-481e-80ef-3aecaef0d876', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1076, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.284052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf71ff52-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '205bde20c8bd5bb37e226dc2cd571f6d2357a8b84ca3a275683d548ca467290a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.284052', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf7210e6-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'b8406bbfcd0483f7e3840f90c01a870b168d8792863c5ac151ee895f9d2cc959'}]}, 'timestamp': '2026-02-20 09:37:18.285029', '_unique_id': '47538f61c80e4601bd08f811aa304be1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.286 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7dbf23db-fcf3-46f1-87d4-9cf291e132fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.287458', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf728274-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '213bf9fea00c420f978499647213120dee793142f990a25766a5cde488fac9a0'}]}, 'timestamp': '2026-02-20 09:37:18.287968', '_unique_id': '1b6197eddf14448caebc18d352ecc20e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46186a6c-86ae-4905-80ca-3bb807fcaa1b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 984, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.290267', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf72f1c8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '604212ad23ab53a6fabeb0b830f1ad493012b9e28a79067a1729a8ef99f88bc7'}]}, 'timestamp': '2026-02-20 09:37:18.290848', '_unique_id': '01e826052b3a48e986b4848efc187469'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 970 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa15539e-4284-477e-971c-b32568911bc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 970, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.293317', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf7365ae-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '77ca163de9c83df833d8ef2430844f8cbf66b9affd3d250f6314fd42a727a5cb'}]}, 'timestamp': '2026-02-20 09:37:18.293887', '_unique_id': 'f17619b25473442facb078b0fbbd76b1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8df96f73-0312-49cb-8355-06e254edadbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.296174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf73d4f8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': 'fda9fd0bea97736b2c89da9d6976f8680147a4adc921412b8819454a06d75852'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.296174', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf73ea2e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '5d28b18303bf8694bab7ab3875c2bcaf193e40cb386d793e096f40c85fa0305a'}]}, 'timestamp': '2026-02-20 09:37:18.297157', '_unique_id': 'ca1843c42aae4323a27c7c6001cbdd5d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.298 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.299 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 11780000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f2897676-5fba-4e7e-9856-6a98f842723d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11780000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:37:18.299592', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'bf78dcc8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.567730861, 'message_signature': '560e64eab02cc80b5b3a20bf2d1176845bb47b4b00a15ab318bc41d988a7a1d9'}]}, 'timestamp': '2026-02-20 09:37:18.329722', '_unique_id': '1a9e0f2b63f14aa19eafb926cdfed0b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4056858143 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '881fa7f5-2fd0-4467-ad7a-8bb2854e34e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4056858143, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.332154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf79528e-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': '1ea4ccd640ea6b21f0b4bc6e4afb380aa52e08603604d9d8c807685f223d6855'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.332154', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf796af8-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.453560348, 'message_signature': 'e4742a261dbceea92b91c6e5f9bbe0601de0c228fb4d0da4610897b5a345a527'}]}, 'timestamp': '2026-02-20 09:37:18.333219', '_unique_id': '9c71586a9ebb49319b569882858cfb8f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.336 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.337 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9eef23b1-15d4-4e21-8707-faee0018b7e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:37:18.337640', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'bf7a2a56-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '28855a85f5d35b0d3411757258f27314e4facb836dcca63c78da506433556787'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:37:18.337640', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'bf7a3eb0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.508956417, 'message_signature': '3b8a400be71d235b1a9f704cfd7b2e3220298478d38d09f7989a38afacf7b08a'}]}, 'timestamp': '2026-02-20 09:37:18.338640', '_unique_id': 'fe02f41e460c427798113e7ba9a715ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.342 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.343 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 984 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '329da2f6-9b5e-42df-b931-6f33bfb22a49', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 984, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.343220', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf7b21a4-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '8851468cdc21eea6772483fd9487cadb3b8a18bf0a517a10f79c5c0d951a6b88'}]}, 'timestamp': '2026-02-20 09:37:18.344477', '_unique_id': '49c22e944aaa4b56bf752956736ff40d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.345 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.347 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '464b5457-d187-4c53-842b-63d04ef03cc1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:37:18.347711', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'bf7bb97a-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.444552919, 'message_signature': '24f4f7703049e4317fda9bacb6b8e20b32843d460dcd853aa8536df7dc75d822'}]}, 'timestamp': '2026-02-20 09:37:18.348472', '_unique_id': 'a4d3330e4b1f45f8a1ac9bf0146c57bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.349 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.351 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 48.83984375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b134f5c-d2d7-4db4-9e4f-6a16ba032791', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 48.83984375, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:37:18.351246', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'bf7c3df0-0e3f-11f1-9294-fa163ef029e2', 'monotonic_time': 10797.567730861, 'message_signature': 'f531553425573865772dca1805f0269b42cc357036f3a6fcfee73f939ffc7601'}]}, 'timestamp': '2026-02-20 09:37:18.351779', '_unique_id': 'c2689147542548e8b2fa07b72378ab0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:37:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:37:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:18.476 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:18.478 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:18 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:18.965 281292 DEBUG nova.compute.manager [None req-05f7fce3-9241-4683-8de9-5db322a59e18 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:37:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:18.970 281292 INFO nova.compute.manager [None req-05f7fce3-9241-4683-8de9-5db322a59e18 141ec720081546bb92f7e9338deb8445 91bce661d685472eb3e7cacab17bf52a - - default default] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Retrieving diagnostics
Feb 20 09:37:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:19.126 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.848 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.849 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 1.3714855
Feb 20 09:37:19 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58326 [20/Feb/2026:09:37:18.475] listener listener/metadata 0/0/0/1374/1374 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.866 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.868 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-keys HTTP/1.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:19 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58332 [20/Feb/2026:09:37:19.866] listener listener/metadata 0/0/0/29/29 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-keys HTTP/1.1"
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.896 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/public-keys HTTP/1.1" status: 404  len: 297 time: 0.0279260
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.910 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.911 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-id HTTP/1.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.923 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.923 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/instance-id HTTP/1.1" status: 200  len: 146 time: 0.0120754
Feb 20 09:37:19 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58346 [20/Feb/2026:09:37:19.910] listener listener/metadata 0/0/0/13/13 200 130 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-id HTTP/1.1"
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.930 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.931 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/ami-launch-index HTTP/1.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.948 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.949 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1" status: 200  len: 136 time: 0.0181334
Feb 20 09:37:19 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58362 [20/Feb/2026:09:37:19.930] listener listener/metadata 0/0/0/19/19 200 120 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/ami-launch-index HTTP/1.1"
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.957 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.958 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/instance-type HTTP/1.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.969 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:19 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58370 [20/Feb/2026:09:37:19.957] listener listener/metadata 0/0/0/12/12 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/instance-type HTTP/1.1"
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.969 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/instance-type HTTP/1.1" status: 200  len: 143 time: 0.0117185
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.976 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.977 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-ipv4 HTTP/1.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.989 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:19 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58382 [20/Feb/2026:09:37:19.976] listener listener/metadata 0/0/0/13/13 200 133 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1"
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.989 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/local-ipv4 HTTP/1.1" status: 200  len: 149 time: 0.0121412
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.996 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:19.996 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-ipv4 HTTP/1.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:19 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.008 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58390 [20/Feb/2026:09:37:19.995] listener listener/metadata 0/0/0/13/13 200 134 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.009 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 150 time: 0.0122588
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.015 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.016 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/hostname HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.030 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.030 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/hostname HTTP/1.1" status: 200  len: 139 time: 0.0137398
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58398 [20/Feb/2026:09:37:20.015] listener listener/metadata 0/0/0/14/14 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/hostname HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.037 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.038 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/local-hostname HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58408 [20/Feb/2026:09:37:20.036] listener listener/metadata 0/0/0/13/13 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/local-hostname HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.049 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.050 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/local-hostname HTTP/1.1" status: 200  len: 139 time: 0.0121861
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.056 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.057 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/user-data HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58414 [20/Feb/2026:09:37:20.056] listener listener/metadata 0/0/0/14/14 404 281 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/user-data HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.070 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/user-data HTTP/1.1" status: 404  len: 297 time: 0.0128839
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.083 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.084 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.096 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58426 [20/Feb/2026:09:37:20.083] listener listener/metadata 0/0/0/13/13 200 139 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.097 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/block-device-mapping HTTP/1.1" status: 200  len: 155 time: 0.0123501
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.102 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.103 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.121 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.122 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1" status: 200  len: 138 time: 0.0197082
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58430 [20/Feb/2026:09:37:20.101] listener listener/metadata 0/0/0/21/21 200 122 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ami HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.128 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.129 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.141 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58446 [20/Feb/2026:09:37:20.127] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.141 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/block-device-mapping/ephemeral0 HTTP/1.1" status: 200  len: 143 time: 0.0122240
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.147 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.148 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.159 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58460 [20/Feb/2026:09:37:20.147] listener listener/metadata 0/0/0/13/13 200 127 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.160 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/block-device-mapping/root HTTP/1.1" status: 200  len: 143 time: 0.0117393
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.167 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.168 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/public-hostname HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.179 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58466 [20/Feb/2026:09:37:20.166] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/public-hostname HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.179 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/public-hostname HTTP/1.1" status: 200  len: 139 time: 0.0114832
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.186 162777 DEBUG eventlet.wsgi.server [-] (162777) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.187 162777 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Accept: */*
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Connection: close
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Content-Type: text/plain
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: Host: 169.254.169.254
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: User-Agent: curl/7.84.0
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Forwarded-For: 192.168.0.140
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: X-Ovn-Network-Id: de929a91-c460-4398-96e0-15a80685a485 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.198 162777 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161
Feb 20 09:37:20 np0005625204.localdomain haproxy-metadata-proxy-de929a91-c460-4398-96e0-15a80685a485[282319]: 192.168.0.140:58474 [20/Feb/2026:09:37:20.186] listener listener/metadata 0/0/0/12/12 200 123 - - ---- 1/1/0/0/0 0/0 "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1"
Feb 20 09:37:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:37:20.199 162777 INFO eventlet.wsgi.server [-] 192.168.0.140,<local> "GET /2009-04-04/meta-data/placement/availability-zone HTTP/1.1" status: 200  len: 139 time: 0.0113223
Feb 20 09:37:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:22.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:24 np0005625204.localdomain sudo[282435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:37:24 np0005625204.localdomain sudo[282435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:37:24 np0005625204.localdomain sudo[282435]: pam_unix(sudo:session): session closed for user root
Feb 20 09:37:24 np0005625204.localdomain sudo[282453]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:37:24 np0005625204.localdomain sudo[282453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:37:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:24.128 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:24 np0005625204.localdomain sudo[282453]: pam_unix(sudo:session): session closed for user root
Feb 20 09:37:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:37:25 np0005625204.localdomain podman[282502]: 2026-02-20 09:37:25.163489756 +0000 UTC m=+0.097437489 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:37:25 np0005625204.localdomain podman[282502]: 2026-02-20 09:37:25.172880286 +0000 UTC m=+0.106827979 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:37:25 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:37:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:37:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:37:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:37:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:37:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:37:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:37:27 np0005625204.localdomain sudo[282528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:37:27 np0005625204.localdomain sudo[282528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:37:27 np0005625204.localdomain sudo[282528]: pam_unix(sudo:session): session closed for user root
Feb 20 09:37:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:27.380 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:29.130 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:29 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:37:29Z|00068|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Feb 20 09:37:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2795 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A310150000000001030307) 
Feb 20 09:37:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2796 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A314280000000001030307) 
Feb 20 09:37:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:32.423 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:37:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42378 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A317680000000001030307) 
Feb 20 09:37:32 np0005625204.localdomain podman[282546]: 2026-02-20 09:37:32.534948662 +0000 UTC m=+0.083368163 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:37:32 np0005625204.localdomain podman[282546]: 2026-02-20 09:37:32.568585801 +0000 UTC m=+0.117005282 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:37:32 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:37:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2797 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A31C280000000001030307) 
Feb 20 09:37:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:34.132 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24915 DF PROTO=TCP SPT=42432 DPT=9102 SEQ=3451200846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A31F690000000001030307) 
Feb 20 09:37:34 np0005625204.localdomain sshd[282570]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:37:35 np0005625204.localdomain sshd[282570]: Invalid user test8 from 18.221.252.160 port 47814
Feb 20 09:37:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:37:35 np0005625204.localdomain sshd[282570]: Received disconnect from 18.221.252.160 port 47814:11: Bye Bye [preauth]
Feb 20 09:37:35 np0005625204.localdomain sshd[282570]: Disconnected from invalid user test8 18.221.252.160 port 47814 [preauth]
Feb 20 09:37:35 np0005625204.localdomain podman[282572]: 2026-02-20 09:37:35.09343099 +0000 UTC m=+0.065810333 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_id=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64)
Feb 20 09:37:35 np0005625204.localdomain podman[282572]: 2026-02-20 09:37:35.109732213 +0000 UTC m=+0.082111586 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': 
'/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Feb 20 09:37:35 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:37:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:37:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:37:37 np0005625204.localdomain podman[282592]: 2026-02-20 09:37:37.149483067 +0000 UTC m=+0.082719844 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 20 09:37:37 np0005625204.localdomain podman[282592]: 2026-02-20 09:37:37.189501842 +0000 UTC m=+0.122738589 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 20 09:37:37 np0005625204.localdomain podman[282593]: 2026-02-20 09:37:37.201494802 +0000 UTC m=+0.132508821 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:37:37 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:37:37 np0005625204.localdomain podman[282593]: 2026-02-20 09:37:37.231616123 +0000 UTC m=+0.162630122 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:37:37 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:37:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:37.456 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2798 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A32BE90000000001030307) 
Feb 20 09:37:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:39.134 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:42.492 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:44.136 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:45.735 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:45.736 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:45.774 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:45.774 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:37:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:45.774 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:37:45 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2799 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A34B680000000001030307) 
Feb 20 09:37:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:46.824 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:37:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:46.825 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:37:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:46.825 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:37:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:46.826 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:37:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:47.523 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:37:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:37:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:37:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1"
Feb 20 09:37:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:37:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16793 "" "Go-http-client/1.1"
Feb 20 09:37:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:37:48 np0005625204.localdomain podman[282635]: 2026-02-20 09:37:48.148139881 +0000 UTC m=+0.087375268 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, 
maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:37:48 np0005625204.localdomain podman[282635]: 2026-02-20 09:37:48.159033147 +0000 UTC m=+0.098268534 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible)
Feb 20 09:37:48 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:37:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 09:37:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.445 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.465 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.466 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.466 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.467 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.467 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.468 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.468 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.469 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.469 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.470 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.486 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.487 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.487 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.487 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.488 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:37:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:49.966 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.083 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.084 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.324 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.326 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12328MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.326 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.327 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.401 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.401 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.402 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.448 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.913 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:37:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:50.919 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:37:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:51.034 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:37:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:51.170 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:37:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:51.171 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:37:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:52.555 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:54.140 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:55 np0005625204.localdomain sshd[282699]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:37:55 np0005625204.localdomain sshd[282699]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:37:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:37:55 np0005625204.localdomain podman[282701]: 2026-02-20 09:37:55.923390452 +0000 UTC m=+0.089027519 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:37:55 np0005625204.localdomain podman[282701]: 2026-02-20 09:37:55.931055878 +0000 UTC m=+0.096692895 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:37:55 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:37:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:37:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:37:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:37:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:37:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:37:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:37:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:57.564 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:37:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:37:59.142 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14433 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A385460000000001030307) 
Feb 20 09:38:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14434 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A389680000000001030307) 
Feb 20 09:38:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2800 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A38B680000000001030307) 
Feb 20 09:38:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:02.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:38:03 np0005625204.localdomain podman[282725]: 2026-02-20 09:38:03.148723888 +0000 UTC m=+0.085748438 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:38:03 np0005625204.localdomain podman[282725]: 2026-02-20 09:38:03.159330325 +0000 UTC m=+0.096354845 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:38:03 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:38:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14435 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A391690000000001030307) 
Feb 20 09:38:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:04.144 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42379 DF PROTO=TCP SPT=43962 DPT=9102 SEQ=683381799 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A395680000000001030307) 
Feb 20 09:38:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:38:06.001 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:38:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:38:06.002 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:38:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:38:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:38:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:38:06 np0005625204.localdomain systemd[1]: tmp-crun.4eneJZ.mount: Deactivated successfully.
Feb 20 09:38:06 np0005625204.localdomain podman[282748]: 2026-02-20 09:38:06.149706494 +0000 UTC m=+0.084267692 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Feb 20 09:38:06 np0005625204.localdomain podman[282748]: 2026-02-20 09:38:06.186205801 +0000 UTC m=+0.120767019 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, release=1770267347, name=ubi9/ubi-minimal)
Feb 20 09:38:06 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:38:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:07.628 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14436 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3A1280000000001030307) 
Feb 20 09:38:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:38:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:38:08 np0005625204.localdomain podman[282768]: 2026-02-20 09:38:08.145669026 +0000 UTC m=+0.085796299 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:38:08 np0005625204.localdomain systemd[1]: tmp-crun.qxKaIL.mount: Deactivated successfully.
Feb 20 09:38:08 np0005625204.localdomain podman[282769]: 2026-02-20 09:38:08.203538003 +0000 UTC m=+0.138092974 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:38:08 np0005625204.localdomain podman[282769]: 2026-02-20 09:38:08.210433855 +0000 UTC m=+0.144988826 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:38:08 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:38:08 np0005625204.localdomain podman[282768]: 2026-02-20 09:38:08.265132794 +0000 UTC m=+0.205260087 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 09:38:08 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:38:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:09.147 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:12.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:14.149 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:16 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14437 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3C1680000000001030307) 
Feb 20 09:38:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:17.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:38:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:38:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:38:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1"
Feb 20 09:38:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:38:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16795 "" "Go-http-client/1.1"
Feb 20 09:38:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:38:19 np0005625204.localdomain podman[282811]: 2026-02-20 09:38:19.147502338 +0000 UTC m=+0.068706572 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true)
Feb 20 09:38:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:19.150 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:19 np0005625204.localdomain podman[282811]: 2026-02-20 09:38:19.162006786 +0000 UTC m=+0.083211060 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:38:19 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:38:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:22.708 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:24.151 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:38:26 np0005625204.localdomain podman[282830]: 2026-02-20 09:38:26.12424606 +0000 UTC m=+0.065333248 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:38:26 np0005625204.localdomain podman[282830]: 2026-02-20 09:38:26.162038826 +0000 UTC m=+0.103125994 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:38:26 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:38:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:38:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:38:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:38:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:38:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:38:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:38:27 np0005625204.localdomain sudo[282855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:38:27 np0005625204.localdomain sudo[282855]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:38:27 np0005625204.localdomain sudo[282855]: pam_unix(sudo:session): session closed for user root
Feb 20 09:38:27 np0005625204.localdomain sudo[282873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:38:27 np0005625204.localdomain sudo[282873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:38:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:27.712 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:28 np0005625204.localdomain sudo[282873]: pam_unix(sudo:session): session closed for user root
Feb 20 09:38:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:29.152 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:30 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8531 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3FA750000000001030307) 
Feb 20 09:38:31 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8532 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A3FE680000000001030307) 
Feb 20 09:38:31 np0005625204.localdomain sshd[282924]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:38:31 np0005625204.localdomain sshd[282924]: Accepted publickey for zuul from 38.102.83.114 port 50884 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:38:31 np0005625204.localdomain systemd-logind[759]: New session 62 of user zuul.
Feb 20 09:38:31 np0005625204.localdomain systemd[1]: Started Session 62 of User zuul.
Feb 20 09:38:31 np0005625204.localdomain sshd[282924]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 09:38:32 np0005625204.localdomain sudo[282928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:38:32 np0005625204.localdomain sudo[282928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:38:32 np0005625204.localdomain sudo[282928]: pam_unix(sudo:session): session closed for user root
Feb 20 09:38:32 np0005625204.localdomain sudo[282961]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pydnmiglamzjwxwypypentwlljhfabrk ; /usr/bin/python3
Feb 20 09:38:32 np0005625204.localdomain sudo[282961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 09:38:32 np0005625204.localdomain python3[282964]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:38:32 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14438 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A401680000000001030307) 
Feb 20 09:38:32 np0005625204.localdomain systemd-journald[48359]: Field hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.7 (252 of 333 items), suggesting rotation.
Feb 20 09:38:32 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:38:32 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:38:32 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:38:32 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:38:32 np0005625204.localdomain subscription-manager[282965]: Unregistered machine with identity: 430a9023-94d5-4ff5-8ad4-f0155783873a
Feb 20 09:38:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:32.733 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:32 np0005625204.localdomain sudo[282961]: pam_unix(sudo:session): session closed for user root
Feb 20 09:38:33 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8533 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A406680000000001030307) 
Feb 20 09:38:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:38:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:34.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:34 np0005625204.localdomain podman[282968]: 2026-02-20 09:38:34.18779137 +0000 UTC m=+0.121363927 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:38:34 np0005625204.localdomain podman[282968]: 2026-02-20 09:38:34.199937185 +0000 UTC m=+0.133509783 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:38:34 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:38:34 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2801 DF PROTO=TCP SPT=39566 DPT=9102 SEQ=1234505415 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A409690000000001030307) 
Feb 20 09:38:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:38:37 np0005625204.localdomain podman[282990]: 2026-02-20 09:38:37.144498351 +0000 UTC m=+0.082214670 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vcs-type=git, maintainer=Red Hat, Inc.)
Feb 20 09:38:37 np0005625204.localdomain podman[282990]: 2026-02-20 09:38:37.157256194 +0000 UTC m=+0.094972583 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter)
Feb 20 09:38:37 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:38:37 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8534 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A416280000000001030307) 
Feb 20 09:38:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:37.762 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:38:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:38:39 np0005625204.localdomain podman[283010]: 2026-02-20 09:38:39.145707465 +0000 UTC m=+0.083301723 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 20 09:38:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:39.183 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:39 np0005625204.localdomain podman[283011]: 2026-02-20 09:38:39.205537462 +0000 UTC m=+0.140448067 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 20 09:38:39 np0005625204.localdomain podman[283011]: 2026-02-20 09:38:39.235438094 +0000 UTC m=+0.170348699 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Feb 20 09:38:39 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:38:39 np0005625204.localdomain podman[283010]: 2026-02-20 09:38:39.290448473 +0000 UTC m=+0.228042731 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 20 09:38:39 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:38:40 np0005625204.localdomain sshd[283057]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:38:42 np0005625204.localdomain sshd[283057]: Invalid user n8n from 188.166.218.64 port 53188
Feb 20 09:38:42 np0005625204.localdomain sshd[283057]: Received disconnect from 188.166.218.64 port 53188:11: Bye Bye [preauth]
Feb 20 09:38:42 np0005625204.localdomain sshd[283057]: Disconnected from invalid user n8n 188.166.218.64 port 53188 [preauth]
Feb 20 09:38:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:42.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:44.212 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:46 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8535 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A437680000000001030307) 
Feb 20 09:38:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:38:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:38:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:38:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1"
Feb 20 09:38:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:38:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16798 "" "Go-http-client/1.1"
Feb 20 09:38:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:47.821 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:49.247 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:38:50 np0005625204.localdomain systemd[1]: tmp-crun.rhkjpv.mount: Deactivated successfully.
Feb 20 09:38:50 np0005625204.localdomain podman[283059]: 2026-02-20 09:38:50.154821031 +0000 UTC m=+0.093419254 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 20 09:38:50 np0005625204.localdomain podman[283059]: 2026-02-20 09:38:50.167030298 +0000 UTC m=+0.105628521 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:38:50 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.172 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.174 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.174 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.174 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.589 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.589 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.590 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:38:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:51.590 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.040 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.055 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.055 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.056 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.056 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.057 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.057 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.058 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.058 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.058 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.059 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.077 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.078 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.078 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.079 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.079 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.572 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.637 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.638 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.855 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.890 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.892 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12331MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.893 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.893 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.954 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.954 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:38:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:52.955 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:38:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:53.001 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:38:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:53.480 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:38:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:53.486 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:38:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:53.505 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:38:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:53.508 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:38:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:53.508 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:38:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:54.286 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:38:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:38:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:38:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:38:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:38:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:38:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:38:57 np0005625204.localdomain podman[283124]: 2026-02-20 09:38:57.143246355 +0000 UTC m=+0.080131465 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:38:57 np0005625204.localdomain podman[283124]: 2026-02-20 09:38:57.176191181 +0000 UTC m=+0.113076341 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:38:57 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:38:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:57.857 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:38:59.288 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:38:59 np0005625204.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 20 09:39:00 np0005625204.localdomain sshd[283148]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:39:00 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=614 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A46FA50000000001030307) 
Feb 20 09:39:00 np0005625204.localdomain sshd[283148]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:39:01 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=615 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A473A80000000001030307) 
Feb 20 09:39:02 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8536 DF PROTO=TCP SPT=47216 DPT=9102 SEQ=60590503 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A477680000000001030307) 
Feb 20 09:39:02 np0005625204.localdomain sudo[283150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:02 np0005625204.localdomain sudo[283150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:02 np0005625204.localdomain sudo[283150]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:02.888 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:03 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=616 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A47BA90000000001030307) 
Feb 20 09:39:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:04.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:04 np0005625204.localdomain sudo[283168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:04 np0005625204.localdomain sudo[283168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:39:04 np0005625204.localdomain sudo[283168]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:04 np0005625204.localdomain systemd[1]: tmp-crun.UqdEeV.mount: Deactivated successfully.
Feb 20 09:39:04 np0005625204.localdomain podman[283186]: 2026-02-20 09:39:04.452152781 +0000 UTC m=+0.084696007 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:39:04 np0005625204.localdomain podman[283186]: 2026-02-20 09:39:04.489185304 +0000 UTC m=+0.121728580 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:39:04 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:39:04 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14439 DF PROTO=TCP SPT=40906 DPT=9102 SEQ=2418160281 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A47F680000000001030307) 
Feb 20 09:39:05 np0005625204.localdomain sudo[283210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:05 np0005625204.localdomain sudo[283210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:05 np0005625204.localdomain sudo[283210]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:39:06.002 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:39:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:39:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:39:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:39:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:39:07 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=617 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A48B690000000001030307) 
Feb 20 09:39:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:07.930 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:39:08 np0005625204.localdomain podman[283228]: 2026-02-20 09:39:08.132170407 +0000 UTC m=+0.073399057 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal)
Feb 20 09:39:08 np0005625204.localdomain podman[283228]: 2026-02-20 09:39:08.145718305 +0000 UTC m=+0.086946965 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, vcs-type=git, release=1770267347, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64)
Feb 20 09:39:08 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:39:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:09.377 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:39:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:39:10 np0005625204.localdomain podman[283249]: 2026-02-20 09:39:10.14821219 +0000 UTC m=+0.085177350 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:39:10 np0005625204.localdomain podman[283248]: 2026-02-20 09:39:10.197360667 +0000 UTC m=+0.136635879 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Feb 20 09:39:10 np0005625204.localdomain podman[283249]: 2026-02-20 09:39:10.228093325 +0000 UTC m=+0.165058455 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 20 09:39:10 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:39:10 np0005625204.localdomain podman[283248]: 2026-02-20 09:39:10.283915378 +0000 UTC m=+0.223190560 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:39:10 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:39:11 np0005625204.localdomain sshd[283290]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:39:12 np0005625204.localdomain sshd[283290]: Received disconnect from 182.93.7.194 port 46166:11: Bye Bye [preauth]
Feb 20 09:39:12 np0005625204.localdomain sshd[283290]: Disconnected from authenticating user root 182.93.7.194 port 46166 [preauth]
Feb 20 09:39:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:12.970 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:13 np0005625204.localdomain sshd[283292]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:39:13 np0005625204.localdomain sshd[283292]: Accepted publickey for tripleo-admin from 192.168.122.11 port 37722 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:39:13 np0005625204.localdomain systemd-logind[759]: New session 63 of user tripleo-admin.
Feb 20 09:39:13 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 20 09:39:13 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 20 09:39:13 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 20 09:39:13 np0005625204.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Queued start job for default target Main User Target.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Created slice User Application Slice.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Reached target Paths.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Reached target Timers.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Starting D-Bus User Message Bus Socket...
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Starting Create User's Volatile Files and Directories...
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Listening on D-Bus User Message Bus Socket.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Reached target Sockets.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Finished Create User's Volatile Files and Directories.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Reached target Basic System.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Reached target Main User Target.
Feb 20 09:39:13 np0005625204.localdomain systemd[283296]: Startup finished in 170ms.
Feb 20 09:39:13 np0005625204.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 20 09:39:13 np0005625204.localdomain systemd[1]: Started Session 63 of User tripleo-admin.
Feb 20 09:39:13 np0005625204.localdomain sshd[283292]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 20 09:39:14 np0005625204.localdomain sudo[283437]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqfakjiirqbvfbwnspkneebdnclpiowp ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771580353.8067753-62823-128508241316596/AnsiballZ_blockinfile.py
Feb 20 09:39:14 np0005625204.localdomain sudo[283437]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 09:39:14 np0005625204.localdomain python3[283439]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:39:14 np0005625204.localdomain sudo[283437]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:14.412 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:15 np0005625204.localdomain sudo[283581]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vftmfucxmfiokbmtkuozmwlfvqvdirhu ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771580354.6060088-62839-229050608328416/AnsiballZ_systemd.py
Feb 20 09:39:15 np0005625204.localdomain sudo[283581]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 09:39:15 np0005625204.localdomain python3[283583]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 20 09:39:15 np0005625204.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:b0:bd:ac MACDST=fa:16:3e:ba:18:b1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=618 DF PROTO=TCP SPT=33758 DPT=9102 SEQ=3728658166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080A59A4AB680000000001030307) 
Feb 20 09:39:16 np0005625204.localdomain systemd[1]: Stopping Netfilter Tables...
Feb 20 09:39:16 np0005625204.localdomain systemd[1]: nftables.service: Deactivated successfully.
Feb 20 09:39:16 np0005625204.localdomain systemd[1]: Stopped Netfilter Tables.
Feb 20 09:39:16 np0005625204.localdomain systemd[1]: Starting Netfilter Tables...
Feb 20 09:39:16 np0005625204.localdomain systemd[1]: Finished Netfilter Tables.
Feb 20 09:39:16 np0005625204.localdomain sudo[283581]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:17 np0005625204.localdomain sshd[283607]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:39:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:39:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:39:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:39:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 148534 "" "Go-http-client/1.1"
Feb 20 09:39:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:39:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16805 "" "Go-http-client/1.1"
Feb 20 09:39:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:18.000 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.206 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.207 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain sshd[283607]: Invalid user mysqladmin from 54.36.99.29 port 54076
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.240 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.240 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73b692cd-f415-470c-b4ab-0c771134f529', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.207883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f1dc1c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'ee9c663745fe97c61f07363f06a3fc1b3c1e5142efa73d6a16537217fdcabd1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.207883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f1f0c6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '8b841ebbaecf2f042798d138f3c1364011eca809eb6ccbaa972bf27a3f20b2d1'}]}, 'timestamp': '2026-02-20 09:39:18.241317', '_unique_id': 'c54326b00586456c8e358a1f533c17d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.242 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.244 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 8786 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5329101e-6210-4efe-b4bf-76ee4d2d0fc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 8786, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.244317', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f33486-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '6003d56d6c185069487bdb80ad7b10755df6f422b37ec764c53e4114814247e9'}]}, 'timestamp': '2026-02-20 09:39:18.249671', '_unique_id': '6a9380c231cd41279de995091f2ad8f2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.250 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.251 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.252 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f3040e2-1e59-494e-9421-629a8e6d5087', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.252066', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f3a704-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'a92123936a442f2fede5ca3f55899b4490410d6937770069389ae024f6071607'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.252066', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f3bb68-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'c67afcdd1ebc5d5c897cc9cb80bcafc403d222c044ccc743fe4dae802247e580'}]}, 'timestamp': '2026-02-20 09:39:18.253064', '_unique_id': 'e49b8534f8c74c259132de9a5f2e213d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.255 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '946aace1-ebca-45a0-9687-a75e5f8d0bd1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.255443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f42bac-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': 'b778305166c57ad0197da68064e340c9c33c9f2d08ec8a42f5f395a1bf11d7f3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.255443', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f43e58-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '5274ae861fba453ce040f40746a1034e9753d3add6d1a0b69637d9f3f1433612'}]}, 'timestamp': '2026-02-20 09:39:18.256405', '_unique_id': 'de7b27e97d444ceb860c38b488d5c76f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.259 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0ec1c70-30a7-49f7-8ba5-002805fcc111', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.258813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f4ae24-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '4d12b331b486fa7af372ab7091afcf36485bfb53abc66711dd616ee2a13306c2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.258813', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f4c3aa-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '7bd86e9a1ee20c342115a53766b6f4adfc6c58f6d43b36df0c909ebb65a98f60'}]}, 'timestamp': '2026-02-20 09:39:18.259932', '_unique_id': '0ef628fe89e440f99936f904d3b9b311'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.260 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6dce8f2e-87e3-4f54-89f5-4e93b15f29d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.262189', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f531dc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': 'd41ffc208b0f2c4e63c326ad3f9887254326c4c590371fd77678c8b4dfa034c3'}]}, 'timestamp': '2026-02-20 09:39:18.262696', '_unique_id': 'd5ead5902ac245cdb4d54ca57d047978'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.263 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.264 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f27a502-f0b0-4b10-9df7-19f172aee987', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.264767', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f59622-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '8c50742407c66f2910a5218e304c3b42a7e1432626c8e3989db7b01de26d43b1'}]}, 'timestamp': '2026-02-20 09:39:18.265225', '_unique_id': '4ab51598e3ba4120b910b509dbd57cad'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '835ab93c-b5df-41c8-b87f-8a986e6d7a6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.267303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06f7ad54-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '8c61f5995d17b8eae4c2aa55bc48ea93c479ea0f353ffa3cc08171013c81b39e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.267303', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06f7bea2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '56309ac1167b2086bf29485462eff2f595f7891a578c9f376afa0ffb3dfd4500'}]}, 'timestamp': '2026-02-20 09:39:18.279337', '_unique_id': '3419b9dcb4ad4b05af743fd18aca0bbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.281 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f0a9b131-3fca-49fd-8823-bf6405ea8b13', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.281953', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f8363e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '527886f55b226962a8f6f893f3425863455ebc0242bfefa32c07d7722ac711c5'}]}, 'timestamp': '2026-02-20 09:39:18.282427', '_unique_id': '0ccadb522f4f495d95ac54b1d10054f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.283 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.284 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3080cc46-cda0-4c89-ac61-a64ddcc04934', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.284500', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f89a52-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '72d47c827183392c0ff599418df6e70c6bd651408caa34c7d1ee65530c4f5ce9'}]}, 'timestamp': '2026-02-20 09:39:18.284992', '_unique_id': '28cd9e572c454a6f9fc874a130aa6aae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.286 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23fbc0af-636a-4d5f-a4cd-133c0bc8932e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.287226', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06f90352-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '57a493a0a076a8fc018d7e107ee6cf1bc84a78afe1196b1b8bb64f560c364335'}]}, 'timestamp': '2026-02-20 09:39:18.287707', '_unique_id': 'c42b9ff86dcc40c399127ffe6d0625c4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.289 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain sshd[283607]: Received disconnect from 54.36.99.29 port 54076:11: Bye Bye [preauth]
Feb 20 09:39:18 np0005625204.localdomain sshd[283607]: Disconnected from invalid user mysqladmin 54.36.99.29 port 54076 [preauth]
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 12790000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b21e0a7-3cfa-4010-9aba-6ba6a144eb75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12790000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:39:18.289925', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '06fc3ed2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.547470831, 'message_signature': '33f55981dadb3f2fded64b016b56856c9630d20152505bc2ec2f86e9cfbe7e45'}]}, 'timestamp': '2026-02-20 09:39:18.308889', '_unique_id': 'c8d0c37a760547418ba4fa0dcd03d7fc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.310 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32f3d706-994f-40a8-adfd-256baf74a4d4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.310993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06fca3cc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '99be17f8c70d875c931cb6087003622d9bf28a37538943628f1fc31a45002d1a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.310993', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06fcb330-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '4c402f859b728a6cb5efd6cfe4b3ef8f3693125720825df5871740ea47b390c0'}]}, 'timestamp': '2026-02-20 09:39:18.311843', '_unique_id': 'a2d6f59d4a6b42668ec275e6f733fa27'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1dbfd6a-3f8e-47ed-8767-8583cbf952c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.313950', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fd1776-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': 'ed2076580d54a86cfa626750130bc0cad8116a85daffbc2361fe90e53be376db'}]}, 'timestamp': '2026-02-20 09:39:18.314408', '_unique_id': '718d4925cff247719ef1e2ddabbb711c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '477d61d3-97d8-40fc-85e2-f489ac8c1c1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.316447', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fd7a04-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '725a9ba815b87b2fcc493ae427d6ca7c922fb1d948332c5ee879df6f91b9ad59'}]}, 'timestamp': '2026-02-20 09:39:18.316930', '_unique_id': 'cf9fa9ad75e94ee08cfe097fd0fe2275'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.317 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.318 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 5839 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab52d0c2-f99d-46c2-81e5-b6e5ffb70612', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 5839, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.318987', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fddbf2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '8f5d57e8aa01914c40219d282109d3c208316498119e5fc8b0fb1ec389978193'}]}, 'timestamp': '2026-02-20 09:39:18.319435', '_unique_id': '0fa6be3f2f76403c932940c9ca671f7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '023fb75c-4a44-4687-8ad8-27219f4b4614', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.321468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06fe3ea8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '69e23a18ed303adc8733045c0f78729ac74c8fa0aee5ec510793f2dda2616559'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.321468', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06fe4f24-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.447081972, 'message_signature': '95dd703cbe16c0b1e454a0cc2efdc524204101a631b2bce571fef2662238cefe'}]}, 'timestamp': '2026-02-20 09:39:18.322364', '_unique_id': '7dba012e021e4c0a9851e5e524be3222'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.324 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4f6c0c1-0403-4af8-abfb-56556a4f5865', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:39:18.324435', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '06feb1ee-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.547470831, 'message_signature': 'dbe6757ab778d5cf4338acacb65f53c03265d9c094a81297dd13d78028c68914'}]}, 'timestamp': '2026-02-20 09:39:18.324898', '_unique_id': '8c6efea9a48840cfa8b142f922008572'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b7ee831-e59d-4569-9883-b228b1794a7b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.326935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06ff1292-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '8265f9c1f66d77397c840668359c9153ddbe18fde9f0fb880d468e8b693b05c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.326935', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06ff228c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '2192187047f62b311f84a113c5b48253133321ed0e9fefbc5f43eb6f815dbf3c'}]}, 'timestamp': '2026-02-20 09:39:18.327800', '_unique_id': 'ea31809b1307488da660b807a2c49de2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7eae78f8-dd82-4ec0-98eb-5072d01f4a19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:39:18.330023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '06ff8ad8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': 'd4010639a29e3fd462dde92af2efb4532351908aeef8a56763602767958b2823'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:39:18.330023', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '06ff9bb8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.506501206, 'message_signature': '52bd550b7187682bf4e57f0ada400cf3865427f0105e70d4673e482f28967c26'}]}, 'timestamp': '2026-02-20 09:39:18.330874', '_unique_id': '36caaf21a082427db728c94b6b1f9c4d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.333 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce79320e-5c0a-484d-aeda-cae262a2267d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:39:18.332958', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '06fffdd8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 10917.483515237, 'message_signature': '1a1737010af18add42c883e5a621c61d56f2b00bae9c8650f4cd2a4163becf43'}]}, 'timestamp': '2026-02-20 09:39:18.333410', '_unique_id': 'aa664bb066ff4ea6a1f94afb2a3b2ac2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:39:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:39:18.334 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:39:18 np0005625204.localdomain sudo[283609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:39:18 np0005625204.localdomain sudo[283609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:18 np0005625204.localdomain sudo[283609]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:19 np0005625204.localdomain sudo[283627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:39:19 np0005625204.localdomain sudo[283627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:19 np0005625204.localdomain sudo[283627]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:19.442 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:39:21 np0005625204.localdomain systemd[1]: tmp-crun.hPnKsn.mount: Deactivated successfully.
Feb 20 09:39:21 np0005625204.localdomain podman[283664]: 2026-02-20 09:39:21.152805647 +0000 UTC m=+0.085193691 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:39:21 np0005625204.localdomain podman[283664]: 2026-02-20 09:39:21.162705993 +0000 UTC m=+0.095094057 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 20 09:39:21 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:39:22 np0005625204.localdomain sudo[283681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:22 np0005625204.localdomain sudo[283681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:22 np0005625204.localdomain sudo[283681]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:23.003 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:23 np0005625204.localdomain sudo[283699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:23 np0005625204.localdomain sudo[283699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:23 np0005625204.localdomain sudo[283699]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:23 np0005625204.localdomain sudo[283717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:23 np0005625204.localdomain sudo[283717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:23 np0005625204.localdomain sudo[283717]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:24.479 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:24 np0005625204.localdomain sudo[283735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:24 np0005625204.localdomain sudo[283735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:24 np0005625204.localdomain sudo[283735]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:26 np0005625204.localdomain sudo[283753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:26 np0005625204.localdomain sudo[283753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:26 np0005625204.localdomain sudo[283753]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:39:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:39:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:39:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:39:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:39:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:39:27 np0005625204.localdomain sudo[283771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:39:27 np0005625204.localdomain sudo[283771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:39:27 np0005625204.localdomain sudo[283771]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:27 np0005625204.localdomain sudo[283790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:39:27 np0005625204.localdomain sudo[283790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:27 np0005625204.localdomain systemd[1]: tmp-crun.sMSXpq.mount: Deactivated successfully.
Feb 20 09:39:27 np0005625204.localdomain podman[283789]: 2026-02-20 09:39:27.409531954 +0000 UTC m=+0.103238868 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:39:27 np0005625204.localdomain podman[283789]: 2026-02-20 09:39:27.419203732 +0000 UTC m=+0.112910676 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:39:27 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:39:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:28.006 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 2026-02-20 09:39:28.036764796 +0000 UTC m=+0.083933072 container create e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, name=rhceph, RELEASE=main, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: Started libpod-conmon-e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771.scope.
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 2026-02-20 09:39:28.005347856 +0000 UTC m=+0.052516202 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 2026-02-20 09:39:28.135357309 +0000 UTC m=+0.182525585 container init e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 2026-02-20 09:39:28.146830124 +0000 UTC m=+0.193998410 container start e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, architecture=x86_64, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, vcs-type=git, GIT_CLEAN=True, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 2026-02-20 09:39:28.147672599 +0000 UTC m=+0.194840935 container attach e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, build-date=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, name=rhceph, version=7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:39:28 np0005625204.localdomain vibrant_jemison[283891]: 167 167
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: libpod-e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771.scope: Deactivated successfully.
Feb 20 09:39:28 np0005625204.localdomain podman[283876]: 2026-02-20 09:39:28.152199309 +0000 UTC m=+0.199367615 container died e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347)
Feb 20 09:39:28 np0005625204.localdomain podman[283896]: 2026-02-20 09:39:28.249181852 +0000 UTC m=+0.083104835 container remove e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jemison, GIT_CLEAN=True, architecture=x86_64, vcs-type=git, RELEASE=main, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=1770267347, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph)
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: libpod-conmon-e46634454bd02979f0ab2c6cb412238a00fd1afe2dd41373e982406c7277f771.scope: Deactivated successfully.
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:39:28 np0005625204.localdomain systemd-rc-local-generator[283934]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:39:28 np0005625204.localdomain systemd-sysv-generator[283940]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e7762a87729e6faf8de3f77f15b2bfe28c40ae41738f2fe65325dcca7d8c2ca4-merged.mount: Deactivated successfully.
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:39:28 np0005625204.localdomain systemd-sysv-generator[283982]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:39:28 np0005625204.localdomain systemd-rc-local-generator[283979]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:28 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:39:29 np0005625204.localdomain systemd[1]: Starting Ceph mds.mds.np0005625204.wnsphl for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 09:39:29 np0005625204.localdomain podman[284043]: 
Feb 20 09:39:29 np0005625204.localdomain podman[284043]: 2026-02-20 09:39:29.406353113 +0000 UTC m=+0.078513955 container create f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., version=7)
Feb 20 09:39:29 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:39:29 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 09:39:29 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 09:39:29 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8787d6f317aff53ba0fdc5c30f1a1911fd26611ed096de37d13cf2d1d6681323/merged/var/lib/ceph/mds/ceph-mds.np0005625204.wnsphl supports timestamps until 2038 (0x7fffffff)
Feb 20 09:39:29 np0005625204.localdomain podman[284043]: 2026-02-20 09:39:29.456447859 +0000 UTC m=+0.128608691 container init f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 20 09:39:29 np0005625204.localdomain podman[284043]: 2026-02-20 09:39:29.467176941 +0000 UTC m=+0.139337773 container start f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z)
Feb 20 09:39:29 np0005625204.localdomain bash[284043]: f489614c7976d084794979376ce3e3c066a553b0e4a87593ef4817bb5a855475
Feb 20 09:39:29 np0005625204.localdomain podman[284043]: 2026-02-20 09:39:29.371486947 +0000 UTC m=+0.043647799 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:39:29 np0005625204.localdomain systemd[1]: Started Ceph mds.mds.np0005625204.wnsphl for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 09:39:29 np0005625204.localdomain ceph-mds[284061]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 09:39:29 np0005625204.localdomain ceph-mds[284061]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mds, pid 2
Feb 20 09:39:29 np0005625204.localdomain ceph-mds[284061]: main not setting numa affinity
Feb 20 09:39:29 np0005625204.localdomain ceph-mds[284061]: pidfile_write: ignore empty --pid-file
Feb 20 09:39:29 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mds-mds-np0005625204-wnsphl[284057]: starting mds.mds.np0005625204.wnsphl at 
Feb 20 09:39:29 np0005625204.localdomain sudo[283790]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:29 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Updating MDS map to version 7 from mon.1
Feb 20 09:39:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:29.533 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:30 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Updating MDS map to version 8 from mon.1
Feb 20 09:39:30 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Monitors have assigned me to become a standby.
Feb 20 09:39:32 np0005625204.localdomain sshd[282927]: Received disconnect from 38.102.83.114 port 50884:11: disconnected by user
Feb 20 09:39:32 np0005625204.localdomain sshd[282927]: Disconnected from user zuul 38.102.83.114 port 50884
Feb 20 09:39:32 np0005625204.localdomain sshd[282924]: pam_unix(sshd:session): session closed for user zuul
Feb 20 09:39:32 np0005625204.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Feb 20 09:39:32 np0005625204.localdomain systemd-logind[759]: Session 62 logged out. Waiting for processes to exit.
Feb 20 09:39:32 np0005625204.localdomain systemd-logind[759]: Removed session 62.
Feb 20 09:39:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:33.008 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:34.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:34 np0005625204.localdomain sudo[284081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:34 np0005625204.localdomain sudo[284081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:39:34 np0005625204.localdomain sudo[284081]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:34 np0005625204.localdomain sudo[284105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:39:34 np0005625204.localdomain systemd[1]: tmp-crun.fmLnjx.mount: Deactivated successfully.
Feb 20 09:39:34 np0005625204.localdomain sudo[284105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:34 np0005625204.localdomain podman[284099]: 2026-02-20 09:39:34.902290455 +0000 UTC m=+0.088667388 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:39:34 np0005625204.localdomain sudo[284105]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:34 np0005625204.localdomain podman[284099]: 2026-02-20 09:39:34.911167569 +0000 UTC m=+0.097544512 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:39:34 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:39:34 np0005625204.localdomain sudo[284139]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:39:34 np0005625204.localdomain sudo[284139]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:35 np0005625204.localdomain podman[284229]: 2026-02-20 09:39:35.818241789 +0000 UTC m=+0.085347425 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, name=rhceph, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph)
Feb 20 09:39:35 np0005625204.localdomain podman[284229]: 2026-02-20 09:39:35.905989288 +0000 UTC m=+0.173094914 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, distribution-scope=public, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, vcs-type=git, ceph=True, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.component=rhceph-container)
Feb 20 09:39:36 np0005625204.localdomain sudo[284139]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:36 np0005625204.localdomain sudo[284315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:39:36 np0005625204.localdomain sudo[284315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:36 np0005625204.localdomain sudo[284315]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:36 np0005625204.localdomain sudo[284333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:39:36 np0005625204.localdomain sudo[284333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:37 np0005625204.localdomain sudo[284333]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:37 np0005625204.localdomain sudo[284383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:37 np0005625204.localdomain sudo[284383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:37 np0005625204.localdomain sudo[284383]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:38.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:38 np0005625204.localdomain sudo[284401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:39:38 np0005625204.localdomain sudo[284401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:39:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:39:38 np0005625204.localdomain sudo[284401]: pam_unix(sudo:session): session closed for user root
Feb 20 09:39:38 np0005625204.localdomain podman[284419]: 2026-02-20 09:39:38.484543194 +0000 UTC m=+0.070613290 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1770267347, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.tags=minimal rhel9)
Feb 20 09:39:38 np0005625204.localdomain podman[284419]: 2026-02-20 09:39:38.523039783 +0000 UTC m=+0.109109859 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, version=9.7, release=1770267347, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 20 09:39:38 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:39:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:39.616 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:39:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:39:41 np0005625204.localdomain podman[284439]: 2026-02-20 09:39:41.151797859 +0000 UTC m=+0.088472462 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:39:41 np0005625204.localdomain podman[284439]: 2026-02-20 09:39:41.187434599 +0000 UTC m=+0.124109192 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:39:41 np0005625204.localdomain podman[284440]: 2026-02-20 09:39:41.19913588 +0000 UTC m=+0.134534384 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:39:41 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:39:41 np0005625204.localdomain podman[284440]: 2026-02-20 09:39:41.232151819 +0000 UTC m=+0.167550283 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:39:41 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:39:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:43.864 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:44.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:39:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 4979 writes, 22K keys, 4979 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 4979 writes, 657 syncs, 7.58 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 40 writes, 107 keys, 40 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s
                                                          Interval WAL: 40 writes, 20 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:39:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:39:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:39:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:39:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150740 "" "Go-http-client/1.1"
Feb 20 09:39:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:39:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17284 "" "Go-http-client/1.1"
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.053 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.053 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.077 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.078 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.078 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.871 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.913 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.914 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.914 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:39:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:48.915 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.379 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.394 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.394 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.395 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.396 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.396 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.397 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.397 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.398 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.398 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.399 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.414 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.415 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.416 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.416 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.417 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.672 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.870 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.949 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:39:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:49.950 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.174 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.176 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12305MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.176 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.177 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.257 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.257 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.257 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.292 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.723 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.730 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.759 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.761 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:39:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:50.762 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:39:51 np0005625204.localdomain sshd[284526]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:39:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5820 writes, 25K keys, 5820 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5820 writes, 845 syncs, 6.89 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 104 writes, 322 keys, 104 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 104 writes, 42 syncs, 2.48 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:39:51 np0005625204.localdomain sshd[284526]: Invalid user sol from 45.148.10.240 port 56820
Feb 20 09:39:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:39:51 np0005625204.localdomain podman[284528]: 2026-02-20 09:39:51.828854092 +0000 UTC m=+0.097351642 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute)
Feb 20 09:39:51 np0005625204.localdomain podman[284528]: 2026-02-20 09:39:51.868079294 +0000 UTC m=+0.136576834 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 20 09:39:51 np0005625204.localdomain sshd[284526]: Connection closed by invalid user sol 45.148.10.240 port 56820 [preauth]
Feb 20 09:39:51 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:39:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:53.876 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:54.699 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:39:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:39:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:39:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:39:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:39:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:39:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:39:58 np0005625204.localdomain podman[284547]: 2026-02-20 09:39:58.14132832 +0000 UTC m=+0.080600291 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:39:58 np0005625204.localdomain podman[284547]: 2026-02-20 09:39:58.14935239 +0000 UTC m=+0.088624351 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:39:58 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:39:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:58.881 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:39:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:39:59.701 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:01 np0005625204.localdomain sshd[284569]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:40:02 np0005625204.localdomain sshd[284569]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:40:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:03.884 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:04.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:40:05 np0005625204.localdomain podman[284571]: 2026-02-20 09:40:05.143051715 +0000 UTC m=+0.081090357 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:40:05 np0005625204.localdomain podman[284571]: 2026-02-20 09:40:05.158811016 +0000 UTC m=+0.096849638 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:40:05 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:40:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:40:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:40:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:40:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:40:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:40:06.004 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:40:08 np0005625204.localdomain sudo[284593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:08 np0005625204.localdomain sudo[284593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:40:08 np0005625204.localdomain sudo[284593]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:08 np0005625204.localdomain podman[284611]: 2026-02-20 09:40:08.878174939 +0000 UTC m=+0.091956565 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, vcs-type=git, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:40:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:08.885 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:08 np0005625204.localdomain podman[284611]: 2026-02-20 09:40:08.899049038 +0000 UTC m=+0.112830674 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 20 09:40:08 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:40:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:09.776 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:10 np0005625204.localdomain sshd[284632]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:40:11 np0005625204.localdomain sshd[284632]: Invalid user sqlserver from 154.91.170.41 port 47534
Feb 20 09:40:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:40:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:40:11 np0005625204.localdomain sshd[284632]: Received disconnect from 154.91.170.41 port 47534:11: Bye Bye [preauth]
Feb 20 09:40:11 np0005625204.localdomain sshd[284632]: Disconnected from invalid user sqlserver 154.91.170.41 port 47534 [preauth]
Feb 20 09:40:11 np0005625204.localdomain systemd[1]: tmp-crun.ekaiHt.mount: Deactivated successfully.
Feb 20 09:40:11 np0005625204.localdomain podman[284634]: 2026-02-20 09:40:11.585975338 +0000 UTC m=+0.090577051 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:40:11 np0005625204.localdomain podman[284635]: 2026-02-20 09:40:11.629034539 +0000 UTC m=+0.130109553 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:40:11 np0005625204.localdomain podman[284634]: 2026-02-20 09:40:11.656107472 +0000 UTC m=+0.160709195 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 09:40:11 np0005625204.localdomain podman[284635]: 2026-02-20 09:40:11.663148122 +0000 UTC m=+0.164223156 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true)
Feb 20 09:40:11 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:40:11 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:40:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:13.888 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:14.809 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:16 np0005625204.localdomain sshd[283312]: Received disconnect from 192.168.122.11 port 37722:11: disconnected by user
Feb 20 09:40:16 np0005625204.localdomain sshd[283312]: Disconnected from user tripleo-admin 192.168.122.11 port 37722
Feb 20 09:40:16 np0005625204.localdomain sshd[283292]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 20 09:40:16 np0005625204.localdomain systemd[1]: session-63.scope: Deactivated successfully.
Feb 20 09:40:16 np0005625204.localdomain systemd[1]: session-63.scope: Consumed 1.343s CPU time.
Feb 20 09:40:16 np0005625204.localdomain systemd-logind[759]: Session 63 logged out. Waiting for processes to exit.
Feb 20 09:40:16 np0005625204.localdomain systemd-logind[759]: Removed session 63.
Feb 20 09:40:17 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 20 09:40:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:40:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:40:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:40:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150740 "" "Go-http-client/1.1"
Feb 20 09:40:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:40:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17282 "" "Go-http-client/1.1"
Feb 20 09:40:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:18.891 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:19.845 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:40:22 np0005625204.localdomain podman[284677]: 2026-02-20 09:40:22.151501878 +0000 UTC m=+0.090403047 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:40:22 np0005625204.localdomain podman[284677]: 2026-02-20 09:40:22.165980819 +0000 UTC m=+0.104882018 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:40:22 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:40:22 np0005625204.localdomain sudo[284696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:22 np0005625204.localdomain sudo[284696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:22 np0005625204.localdomain sudo[284696]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:22 np0005625204.localdomain sudo[284714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:40:22 np0005625204.localdomain sudo[284714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:22 np0005625204.localdomain sudo[284714]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:22 np0005625204.localdomain sudo[284732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 -- inventory --format=json-pretty --filter-for-batch
Feb 20 09:40:22 np0005625204.localdomain sudo[284732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 2026-02-20 09:40:23.546925215 +0000 UTC m=+0.076331469 container create deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, release=1770267347, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 20 09:40:23 np0005625204.localdomain systemd[1]: Started libpod-conmon-deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082.scope.
Feb 20 09:40:23 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 2026-02-20 09:40:23.517847959 +0000 UTC m=+0.047254233 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 2026-02-20 09:40:23.627859396 +0000 UTC m=+0.157265630 container init deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7)
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 2026-02-20 09:40:23.647873589 +0000 UTC m=+0.177279823 container start deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, name=rhceph, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 2026-02-20 09:40:23.648630272 +0000 UTC m=+0.178036546 container attach deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_CLEAN=True, RELEASE=main, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, distribution-scope=public)
Feb 20 09:40:23 np0005625204.localdomain hungry_kare[284805]: 167 167
Feb 20 09:40:23 np0005625204.localdomain systemd[1]: libpod-deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082.scope: Deactivated successfully.
Feb 20 09:40:23 np0005625204.localdomain podman[284790]: 2026-02-20 09:40:23.652689048 +0000 UTC m=+0.182095302 container died deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, distribution-scope=public, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:40:23 np0005625204.localdomain podman[284810]: 2026-02-20 09:40:23.75644754 +0000 UTC m=+0.089647492 container remove deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_kare, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, name=rhceph, ceph=True, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:40:23 np0005625204.localdomain systemd[1]: libpod-conmon-deb173a915cfb49bb38af74c395ae3ce3fea1c63664aa43aa729512210d5d082.scope: Deactivated successfully.
Feb 20 09:40:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:23.894 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:23 np0005625204.localdomain podman[284832]: 
Feb 20 09:40:23 np0005625204.localdomain podman[284832]: 2026-02-20 09:40:23.98954744 +0000 UTC m=+0.077951829 container create 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, vendor=Red Hat, Inc., io.buildah.version=1.42.2, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True)
Feb 20 09:40:24 np0005625204.localdomain systemd[1]: Started libpod-conmon-48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32.scope.
Feb 20 09:40:24 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:40:24 np0005625204.localdomain podman[284832]: 2026-02-20 09:40:23.959620607 +0000 UTC m=+0.048025016 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:40:24 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 20 09:40:24 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 09:40:24 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:40:24 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 09:40:24 np0005625204.localdomain podman[284832]: 2026-02-20 09:40:24.070411548 +0000 UTC m=+0.158815907 container init 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, RELEASE=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True)
Feb 20 09:40:24 np0005625204.localdomain podman[284832]: 2026-02-20 09:40:24.080839432 +0000 UTC m=+0.169243801 container start 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1770267347, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Feb 20 09:40:24 np0005625204.localdomain podman[284832]: 2026-02-20 09:40:24.0810928 +0000 UTC m=+0.169497199 container attach 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, ceph=True, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64)
Feb 20 09:40:24 np0005625204.localdomain systemd[1]: tmp-crun.6hsQM0.mount: Deactivated successfully.
Feb 20 09:40:24 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-6f0f50769b2da77e064c5711c04601da71302746682bc5484215241e6fe690ea-merged.mount: Deactivated successfully.
Feb 20 09:40:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:24.879 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]: [
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:     {
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "available": false,
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "ceph_device": false,
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "lsm_data": {},
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "lvs": [],
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "path": "/dev/sr0",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "rejected_reasons": [
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "Has a FileSystem",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "Insufficient space (<5GB)"
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         ],
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         "sys_api": {
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "actuators": null,
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "device_nodes": "sr0",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "human_readable_size": "482.00 KB",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "id_bus": "ata",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "model": "QEMU DVD-ROM",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "nr_requests": "2",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "partitions": {},
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "path": "/dev/sr0",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "removable": "1",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "rev": "2.5+",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "ro": "0",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "rotational": "1",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "sas_address": "",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "sas_device_handle": "",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "scheduler_mode": "mq-deadline",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "sectors": 0,
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "sectorsize": "2048",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "size": 493568.0,
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "support_discard": "0",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "type": "disk",
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:             "vendor": "QEMU"
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:         }
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]:     }
Feb 20 09:40:25 np0005625204.localdomain priceless_swirles[284847]: ]
Feb 20 09:40:25 np0005625204.localdomain systemd[1]: libpod-48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32.scope: Deactivated successfully.
Feb 20 09:40:25 np0005625204.localdomain podman[286553]: 2026-02-20 09:40:25.121267964 +0000 UTC m=+0.046017744 container died 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.42.2, version=7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:40:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0177be9e094a73715e5cb2ecca08f8e921c8b9374a34207279c51c3c2fac2232-merged.mount: Deactivated successfully.
Feb 20 09:40:25 np0005625204.localdomain podman[286553]: 2026-02-20 09:40:25.169835847 +0000 UTC m=+0.094585577 container remove 48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=priceless_swirles, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.42.2, vcs-type=git, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z)
Feb 20 09:40:25 np0005625204.localdomain systemd[1]: libpod-conmon-48592510d9739930b1e53f99424e060d872e9196792cef3f6b203b64de8ebf32.scope: Deactivated successfully.
Feb 20 09:40:25 np0005625204.localdomain sudo[284732]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:25 np0005625204.localdomain sudo[286566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:25 np0005625204.localdomain sudo[286566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:25 np0005625204.localdomain sudo[286566]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:26 np0005625204.localdomain sudo[286584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:26 np0005625204.localdomain sudo[286584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:26 np0005625204.localdomain sudo[286584]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Activating special unit Exit the Session...
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped target Main User Target.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped target Basic System.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped target Paths.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped target Sockets.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped target Timers.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Closed D-Bus User Message Bus Socket.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Stopped Create User's Volatile Files and Directories.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Removed slice User Application Slice.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Reached target Shutdown.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Finished Exit the Session.
Feb 20 09:40:26 np0005625204.localdomain systemd[283296]: Reached target Exit the Session.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 20 09:40:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:40:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:40:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:40:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:40:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:40:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 20 09:40:26 np0005625204.localdomain systemd[1]: user-1003.slice: Consumed 1.778s CPU time.
Feb 20 09:40:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:28.897 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:40:29 np0005625204.localdomain podman[286605]: 2026-02-20 09:40:29.152350324 +0000 UTC m=+0.087652552 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:40:29 np0005625204.localdomain podman[286605]: 2026-02-20 09:40:29.163103999 +0000 UTC m=+0.098406277 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:40:29 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:40:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:29.920 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:33.900 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:34.969 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:40:36 np0005625204.localdomain podman[286629]: 2026-02-20 09:40:36.162189201 +0000 UTC m=+0.100052477 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:40:36 np0005625204.localdomain podman[286629]: 2026-02-20 09:40:36.172998147 +0000 UTC m=+0.110861413 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:40:36 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:40:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:38.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:40:39 np0005625204.localdomain podman[286652]: 2026-02-20 09:40:39.149134413 +0000 UTC m=+0.085934588 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter)
Feb 20 09:40:39 np0005625204.localdomain podman[286652]: 2026-02-20 09:40:39.191965537 +0000 UTC m=+0.128765692 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:40:39 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:40:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:40.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:40 np0005625204.localdomain sshd[286672]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:40:40 np0005625204.localdomain sshd[286672]: Received disconnect from 18.221.252.160 port 59560:11: Bye Bye [preauth]
Feb 20 09:40:40 np0005625204.localdomain sshd[286672]: Disconnected from authenticating user root 18.221.252.160 port 59560 [preauth]
Feb 20 09:40:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:40:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:40:42 np0005625204.localdomain podman[286674]: 2026-02-20 09:40:42.153821067 +0000 UTC m=+0.087272639 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:40:42 np0005625204.localdomain podman[286675]: 2026-02-20 09:40:42.219203143 +0000 UTC m=+0.148833126 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:40:42 np0005625204.localdomain podman[286675]: 2026-02-20 09:40:42.230071371 +0000 UTC m=+0.159701394 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true)
Feb 20 09:40:42 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:40:42 np0005625204.localdomain podman[286674]: 2026-02-20 09:40:42.286321333 +0000 UTC m=+0.219772915 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 20 09:40:42 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:40:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:42.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:42.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 09:40:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:42.738 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 09:40:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:42.738 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:42.738 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 09:40:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:42.753 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:43.766 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:44.641 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:44.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:44.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.017 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.742 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:40:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:45.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.234 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.311 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.312 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.499 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.500 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=12312MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.500 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.501 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.599 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.600 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.600 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.639 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.712 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.713 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.727 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.750 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:40:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:46.794 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:40:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:47.281 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:40:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:47.288 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:40:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:47.306 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:40:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:47.309 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:40:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:47.309 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:40:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:40:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:40:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:40:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 150740 "" "Go-http-client/1.1"
Feb 20 09:40:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:40:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17293 "" "Go-http-client/1.1"
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.307 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.308 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.308 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.309 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.924 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.925 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.925 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:40:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:49.925 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:40:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:50.019 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:50.303 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:40:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:50.319 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:40:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:50.319 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:40:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:50.320 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:40:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:50.320 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:40:52 np0005625204.localdomain sudo[286762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:40:52 np0005625204.localdomain sudo[286762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:40:52 np0005625204.localdomain sudo[286762]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:52 np0005625204.localdomain sudo[286786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:40:52 np0005625204.localdomain sudo[286786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:52 np0005625204.localdomain podman[286780]: 2026-02-20 09:40:52.646874979 +0000 UTC m=+0.086100132 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:40:52 np0005625204.localdomain podman[286780]: 2026-02-20 09:40:52.657910173 +0000 UTC m=+0.097135346 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:40:52 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:40:53 np0005625204.localdomain sudo[286786]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:53 np0005625204.localdomain sudo[286848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:53 np0005625204.localdomain sudo[286848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:53 np0005625204.localdomain sudo[286848]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:54 np0005625204.localdomain sudo[286866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:54 np0005625204.localdomain sudo[286866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:54 np0005625204.localdomain sudo[286866]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:54.681 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:55.021 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:40:55 np0005625204.localdomain sudo[286884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:40:55 np0005625204.localdomain sudo[286884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:40:55 np0005625204.localdomain sudo[286884]: pam_unix(sudo:session): session closed for user root
Feb 20 09:40:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:40:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:40:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:40:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:40:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:40:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:40:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:40:59.719 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:00.023 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:41:00 np0005625204.localdomain systemd[1]: tmp-crun.GP6eGb.mount: Deactivated successfully.
Feb 20 09:41:00 np0005625204.localdomain podman[286902]: 2026-02-20 09:41:00.153391965 +0000 UTC m=+0.096038802 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:41:00 np0005625204.localdomain podman[286902]: 2026-02-20 09:41:00.163177959 +0000 UTC m=+0.105824876 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:41:00 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:41:00 np0005625204.localdomain sudo[286925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:00 np0005625204.localdomain sudo[286925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:00 np0005625204.localdomain sudo[286925]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:01 np0005625204.localdomain sudo[286943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:01 np0005625204.localdomain sudo[286943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 2026-02-20 09:41:01.636926747 +0000 UTC m=+0.081160568 container create 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main)
Feb 20 09:41:01 np0005625204.localdomain systemd[1]: Started libpod-conmon-51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f.scope.
Feb 20 09:41:01 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 2026-02-20 09:41:01.602067131 +0000 UTC m=+0.046301022 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 2026-02-20 09:41:01.715919557 +0000 UTC m=+0.160153368 container init 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, ceph=True, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 2026-02-20 09:41:01.727760966 +0000 UTC m=+0.171994777 container start 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 2026-02-20 09:41:01.728126107 +0000 UTC m=+0.172359968 container attach 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, build-date=2026-02-09T10:25:24Z, release=1770267347, vcs-type=git, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.buildah.version=1.42.2)
Feb 20 09:41:01 np0005625204.localdomain systemd[1]: libpod-51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f.scope: Deactivated successfully.
Feb 20 09:41:01 np0005625204.localdomain goofy_mendeleev[287017]: 167 167
Feb 20 09:41:01 np0005625204.localdomain podman[287002]: 2026-02-20 09:41:01.735699893 +0000 UTC m=+0.179933734 container died 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7)
Feb 20 09:41:01 np0005625204.localdomain podman[287022]: 2026-02-20 09:41:01.843033375 +0000 UTC m=+0.093517003 container remove 51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=goofy_mendeleev, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Feb 20 09:41:01 np0005625204.localdomain systemd[1]: libpod-conmon-51b2dadce0437f53db87ecc66f89705734d52c18e98ca3e3669f5dcb12204a6f.scope: Deactivated successfully.
Feb 20 09:41:01 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:41:02 np0005625204.localdomain systemd-rc-local-generator[287058]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:41:02 np0005625204.localdomain systemd-sysv-generator[287062]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-173e2ff66011b02bc37c533e2757c4ccf663dd3e2e6bdce4052d9039c460a794-merged.mount: Deactivated successfully.
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:41:02 np0005625204.localdomain systemd-rc-local-generator[287104]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:41:02 np0005625204.localdomain systemd-sysv-generator[287108]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:02 np0005625204.localdomain systemd[1]: Starting Ceph mgr.np0005625204.exgrzx for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 09:41:02 np0005625204.localdomain podman[287168]: 
Feb 20 09:41:03 np0005625204.localdomain podman[287168]: 2026-02-20 09:41:03.010334069 +0000 UTC m=+0.084206484 container create 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_BRANCH=main)
Feb 20 09:41:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:03 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b9e28716f67341559a2c7754a3d43e5be9a65141ccec5caabb6d3668d2ba6c5a/merged/var/lib/ceph/mgr/ceph-np0005625204.exgrzx supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:03 np0005625204.localdomain podman[287168]: 2026-02-20 09:41:02.975180233 +0000 UTC m=+0.049052688 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:41:03 np0005625204.localdomain podman[287168]: 2026-02-20 09:41:03.081932738 +0000 UTC m=+0.155805153 container init 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx, GIT_BRANCH=main, ceph=True, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 20 09:41:03 np0005625204.localdomain podman[287168]: 2026-02-20 09:41:03.089502724 +0000 UTC m=+0.163375139 container start 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, RELEASE=main, architecture=x86_64)
Feb 20 09:41:03 np0005625204.localdomain bash[287168]: 04a522f5043e1f0321542737f02274209f1473a7e6f57561c1fa327510156513
Feb 20 09:41:03 np0005625204.localdomain systemd[1]: Started Ceph mgr.np0005625204.exgrzx for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2
Feb 20 09:41:03 np0005625204.localdomain sudo[286943]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: pidfile_write: ignore empty --pid-file
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'alerts'
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'balancer'
Feb 20 09:41:03 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:03.266+0000 7f0946085140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'cephadm'
Feb 20 09:41:03 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:03.337+0000 7f0946085140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'crash'
Feb 20 09:41:03 np0005625204.localdomain sudo[287216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:03 np0005625204.localdomain sudo[287216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:03 np0005625204.localdomain sudo[287216]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 20 09:41:03 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'dashboard'
Feb 20 09:41:03 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:03.980+0000 7f0946085140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 20 09:41:03 np0005625204.localdomain sudo[287234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:04 np0005625204.localdomain sudo[287234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:04 np0005625204.localdomain sudo[287234]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:04 np0005625204.localdomain sudo[287252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:41:04 np0005625204.localdomain sudo[287252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'devicehealth'
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'diskprediction_local'
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.545+0000 7f0946085140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]:   from numpy import show_config as show_numpy_config
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.690+0000 7f0946085140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'influx'
Feb 20 09:41:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:04.770 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'insights'
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.797+0000 7f0946085140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'iostat'
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'k8sevents'
Feb 20 09:41:04 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:04.915+0000 7f0946085140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 20 09:41:04 np0005625204.localdomain systemd[1]: tmp-crun.3i0FaH.mount: Deactivated successfully.
Feb 20 09:41:04 np0005625204.localdomain podman[287343]: 2026-02-20 09:41:04.967949314 +0000 UTC m=+0.108996145 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=)
Feb 20 09:41:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:05.025 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:05 np0005625204.localdomain podman[287343]: 2026-02-20 09:41:05.115176819 +0000 UTC m=+0.256223680 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, architecture=x86_64, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public)
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'localpool'
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'mds_autoscaler'
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'mirroring'
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'nfs'
Feb 20 09:41:05 np0005625204.localdomain sudo[287252]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'orchestrator'
Feb 20 09:41:05 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.634+0000 7f0946085140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'osd_perf_query'
Feb 20 09:41:05 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.774+0000 7f0946085140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'osd_support'
Feb 20 09:41:05 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.837+0000 7f0946085140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.890+0000 7f0946085140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'pg_autoscaler'
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 20 09:41:05 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'progress'
Feb 20 09:41:05 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:05.956+0000 7f0946085140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:41:06.003 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:41:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:41:06.004 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:41:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:41:06.005 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'prometheus'
Feb 20 09:41:06 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.013+0000 7f0946085140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain sudo[287444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:06 np0005625204.localdomain sudo[287444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:06 np0005625204.localdomain sudo[287444]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.303+0000 7f0946085140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'rbd_support'
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'restful'
Feb 20 09:41:06 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.381+0000 7f0946085140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'rgw'
Feb 20 09:41:06 np0005625204.localdomain sudo[287462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:06 np0005625204.localdomain sudo[287462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:41:06 np0005625204.localdomain sudo[287462]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'rook'
Feb 20 09:41:06 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:06.698+0000 7f0946085140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 20 09:41:06 np0005625204.localdomain podman[287480]: 2026-02-20 09:41:06.722219277 +0000 UTC m=+0.094720521 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:41:06 np0005625204.localdomain podman[287480]: 2026-02-20 09:41:06.738140092 +0000 UTC m=+0.110641316 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:41:06 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'selftest'
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.095+0000 7f0946085140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'snap_schedule'
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.155+0000 7f0946085140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'stats'
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'status'
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'telegraf'
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.340+0000 7f0946085140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'telemetry'
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.397+0000 7f0946085140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain sudo[287503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:07 np0005625204.localdomain sudo[287503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:07 np0005625204.localdomain sudo[287503]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'test_orchestrator'
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.527+0000 7f0946085140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'volumes'
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.668+0000 7f0946085140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain sshd[287521]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.850+0000 7f0946085140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'zabbix'
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:41:07.908+0000 7f0946085140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd1e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 20 09:41:07 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1308191220
Feb 20 09:41:07 np0005625204.localdomain sshd[287521]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:41:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:09.814 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:10.026 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:41:10 np0005625204.localdomain podman[287523]: 2026-02-20 09:41:10.136440005 +0000 UTC m=+0.074867482 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, vcs-type=git, config_id=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:41:10 np0005625204.localdomain podman[287523]: 2026-02-20 09:41:10.153085083 +0000 UTC m=+0.091512570 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 20 09:41:10 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:41:10 np0005625204.localdomain sshd[287544]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:41:10 np0005625204.localdomain sshd[287546]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:41:11 np0005625204.localdomain sshd[287544]: Invalid user supertest from 86.99.116.54 port 51686
Feb 20 09:41:11 np0005625204.localdomain sshd[287544]: Received disconnect from 86.99.116.54 port 51686:11: Bye Bye [preauth]
Feb 20 09:41:11 np0005625204.localdomain sshd[287544]: Disconnected from invalid user supertest 86.99.116.54 port 51686 [preauth]
Feb 20 09:41:11 np0005625204.localdomain sudo[287548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:11 np0005625204.localdomain sudo[287548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:11 np0005625204.localdomain sudo[287548]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:11 np0005625204.localdomain sshd[287546]: Received disconnect from 196.189.116.182 port 52714:11: Bye Bye [preauth]
Feb 20 09:41:11 np0005625204.localdomain sshd[287546]: Disconnected from authenticating user root 196.189.116.182 port 52714 [preauth]
Feb 20 09:41:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:41:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:41:13 np0005625204.localdomain systemd[1]: tmp-crun.8vYZug.mount: Deactivated successfully.
Feb 20 09:41:13 np0005625204.localdomain podman[287566]: 2026-02-20 09:41:13.140118959 +0000 UTC m=+0.079970512 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:41:13 np0005625204.localdomain podman[287566]: 2026-02-20 09:41:13.180091304 +0000 UTC m=+0.119942887 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:41:13 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:41:13 np0005625204.localdomain podman[287567]: 2026-02-20 09:41:13.185494902 +0000 UTC m=+0.123175637 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:41:13 np0005625204.localdomain podman[287567]: 2026-02-20 09:41:13.268031992 +0000 UTC m=+0.205712757 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 20 09:41:13 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:41:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:14.856 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:14 np0005625204.localdomain sudo[287607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:14 np0005625204.localdomain sudo[287607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:14 np0005625204.localdomain sudo[287607]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:14 np0005625204.localdomain sudo[287625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:41:14 np0005625204.localdomain sudo[287625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:14 np0005625204.localdomain sudo[287625]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:15.028 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:15 np0005625204.localdomain sudo[287643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:41:15 np0005625204.localdomain sudo[287643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287643]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:15 np0005625204.localdomain sudo[287661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287661]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:15 np0005625204.localdomain sudo[287679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287679]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:15 np0005625204.localdomain sudo[287697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287697]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:15 np0005625204.localdomain sudo[287731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287731]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:15 np0005625204.localdomain sudo[287749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287749]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:41:15 np0005625204.localdomain sudo[287767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287767]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:15 np0005625204.localdomain sudo[287785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287785]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:15 np0005625204.localdomain sudo[287803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287803]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:15 np0005625204.localdomain sudo[287821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287821]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:15 np0005625204.localdomain sudo[287839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287839]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:15 np0005625204.localdomain sudo[287857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:15 np0005625204.localdomain sudo[287857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:15 np0005625204.localdomain sudo[287857]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:16 np0005625204.localdomain sudo[287891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287891]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:16 np0005625204.localdomain sudo[287909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287909]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:16 np0005625204.localdomain sudo[287927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287927]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:41:16 np0005625204.localdomain sudo[287945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287945]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287963]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:41:16 np0005625204.localdomain sudo[287963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287963]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:16 np0005625204.localdomain sudo[287981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287981]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[287999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:16 np0005625204.localdomain sudo[287999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[287999]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[288017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:16 np0005625204.localdomain sudo[288017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[288017]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[288051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:16 np0005625204.localdomain sudo[288051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[288051]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[288069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:16 np0005625204.localdomain sudo[288069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[288069]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[288087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:16 np0005625204.localdomain sudo[288087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[288087]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:16 np0005625204.localdomain sudo[288105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:16 np0005625204.localdomain sudo[288105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:16 np0005625204.localdomain sudo[288105]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:17 np0005625204.localdomain sudo[288123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288123]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:17 np0005625204.localdomain sudo[288141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288141]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:17 np0005625204.localdomain sudo[288159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288159]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:17 np0005625204.localdomain sudo[288177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288177]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:17 np0005625204.localdomain sudo[288211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288211]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:17 np0005625204.localdomain sudo[288229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288229]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:17 np0005625204.localdomain sudo[288247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288247]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain sudo[288265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:17 np0005625204.localdomain sudo[288265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain sudo[288265]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:41:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:41:17 np0005625204.localdomain sudo[288283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:17 np0005625204.localdomain sudo[288283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:41:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 152934 "" "Go-http-client/1.1"
Feb 20 09:41:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:41:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17776 "" "Go-http-client/1.1"
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.204 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.209 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd0ca0f87-9326-4bf6-971c-c2d8292d7d47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.205498', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e73c3fc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'e51641bcb468ffb933d16eb86f1f024166de44bf7e4fb7ab83100b10af79912a'}]}, 'timestamp': '2026-02-20 09:41:18.210353', '_unique_id': '9d4f5bba6724492789c735c4e029a740'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.211 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.240 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.241 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25c9448b-18ff-4876-85bb-7179ba1e7868', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.212322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e788892-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '1bf19394436b84ba7577e6130026489f516f899e653a32c5a7d3e4866a2ad725'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.212322', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7896a2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '176cadc1cafd4615f50c9531ed17736e0f78fa2971f1c132595e7dd763347538'}]}, 'timestamp': '2026-02-20 09:41:18.241911', '_unique_id': 'f22d99f77a69441b8deded7550b5ddfd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.242 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '929d4bb7-9a18-4417-94d3-81619aa42dce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.243958', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e78f296-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'd13b47b024b7e5639fe686247d8a59f053c1699dd8382a7416e9b7ea02cc8b00'}]}, 'timestamp': '2026-02-20 09:41:18.244293', '_unique_id': '3fc304af228c476395ebe3c8b486a32e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.244 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.245 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.245 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.245 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '66881217-4ea4-49d2-8314-6ce4dc24e7fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.245626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e793422-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '44299dd4e7107be0e9e77dbac4bcd1eafe7a4d7383daa029d58bb21e4b81834d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.245626', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e793e18-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'f6d13830dd37cd926140047736578a176eb8d8cd5948ffdbd84832868cb0ad1c'}]}, 'timestamp': '2026-02-20 09:41:18.246176', '_unique_id': 'd0fe15a07bba4384bfbffe31c4c67e39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.246 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.247 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.257 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.258 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e674b1f-1f6a-42c8-999f-4f85d0551764', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.247506', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7b18d2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': 'ac4ebb4bf1667f2151e86d3a6f350ca45a49d8c3781b0464f6d678dd8486be6d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.247506', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7b249e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': 'df0b99bec8216efd096691d19a77751a20d642ad8be4d63e154eb5a5c6285b54'}]}, 'timestamp': '2026-02-20 09:41:18.258663', '_unique_id': '50fc6cf364d042d991999e4af2a4109c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.259 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.260 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.260 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74680d83-7c93-44f6-9cce-0d89e528f807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.260425', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7b75e8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '4bc7e1e7acddbd8752ae5c159b3890b0464db04da385d902ea9372a841b5a11f'}]}, 'timestamp': '2026-02-20 09:41:18.260763', '_unique_id': 'bdb851dd57b74f82b734f23ed063582a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.261 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.262 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.262 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f41f0410-3fee-447e-99f9-9487fc00617f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.262115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7bb756-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': '626b2affd87870526ed91f86a1fda02f80079de80b81d812586624fc8d7ff5a5'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.262115', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7bc17e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': '0cf99cafa2d6304250d9eea04d9c8f1a7388db28c5a0680aec00b573b5409512'}]}, 'timestamp': '2026-02-20 09:41:18.262660', '_unique_id': 'ae578f14de1447fd8eba820188f15dd6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f664eeb-02ea-4cbf-a06c-e6234bf9b0fa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.263985', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7c0080-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'b69b978cd8c61ca68c8b5c88c211756b6136c9099a710ca0ed541ec421487811'}]}, 'timestamp': '2026-02-20 09:41:18.264297', '_unique_id': '7475cbb0d20c47a28a8dddae99c776c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.264 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.265 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.265 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'da48e571-2403-4652-94cf-e8dd90e4c4e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.265781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7c468a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '6dcb1cedea515aac49947d7e27d5cd6877e01d7f306d4ed37e3b52bfd13ed15f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.265781', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7c50b2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '4f67a15de5ac3568e20a8116b748ed852b14ea77f2fbcee29d53150b0a53b675'}]}, 'timestamp': '2026-02-20 09:41:18.266314', '_unique_id': '5490b2aeb11942dfb5ebedb49e84ea61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c2de212-c6ce-4b60-8a1b-69d24ba5acf3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.267612', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7c8ece-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'bd5759504a0607975fb244bb317e884cfe5a3369fedb26ca4448d3232d41d018'}]}, 'timestamp': '2026-02-20 09:41:18.267920', '_unique_id': 'be5ec14ca0d14c7f919013e0d92d8279'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.269 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.285 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '783080da-111d-43a8-80bb-2a6303b3fe55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:41:18.269197', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '4e7f6090-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.52498042, 'message_signature': 'c2eddadf6e12bbcfca79496484eda8ded3a333271fd18282b4a57cb2d94bd152'}]}, 'timestamp': '2026-02-20 09:41:18.286450', '_unique_id': '11b4b4ada7344d91a3efa9b4e446a7cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.287 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.288 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.288 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0dd6fd65-4fd8-4c15-a67f-d7b91c9f40bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.288251', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e7fb4aa-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '9edbb0387b876dde9cd4a89792f19c1474abe29decfff6c18b0dda2383039128'}]}, 'timestamp': '2026-02-20 09:41:18.288559', '_unique_id': 'c424baaba3c8479f9da27e762cce030b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.289 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b84c887-0d44-47c9-abc0-68b783df62a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.289904', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e7ff4ba-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'ce727c06d3fd82a81ff932965a6cac3834122da2d5953a1a7133c533dec05462'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.289904', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e7ffee2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'b0cb146a8f4e9a9961048ab80b3f2364f5ce52eb35fdead2bc39036736454f45'}]}, 'timestamp': '2026-02-20 09:41:18.290437', '_unique_id': '67cb37d2f71545e994b6323ce14b1b05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.291 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 13400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b8f5fadb-8bb8-415c-8aaf-b62d11d25e22', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13400000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:41:18.291773', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4e803dee-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.52498042, 'message_signature': '7fe951bdca0fab0ab91435da8f83d72319d1a33652e55cd56cff46065c20cd05'}]}, 'timestamp': '2026-02-20 09:41:18.292058', '_unique_id': '45080121cc8c4a54a32355f25bb549ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.292 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '72bf3aca-6cbc-49ea-a91a-ecb5f980cfa8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.293360', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e807be2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '24995bcbef417870006e5a70f9f8ebe61a60c44887303e21a353b106e4c052d9'}]}, 'timestamp': '2026-02-20 09:41:18.293669', '_unique_id': '88a3c0bdf6b94cbfaffbf217e2328175'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.294 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.295 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7fc51fd9-40ea-4d6a-b518-571b42a6b8a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.294945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e80b99a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '37a6c85e9636b72f2add9d0dd792879dd8974959b432186e1e90dace43d1e80c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.294945', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e80c368-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'faebb5f3ca4fa834ab27da47bbb6073f9600537d30ea535f2049167a8d6fb919'}]}, 'timestamp': '2026-02-20 09:41:18.295467', '_unique_id': 'c8608659408c4945bb93c993deb2587a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2eb4d616-f2b3-4def-ac76-4d6546fe9c91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.296895', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e81059e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': '5af7f7d962e28109ea9bf220919fe03a2af5a6ad944f63b2637b59b5d19b82db'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.296895', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e810fb2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.486699227, 'message_signature': 'e766ace757b2c49e776c1efc857cf8a9c2ff293229c3aa70df5d4deb30541f4b'}]}, 'timestamp': '2026-02-20 09:41:18.297418', '_unique_id': '191ab5a2994a43ebbeeacae20a26b637'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.298 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.298 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0883aa5e-dac5-4805-aae2-95c3ff7913b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.298821', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e815116-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'b183a7cb436eecdb85b8bcfa91dbeed4bb05b2630e17b8ebc7721c8efc93dbe6'}]}, 'timestamp': '2026-02-20 09:41:18.299109', '_unique_id': '933ba66d50f24b07a3f06d36f1204cbb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.299 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.300 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.300 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.300 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e62c15e5-d029-45a8-ada2-0df81b44b8aa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:41:18.300408', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4e818f0a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': 'f45dc91944a8c438f128e27dfea5853726e68cfae97fcd380f2f0babb2ad51fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:41:18.300408', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4e8199b4-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.451499551, 'message_signature': '300c944a4778e461e0402225cb3e7f1023a0ea140c56db331a1edbc238e3df24'}]}, 'timestamp': '2026-02-20 09:41:18.300953', '_unique_id': 'ee0f3875cc994c8396708b575c3d33a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.301 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.302 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0d31d06-781b-413b-8cd3-833e88eb1f50', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.302319', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e81d9a6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': 'd82c754db8e56c98048393f8071ed2fc4c15dd0f27535ff708b01755eb930a22'}]}, 'timestamp': '2026-02-20 09:41:18.302603', '_unique_id': 'd4867395cb304a5d808714a9c7215d91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.303 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83f7e59e-c374-4009-8091-2877410b276d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:41:18.303962', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4e8219a2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11037.444673999, 'message_signature': '25accc19377fb033ee8ce98debeaf265b3e3924c5d7cfcd35d3573611f5eedf8'}]}, 'timestamp': '2026-02-20 09:41:18.304260', '_unique_id': 'efda48d1d86e4425b235c1baa8679a23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:41:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:41:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 2026-02-20 09:41:18.365774169 +0000 UTC m=+0.076551324 container create 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.42.2, release=1770267347, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: Started libpod-conmon-3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c.scope.
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 2026-02-20 09:41:18.433480788 +0000 UTC m=+0.144257943 container init 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.42.2, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 2026-02-20 09:41:18.335512107 +0000 UTC m=+0.046289282 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 2026-02-20 09:41:18.441926591 +0000 UTC m=+0.152703736 container start 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 2026-02-20 09:41:18.442359234 +0000 UTC m=+0.153136429 container attach 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, GIT_BRANCH=main, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:41:18 np0005625204.localdomain magical_blackburn[288363]: 167 167
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: libpod-3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c.scope: Deactivated successfully.
Feb 20 09:41:18 np0005625204.localdomain podman[288347]: 2026-02-20 09:41:18.445200093 +0000 UTC m=+0.155977278 container died 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, distribution-scope=public)
Feb 20 09:41:18 np0005625204.localdomain podman[288368]: 2026-02-20 09:41:18.529610622 +0000 UTC m=+0.074435199 container remove 3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_blackburn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1770267347, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, version=7)
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: libpod-conmon-3e43b413994d6e4d246a444e30e820ddb63a0eef227b62444f6d6f2dda7aa81c.scope: Deactivated successfully.
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 2026-02-20 09:41:18.617799698 +0000 UTC m=+0.056044026 container create edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, release=1770267347, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, architecture=x86_64, ceph=True, GIT_CLEAN=True, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: Started libpod-conmon-edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc.scope.
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:41:18 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:18 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:18 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:18 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/87a11e613b96ca656ef7659440bbb51a36ed51c004440dce3d18d42b3069f8e4/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 2026-02-20 09:41:18.674559836 +0000 UTC m=+0.112804204 container init edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, ceph=True, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 2026-02-20 09:41:18.682358189 +0000 UTC m=+0.120602537 container start edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, architecture=x86_64, ceph=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, build-date=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 2026-02-20 09:41:18.682595376 +0000 UTC m=+0.120839754 container attach edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, vcs-type=git, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.buildah.version=1.42.2, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.tags=rhceph ceph)
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 2026-02-20 09:41:18.590087555 +0000 UTC m=+0.028331953 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: libpod-edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc.scope: Deactivated successfully.
Feb 20 09:41:18 np0005625204.localdomain podman[288386]: 2026-02-20 09:41:18.774098466 +0000 UTC m=+0.212342854 container died edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.buildah.version=1.42.2, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:41:18 np0005625204.localdomain podman[288427]: 2026-02-20 09:41:18.846675616 +0000 UTC m=+0.062309761 container remove edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_boyd, RELEASE=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc.)
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: libpod-conmon-edd11889b7f67a82e890a02b38985f434dfc204b5a01b0d7fd343d06377043dc.scope: Deactivated successfully.
Feb 20 09:41:18 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:41:18 np0005625204.localdomain systemd-rc-local-generator[288465]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:41:18 np0005625204.localdomain systemd-sysv-generator[288470]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f5184decee9029c694779f63b9e03a50095e53d0d479941ae3f2c6f3e74347d7-merged.mount: Deactivated successfully.
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:41:19 np0005625204.localdomain systemd-sysv-generator[288508]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:41:19 np0005625204.localdomain systemd-rc-local-generator[288504]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:41:19 np0005625204.localdomain systemd[1]: Starting Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 09:41:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:19.859 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:19 np0005625204.localdomain podman[288568]: 
Feb 20 09:41:20 np0005625204.localdomain podman[288568]: 2026-02-20 09:41:20.001439419 +0000 UTC m=+0.073899292 container create 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 20 09:41:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:20.029 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:20 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:20 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:20 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:20 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff)
Feb 20 09:41:20 np0005625204.localdomain podman[288568]: 2026-02-20 09:41:20.055493932 +0000 UTC m=+0.127953805 container init 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, ceph=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347)
Feb 20 09:41:20 np0005625204.localdomain podman[288568]: 2026-02-20 09:41:20.065734491 +0000 UTC m=+0.138194364 container start 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1770267347, GIT_BRANCH=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, vcs-type=git, version=7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=)
Feb 20 09:41:20 np0005625204.localdomain bash[288568]: 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04
Feb 20 09:41:20 np0005625204.localdomain podman[288568]: 2026-02-20 09:41:19.971218017 +0000 UTC m=+0.043677940 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:41:20 np0005625204.localdomain systemd[1]: Started Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pidfile_write: ignore empty --pid-file
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: load: jerasure load: lrc 
Feb 20 09:41:20 np0005625204.localdomain sudo[288283]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: RocksDB version: 7.9.2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Git sha 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: DB SUMMARY
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: DB Session ID:  RDMWWACFW9Z8Q9K53AN8
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: CURRENT file:  CURRENT
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: IDENTITY file:  IDENTITY
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005625204/store.db dir, Total Num: 0, files: 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005625204/store.db: 000004.log size: 761 ; 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                         Options.error_if_exists: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.create_if_missing: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                         Options.paranoid_checks: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                                     Options.env: 0x562d04acfa20
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                                Options.info_log: 0x562d057fcd20
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.max_file_opening_threads: 16
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                              Options.statistics: (nil)
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                               Options.use_fsync: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.max_log_file_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                         Options.allow_fallocate: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                        Options.use_direct_reads: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.create_missing_column_families: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                              Options.db_log_dir: 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                                 Options.wal_dir: 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.advise_random_on_open: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                    Options.write_buffer_manager: 0x562d0580d540
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                            Options.rate_limiter: (nil)
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.unordered_write: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                               Options.row_cache: None
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                              Options.wal_filter: None
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.allow_ingest_behind: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.two_write_queues: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.manual_wal_flush: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.wal_compression: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.atomic_flush: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.log_readahead_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.allow_data_in_errors: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.db_host_id: __hostname__
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.max_background_jobs: 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.max_background_compactions: -1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.max_subcompactions: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.max_total_wal_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                          Options.max_open_files: -1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                          Options.bytes_per_sync: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:       Options.compaction_readahead_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.max_background_flushes: -1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Compression algorithms supported:
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kZSTD supported: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kXpressCompression supported: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kBZip2Compression supported: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kLZ4Compression supported: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kZlibCompression supported: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kLZ4HCCompression supported: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         kSnappyCompression supported: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:           Options.merge_operator: 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:        Options.compaction_filter: None
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562d057fc980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x562d057f9350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:        Options.write_buffer_size: 33554432
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:  Options.max_write_buffer_number: 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.compression: NoCompression
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.num_levels: 7
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.table_properties_collectors: 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                           Options.bloom_locality: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                               Options.ttl: 2592000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                       Options.enable_blob_files: false
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                           Options.min_blob_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: ff5418ad-30e3-42a0-9ea4-01185f113ffa
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580480127693, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580480130710, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580480130904, "job": 1, "event": "recovery_finished"}
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562d05820e00
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: DB pointer 0x562d05916000
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 does not exist in monmap, will attempt to join an existing cluster
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x562d057f9350#2 capacity: 512.00 MB usage: 1.17 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.7e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1,0.95 KB,0.000181794%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: using public_addr v2:172.18.0.108:0/0 -> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: starting mon.np0005625204 rank -1 at public addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] at bind addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005625204 fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(???) e0 preinit fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing) e3 sync_obtain_latest_monmap
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing) e3 sync_obtain_latest_monmap obtained monmap e3
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).mds e17 new map
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2026-02-20T07:58:28.398421+0000
                                                           modified        2026-02-20T09:40:14.722031+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        83
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26854}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26854 members: 26854
                                                           [mds.mds.np0005625203.zsrwgk{0:26854} state up:active seq 13 addr [v2:172.18.0.107:6808/3334119751,v1:172.18.0.107:6809/3334119751] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005625202.akhmop{-1:17124} state up:standby seq 1 addr [v2:172.18.0.106:6808/3865978972,v1:172.18.0.106:6809/3865978972] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005625204.wnsphl{-1:26848} state up:standby seq 1 addr [v2:172.18.0.108:6808/2508223371,v1:172.18.0.108:6809/2508223371] compat {c=[1],r=[1],i=[17ff]}]
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3781: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3782: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17250 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625202.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mgr to host np0005625202.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 3314933000852226048, adjusting msgr requires
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).osd e84 crush map has features 288514051259236352, adjusting msgr requires
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17256 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625203.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mgr to host np0005625203.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3783: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17262 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625204.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mgr to host np0005625204.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17268 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Saving service mgr spec with placement label:mgr
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3784: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17274 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3785: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17286 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625199.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mon to host np0005625199.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3786: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17292 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625199.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label _admin to host np0005625199.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17304 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625200.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mon to host np0005625200.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/1275459803' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/1275459803' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3787: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.26855 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625200.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label _admin to host np0005625200.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625202.arwxwo started
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mgrmap e12: np0005625199.ileebh(active, since 2h), standbys: np0005625201.mtnyvu, np0005625200.ypbkax, np0005625202.arwxwo
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3788: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17331 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625201.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mon to host np0005625201.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625203.lonygy started
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17337 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625201.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label _admin to host np0005625201.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3789: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mgrmap e13: np0005625199.ileebh(active, since 2h), standbys: np0005625201.mtnyvu, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17343 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625202.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mon to host np0005625202.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625204.exgrzx started
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17349 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625202.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label _admin to host np0005625202.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3790: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mgrmap e14: np0005625199.ileebh(active, since 2h), standbys: np0005625201.mtnyvu, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17355 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625203.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mon to host np0005625203.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3791: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17361 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625203.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label _admin to host np0005625203.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17367 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625204.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label mon to host np0005625204.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3792: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17373 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005625204.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Added label _admin to host np0005625204.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3793: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17379 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3794: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='client.17385 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625202", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: pgmap v3795: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Feb 20 09:41:20 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd1e0 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 20 09:41:22 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@-1(probing) e4  my rank is now 3 (was -1)
Feb 20 09:41:22 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:41:22 np0005625204.localdomain ceph-mon[288586]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 20 09:41:22 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:22 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 20 09:41:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:41:23 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 20 09:41:23 np0005625204.localdomain systemd[1]: tmp-crun.vgef3v.mount: Deactivated successfully.
Feb 20 09:41:23 np0005625204.localdomain podman[288625]: 2026-02-20 09:41:23.146786715 +0000 UTC m=+0.081701855 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 20 09:41:23 np0005625204.localdomain podman[288625]: 2026-02-20 09:41:23.157377165 +0000 UTC m=+0.092292305 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 20 09:41:23 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:41:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:24.895 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:25.031 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mgrc update_daemon_metadata mon.np0005625204 metadata {addrs=[v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005625204.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005625204.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: pgmap v3796: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625199"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625199 calling monitor election
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: pgmap v3797: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: pgmap v3798: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625199 is new leader, mons np0005625199,np0005625201,np0005625200,np0005625204 in quorum (ranks 0,1,2,3)
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: monmap epoch 4
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:41:20.444808+0000
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625199
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: osdmap e84: 6 total, 6 up, 6 in
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mgrmap e14: np0005625199.ileebh(active, since 2h), standbys: np0005625201.mtnyvu, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e4 handle_auth_request failed to assign global_id
Feb 20 09:41:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:41:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:41:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:41:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:41:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:41:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='client.27061 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625202", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:27 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 20 09:41:27 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fcf20 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 20 09:41:27 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:41:27 np0005625204.localdomain ceph-mon[288586]: paxos.3).electionLogic(18) init, last seen epoch 18
Feb 20 09:41:27 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:27 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:28 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 20 09:41:28 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 20 09:41:28 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 20 09:41:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:29.897 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:30.035 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:30 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 20 09:41:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:41:30 np0005625204.localdomain systemd[1]: tmp-crun.HClEgM.mount: Deactivated successfully.
Feb 20 09:41:30 np0005625204.localdomain podman[288644]: 2026-02-20 09:41:30.886601676 +0000 UTC m=+0.096518597 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:41:30 np0005625204.localdomain podman[288644]: 2026-02-20 09:41:30.898043602 +0000 UTC m=+0.107960533 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:41:30 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: pgmap v3799: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625199"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625199 calling monitor election
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: pgmap v3800: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: pgmap v3801: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625199 is new leader, mons np0005625199,np0005625201,np0005625200,np0005625204 in quorum (ranks 0,1,2,3)
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: monmap epoch 5
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:41:27.125231+0000
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625199
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: osdmap e84: 6 total, 6 up, 6 in
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mgrmap e14: np0005625199.ileebh(active, since 2h), standbys: np0005625201.mtnyvu, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: Health check failed: 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204 (MON_DOWN)
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: Health detail: HEALTH_WARN 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: [WRN] MON_DOWN: 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]:     mon.np0005625203 (rank 4) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum)
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 20 09:41:32 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: paxos.3).electionLogic(20) init, last seen epoch 20
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:32 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:33 np0005625204.localdomain sudo[288668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:33 np0005625204.localdomain sudo[288668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:33 np0005625204.localdomain sudo[288668]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:33 np0005625204.localdomain sudo[288686]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:33 np0005625204.localdomain sudo[288686]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:33 np0005625204.localdomain sudo[288686]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:33 np0005625204.localdomain sudo[288704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:41:33 np0005625204.localdomain sudo[288704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:34 np0005625204.localdomain podman[288794]: 2026-02-20 09:41:34.403179072 +0000 UTC m=+0.093383189 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, architecture=x86_64, vcs-type=git, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Feb 20 09:41:34 np0005625204.localdomain podman[288794]: 2026-02-20 09:41:34.515077917 +0000 UTC m=+0.205282064 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, GIT_CLEAN=True, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:41:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:34.901 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:35.037 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:35 np0005625204.localdomain sudo[288704]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:41:37 np0005625204.localdomain systemd[1]: tmp-crun.QlJYzu.mount: Deactivated successfully.
Feb 20 09:41:37 np0005625204.localdomain podman[288918]: 2026-02-20 09:41:37.145387953 +0000 UTC m=+0.080817428 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:41:37 np0005625204.localdomain podman[288918]: 2026-02-20 09:41:37.162999991 +0000 UTC m=+0.098429456 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:41:37 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:37 np0005625204.localdomain ceph-mds[284061]: mds.beacon.mds.np0005625204.wnsphl missed beacon ack from the monitors
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: pgmap v3802: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625199"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625199 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: pgmap v3803: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: pgmap v3804: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625199 is new leader, mons np0005625199,np0005625201,np0005625200,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3,4,5)
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: monmap epoch 6
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:41:32.466876+0000
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625199
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005625202
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: osdmap e84: 6 total, 6 up, 6 in
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: mgrmap e14: np0005625199.ileebh(active, since 2h), standbys: np0005625201.mtnyvu, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: Health check cleared: MON_DOWN (was: 1/5 mons down, quorum np0005625199,np0005625201,np0005625200,np0005625204)
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: Cluster is now healthy
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:37 np0005625204.localdomain sudo[288941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:41:37 np0005625204.localdomain sudo[288941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:37 np0005625204.localdomain sudo[288941]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:37 np0005625204.localdomain sudo[288959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:41:37 np0005625204.localdomain sudo[288959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:37 np0005625204.localdomain sudo[288959]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:37 np0005625204.localdomain sudo[288977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:37 np0005625204.localdomain sudo[288977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:37 np0005625204.localdomain sudo[288977]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:37 np0005625204.localdomain sudo[288995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:37 np0005625204.localdomain sudo[288995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:37 np0005625204.localdomain sudo[288995]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289013]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289047]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289047]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289065]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:41:38 np0005625204.localdomain sudo[289083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289083]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:38 np0005625204.localdomain sudo[289101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289101]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:38 np0005625204.localdomain sudo[289119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289119]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289137]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:38 np0005625204.localdomain sudo[289155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289155]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289173]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289207]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:38 np0005625204.localdomain sudo[289225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:38 np0005625204.localdomain sudo[289225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:38 np0005625204.localdomain sudo[289225]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:39 np0005625204.localdomain sudo[289243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain sudo[289243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:39 np0005625204.localdomain sudo[289243]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: Updating np0005625199.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='client.17400 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625202", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:39.942 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:40.040 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: Updating np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: pgmap v3805: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:40 np0005625204.localdomain sudo[289261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:40 np0005625204.localdomain sudo[289261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:41:40 np0005625204.localdomain sudo[289261]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:40 np0005625204.localdomain podman[289279]: 2026-02-20 09:41:40.733569279 +0000 UTC m=+0.084356738 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, release=1770267347, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 20 09:41:40 np0005625204.localdomain podman[289279]: 2026-02-20 09:41:40.751085234 +0000 UTC m=+0.101872703 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, container_name=openstack_network_exporter, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:41:40 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='client.34103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625203", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:41:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:42 np0005625204.localdomain ceph-mon[288586]: pgmap v3806: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:42 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625199 (monmap changed)...
Feb 20 09:41:42 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625199 on np0005625199.localdomain
Feb 20 09:41:42 np0005625204.localdomain ceph-mon[288586]: from='client.34109 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625204", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: pgmap v3807: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625199.ileebh (monmap changed)...
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625199.ileebh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625199.ileebh on np0005625199.localdomain
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.103:0/2264224357' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:41:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:41:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:41:44 np0005625204.localdomain podman[289299]: 2026-02-20 09:41:44.168429111 +0000 UTC m=+0.096346862 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:41:44 np0005625204.localdomain systemd[1]: tmp-crun.92BqaX.mount: Deactivated successfully.
Feb 20 09:41:44 np0005625204.localdomain podman[289299]: 2026-02-20 09:41:44.241990812 +0000 UTC m=+0.169908553 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:41:44 np0005625204.localdomain podman[289300]: 2026-02-20 09:41:44.242069184 +0000 UTC m=+0.170317225 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:41:44 np0005625204.localdomain podman[289300]: 2026-02-20 09:41:44.271301134 +0000 UTC m=+0.199549205 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:41:44 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:41:44 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625199 (monmap changed)...
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625199", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625199 on np0005625199.localdomain
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.103:0/1826137495' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:41:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:44.945 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.042 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.743 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:41:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:45.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: pgmap v3808: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)...
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/3313861519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/2554763741' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/3613321036' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' 
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:41:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14120 172.18.0.103:0/1462291611' entity='mgr.np0005625199.ileebh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon).osd e84 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon).osd e84 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 e85: 6 total, 6 up, 6 in
Feb 20 09:41:46 np0005625204.localdomain sshd[26606]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26667]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26798]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26743]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain sshd[26779]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 20 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-27.scope: Consumed 3min 23.356s CPU time.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 27 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 17 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 26 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 24 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain sshd[26686]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26628]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26762]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26705]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26647]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 20.
Feb 20 09:41:46 np0005625204.localdomain sshd[26724]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain sshd[26587]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 26.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 18 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 23 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 25 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: session-15.scope: Deactivated successfully.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 19 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 21 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 22 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Session 15 logged out. Waiting for processes to exit.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 24.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 17.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 27.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 18.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 23.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 19.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 25.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 22.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 21.
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: Removed session 15.
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/191250644' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.296 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:41:46 np0005625204.localdomain sshd[289362]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:41:46 np0005625204.localdomain sshd[289362]: Accepted publickey for ceph-admin from 192.168.122.105 port 49166 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 09:41:46 np0005625204.localdomain systemd-logind[759]: New session 65 of user ceph-admin.
Feb 20 09:41:46 np0005625204.localdomain systemd[1]: Started Session 65 of User ceph-admin.
Feb 20 09:41:46 np0005625204.localdomain sshd[289362]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.485 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.486 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:41:46 np0005625204.localdomain sudo[289366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:46 np0005625204.localdomain sudo[289366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:46 np0005625204.localdomain sudo[289366]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:46 np0005625204.localdomain sudo[289384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:41:46 np0005625204.localdomain sudo[289384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.790 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.792 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11851MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.792 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.793 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.874 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.875 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.875 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:41:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:46.915 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.103:0/2662030267' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: Activating manager daemon np0005625201.mtnyvu
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: osdmap e85: 6 total, 6 up, 6 in
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.103:0/2662030267' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: mgrmap e15: np0005625201.mtnyvu(active, starting, since 0.0872194s), standbys: np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625199"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr metadata", "who": "np0005625200.ypbkax", "id": "np0005625200.ypbkax"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: Manager daemon np0005625201.mtnyvu is now available
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/mirror_snapshot_schedule"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/mirror_snapshot_schedule"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/191250644' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/trash_purge_schedule"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625201.mtnyvu/trash_purge_schedule"} : dispatch
Feb 20 09:41:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/262602945' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:47.405 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:41:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:47.412 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:41:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:47.439 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:41:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:47.442 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:41:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:47.442 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:41:47 np0005625204.localdomain podman[289494]: 2026-02-20 09:41:47.600807314 +0000 UTC m=+0.103027690 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, release=1770267347, description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2)
Feb 20 09:41:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:41:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:41:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:41:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:41:47 np0005625204.localdomain podman[289494]: 2026-02-20 09:41:47.777836318 +0000 UTC m=+0.280056733 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=1770267347, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:41:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:41:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18258 "" "Go-http-client/1.1"
Feb 20 09:41:48 np0005625204.localdomain ceph-mon[288586]: mgrmap e16: np0005625201.mtnyvu(active, since 1.08829s), standbys: np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:48 np0005625204.localdomain ceph-mon[288586]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:48 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/1988365290' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:41:48 np0005625204.localdomain sudo[289384]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.441 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.442 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:48 np0005625204.localdomain sudo[289614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:48 np0005625204.localdomain sudo[289614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:48 np0005625204.localdomain sudo[289614]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:48 np0005625204.localdomain sudo[289632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:41:48 np0005625204.localdomain sudo[289632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.740 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.740 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.961 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.962 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.963 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:41:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:48.963 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:41:47] ENGINE Bus STARTING
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:41:47] ENGINE Serving on http://172.18.0.105:8765
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: mgrmap e17: np0005625201.mtnyvu(active, since 2s), standbys: np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:41:48] ENGINE Serving on https://172.18.0.105:7150
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:41:48] ENGINE Bus STARTED
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:41:48] ENGINE Client ('172.18.0.105', 35862) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:49 np0005625204.localdomain sudo[289632]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:49 np0005625204.localdomain sudo[289681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:41:49 np0005625204.localdomain sudo[289681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:49 np0005625204.localdomain sudo[289681]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:49.416 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:41:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:49.429 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:41:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:49.430 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:41:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:49.430 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:49.431 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:41:49 np0005625204.localdomain sudo[289699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:41:49 np0005625204.localdomain sudo[289699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:49 np0005625204.localdomain sudo[289699]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:49.949 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:50.043 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:50 np0005625204.localdomain sudo[289736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:41:50 np0005625204.localdomain sudo[289736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289736]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:41:50 np0005625204.localdomain sudo[289754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289754]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 _set_new_cache_sizes cache_size:1019574915 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:41:50 np0005625204.localdomain sudo[289772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:50 np0005625204.localdomain sudo[289772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289772]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:50 np0005625204.localdomain sudo[289790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289790]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:50 np0005625204.localdomain sudo[289808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289808]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:50.430 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:41:50 np0005625204.localdomain sudo[289842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:50 np0005625204.localdomain sudo[289842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289842]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:41:50 np0005625204.localdomain sudo[289860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289860]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:41:50 np0005625204.localdomain sudo[289878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289878]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625199", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd/host:np0005625199", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:41:50 np0005625204.localdomain ceph-mon[288586]: mgrmap e18: np0005625201.mtnyvu(active, since 4s), standbys: np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:50 np0005625204.localdomain sudo[289896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:50 np0005625204.localdomain sudo[289896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289896]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:50 np0005625204.localdomain sudo[289914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289914]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:50 np0005625204.localdomain sudo[289932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:50 np0005625204.localdomain sudo[289932]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:50 np0005625204.localdomain sudo[289950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:50 np0005625204.localdomain sudo[289950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[289950]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[289968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:51 np0005625204.localdomain sudo[289968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[289968]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:51 np0005625204.localdomain sudo[290002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290002]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:41:51 np0005625204.localdomain sudo[290020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290020]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain sudo[290038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290038]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:41:51 np0005625204.localdomain sudo[290056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290056]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:41:51 np0005625204.localdomain sudo[290074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290074]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:51 np0005625204.localdomain sudo[290092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290092]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:51 np0005625204.localdomain sudo[290110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290110]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625199.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:41:51 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625199.ileebh started
Feb 20 09:41:51 np0005625204.localdomain sudo[290128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:51 np0005625204.localdomain sudo[290128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290128]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:51 np0005625204.localdomain sudo[290162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:51 np0005625204.localdomain sudo[290162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:51 np0005625204.localdomain sudo[290162]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:41:52 np0005625204.localdomain sudo[290180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290180]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain sudo[290198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290198]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:52 np0005625204.localdomain sudo[290216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290216]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:41:52 np0005625204.localdomain sudo[290234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290234]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:52 np0005625204.localdomain sudo[290252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290252]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290270]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:41:52 np0005625204.localdomain sudo[290270]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290270]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:52 np0005625204.localdomain sudo[290288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290288]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:52 np0005625204.localdomain sudo[290322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290322]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain sudo[290340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:41:52 np0005625204.localdomain sudo[290340]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290340]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: Updating np0005625199.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: mgrmap e19: np0005625201.mtnyvu(active, since 6s), standbys: np0005625199.ileebh, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr metadata", "who": "np0005625199.ileebh", "id": "np0005625199.ileebh"} : dispatch
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:52 np0005625204.localdomain sudo[290358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:52 np0005625204.localdomain sudo[290358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:52 np0005625204.localdomain sudo[290358]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:53 np0005625204.localdomain sudo[290376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:41:53 np0005625204.localdomain sudo[290376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:41:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:41:53 np0005625204.localdomain sudo[290376]: pam_unix(sudo:session): session closed for user root
Feb 20 09:41:53 np0005625204.localdomain podman[290394]: 2026-02-20 09:41:53.540874605 +0000 UTC m=+0.079930160 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:41:53 np0005625204.localdomain podman[290394]: 2026-02-20 09:41:53.555965045 +0000 UTC m=+0.095020570 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 09:41:53 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625200 (monmap changed)...
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625200 on np0005625200.localdomain
Feb 20 09:41:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:54.989 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:41:55.044 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 _set_new_cache_sizes cache_size:1020043081 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)...
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:41:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)...
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:41:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:41:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:41:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:41:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:41:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)...
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:41:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)...
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:41:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: from='client.27032 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:41:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:00.027 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:00.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon).osd e85 _set_new_cache_sizes cache_size:1020054453 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:42:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:42:01 np0005625204.localdomain systemd[1]: tmp-crun.grmIZs.mount: Deactivated successfully.
Feb 20 09:42:01 np0005625204.localdomain podman[290413]: 2026-02-20 09:42:01.140902736 +0000 UTC m=+0.078468344 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:42:01 np0005625204.localdomain podman[290413]: 2026-02-20 09:42:01.176087381 +0000 UTC m=+0.113652969 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:42:01 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: from='client.17490 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625199", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 ' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.1 v2:172.18.0.105:3300/0
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.105:3300/0
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@3(peon) e7  my rank is now 2 (was 3)
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: paxos.2).electionLogic(24) init, last seen epoch 24
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: --2- 172.18.0.108:0/150842184 >> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] conn(0x562e2f9b5800 0x562e2f9b6b00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 20 09:42:02 np0005625204.localdomain ceph-mds[284061]: --2- [v2:172.18.0.108:6808/2508223371,v1:172.18.0.108:6809/2508223371] >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x55bd6fd77400 0x55bd6eff5180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Feb 20 09:42:02 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fb84000 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='client.17496 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005625199"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: Remove daemons mon.np0005625199
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: Safe to remove mon.np0005625199: new quorum should be ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203', 'np0005625202'] (from ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203', 'np0005625202'])
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: Removing monitor np0005625199 from monmap...
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon rm", "name": "np0005625199"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: Removing daemon mon.np0005625199 from np0005625199.localdomain -- ports []
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3,4)
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: monmap epoch 7
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:42:02.105420+0000
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005625202
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: osdmap e85: 6 total, 6 up, 6 in
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: mgrmap e19: np0005625201.mtnyvu(active, since 16s), standbys: np0005625199.ileebh, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:42:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:03 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:42:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:42:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:42:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:03 np0005625204.localdomain sshd[290437]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:42:04 np0005625204.localdomain sshd[290437]: Invalid user oracle from 54.36.99.29 port 48242
Feb 20 09:42:04 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:42:04 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:42:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:04 np0005625204.localdomain sshd[290437]: Received disconnect from 54.36.99.29 port 48242:11: Bye Bye [preauth]
Feb 20 09:42:04 np0005625204.localdomain sshd[290437]: Disconnected from invalid user oracle 54.36.99.29 port 48242 [preauth]
Feb 20 09:42:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:05.029 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:05.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054725 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:42:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:42:06.004 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:42:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:42:06.005 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:42:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:42:06.006 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: from='client.27205 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625199.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: Removed label mon from host np0005625199.localdomain
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:42:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:42:08 np0005625204.localdomain podman[290439]: 2026-02-20 09:42:08.136417216 +0000 UTC m=+0.074838695 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:42:08 np0005625204.localdomain podman[290439]: 2026-02-20 09:42:08.171983402 +0000 UTC m=+0.110404911 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:42:08 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='client.27215 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625199.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: Removed label mgr from host np0005625199.localdomain
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: from='client.34174 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625199.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: Removed label _admin from host np0005625199.localdomain
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:42:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:10.049 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:10.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:10.051 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:42:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:10.051 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:10.085 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:10.086 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:10 np0005625204.localdomain sudo[290464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:10 np0005625204.localdomain sudo[290464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:10 np0005625204.localdomain sudo[290464]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:10 np0005625204.localdomain sudo[290482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:10 np0005625204.localdomain sudo[290482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625203 (monmap changed)...
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 2026-02-20 09:42:10.724577768 +0000 UTC m=+0.077380241 container create 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, name=rhceph, build-date=2026-02-09T10:25:24Z, vcs-type=git, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: Started libpod-conmon-199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6.scope.
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 2026-02-20 09:42:10.694587269 +0000 UTC m=+0.047389762 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 2026-02-20 09:42:10.80236089 +0000 UTC m=+0.155163363 container init 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-type=git, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: tmp-crun.iQ5OwI.mount: Deactivated successfully.
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 2026-02-20 09:42:10.821201975 +0000 UTC m=+0.174004488 container start 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1770267347, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph)
Feb 20 09:42:10 np0005625204.localdomain keen_villani[290532]: 167 167
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 2026-02-20 09:42:10.821754421 +0000 UTC m=+0.174556934 container attach 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, release=1770267347, io.openshift.expose-services=, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: libpod-199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6.scope: Deactivated successfully.
Feb 20 09:42:10 np0005625204.localdomain podman[290517]: 2026-02-20 09:42:10.82771162 +0000 UTC m=+0.180514073 container died 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, version=7, distribution-scope=public, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:42:10 np0005625204.localdomain podman[290547]: 2026-02-20 09:42:10.903655517 +0000 UTC m=+0.068448933 container remove 199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=keen_villani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.buildah.version=1.42.2, version=7)
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: libpod-conmon-199b15c0ad41e84404b294951c51d153bc4a5a2617fe28302a81905a574889c6.scope: Deactivated successfully.
Feb 20 09:42:10 np0005625204.localdomain podman[290533]: 2026-02-20 09:42:10.876429211 +0000 UTC m=+0.092554406 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public)
Feb 20 09:42:10 np0005625204.localdomain podman[290533]: 2026-02-20 09:42:10.95910626 +0000 UTC m=+0.175231475 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:42:10 np0005625204.localdomain sudo[290482]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:10 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:42:11 np0005625204.localdomain sudo[290574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:11 np0005625204.localdomain sudo[290574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:11 np0005625204.localdomain sudo[290574]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:11 np0005625204.localdomain sudo[290592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:11 np0005625204.localdomain sudo[290592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 2026-02-20 09:42:11.577201993 +0000 UTC m=+0.078423193 container create 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vendor=Red Hat, Inc., io.openshift.expose-services=, CEPH_POINT_RELEASE=, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2026-02-09T10:25:24Z)
Feb 20 09:42:11 np0005625204.localdomain systemd[1]: Started libpod-conmon-29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615.scope.
Feb 20 09:42:11 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 2026-02-20 09:42:11.636181011 +0000 UTC m=+0.137401961 container init 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-type=git, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 2026-02-20 09:42:11.643349085 +0000 UTC m=+0.144570065 container start 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, name=rhceph, release=1770267347, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 2026-02-20 09:42:11.643590922 +0000 UTC m=+0.144811872 container attach 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public)
Feb 20 09:42:11 np0005625204.localdomain tender_morse[290643]: 167 167
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 2026-02-20 09:42:11.545955246 +0000 UTC m=+0.047176256 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:11 np0005625204.localdomain systemd[1]: libpod-29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615.scope: Deactivated successfully.
Feb 20 09:42:11 np0005625204.localdomain podman[290628]: 2026-02-20 09:42:11.646121308 +0000 UTC m=+0.147342258 container died 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, io.openshift.expose-services=, RELEASE=main, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=1770267347, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:42:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:11 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2a7c64a96c40c35b21abaeb1833ec8a76641ce9852319b6ed0e4e3925b0b93f6-merged.mount: Deactivated successfully.
Feb 20 09:42:11 np0005625204.localdomain podman[290649]: 2026-02-20 09:42:11.732820028 +0000 UTC m=+0.073020630 container remove 29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_morse, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, RELEASE=main, ceph=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph)
Feb 20 09:42:11 np0005625204.localdomain systemd[1]: libpod-conmon-29de5de06ed711f6812cbc67ffb90f88a91a56f02d141492cd2e2a7e41d08615.scope: Deactivated successfully.
Feb 20 09:42:11 np0005625204.localdomain sudo[290592]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:11 np0005625204.localdomain sudo[290672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:11 np0005625204.localdomain sudo[290672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:11 np0005625204.localdomain sudo[290672]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:12 np0005625204.localdomain sudo[290690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:12 np0005625204.localdomain sudo[290690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 2026-02-20 09:42:12.527784544 +0000 UTC m=+0.077584498 container create 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, io.openshift.tags=rhceph ceph, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:12 np0005625204.localdomain systemd[1]: Started libpod-conmon-0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12.scope.
Feb 20 09:42:12 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 2026-02-20 09:42:12.497964069 +0000 UTC m=+0.047764043 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 2026-02-20 09:42:12.599650959 +0000 UTC m=+0.149450903 container init 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, release=1770267347, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=7, distribution-scope=public, ceph=True, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:42:12 np0005625204.localdomain nostalgic_kowalevski[290740]: 167 167
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 2026-02-20 09:42:12.61773501 +0000 UTC m=+0.167534954 container start 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, name=rhceph, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 2026-02-20 09:42:12.618096941 +0000 UTC m=+0.167896925 container attach 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, release=1770267347, version=7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:12 np0005625204.localdomain systemd[1]: libpod-0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12.scope: Deactivated successfully.
Feb 20 09:42:12 np0005625204.localdomain podman[290725]: 2026-02-20 09:42:12.620993278 +0000 UTC m=+0.170793272 container died 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, version=7, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=)
Feb 20 09:42:12 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.0 (monmap changed)...
Feb 20 09:42:12 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:42:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:42:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:12 np0005625204.localdomain podman[290745]: 2026-02-20 09:42:12.723677587 +0000 UTC m=+0.090529035 container remove 0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_kowalevski, name=rhceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:42:12 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fe4148f1cdf25b963e05d78801b63ba5e781225cb31f3535550571ccb6b78f7e-merged.mount: Deactivated successfully.
Feb 20 09:42:12 np0005625204.localdomain systemd[1]: libpod-conmon-0a7b01bca64a168ab290ba938ab9c7e5f9f3ba42497164621ec2bef62976ed12.scope: Deactivated successfully.
Feb 20 09:42:12 np0005625204.localdomain sudo[290690]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:13 np0005625204.localdomain sudo[290770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:13 np0005625204.localdomain sudo[290770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:13 np0005625204.localdomain sudo[290770]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:13 np0005625204.localdomain sudo[290788]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:13 np0005625204.localdomain sudo[290788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:13 np0005625204.localdomain sshd[290832]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 2026-02-20 09:42:13.602047954 +0000 UTC m=+0.074436713 container create b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.expose-services=, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:42:13 np0005625204.localdomain systemd[1]: Started libpod-conmon-b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1.scope.
Feb 20 09:42:13 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 2026-02-20 09:42:13.566725624 +0000 UTC m=+0.039114443 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 2026-02-20 09:42:13.675600069 +0000 UTC m=+0.147988838 container init b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, vcs-type=git, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, distribution-scope=public, GIT_BRANCH=main)
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 2026-02-20 09:42:13.68431863 +0000 UTC m=+0.156707399 container start b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 2026-02-20 09:42:13.685779153 +0000 UTC m=+0.158167922 container attach b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:42:13 np0005625204.localdomain systemd[1]: libpod-b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1.scope: Deactivated successfully.
Feb 20 09:42:13 np0005625204.localdomain friendly_stonebraker[290839]: 167 167
Feb 20 09:42:13 np0005625204.localdomain podman[290822]: 2026-02-20 09:42:13.689471365 +0000 UTC m=+0.161860164 container died b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, version=7, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)...
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:13 np0005625204.localdomain sshd[290832]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:42:13 np0005625204.localdomain systemd[1]: tmp-crun.iV7Lo9.mount: Deactivated successfully.
Feb 20 09:42:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3c74b7653ffc47217de9b755a70dd9a5367d950122ff4c38a60a68c00bc2351f-merged.mount: Deactivated successfully.
Feb 20 09:42:13 np0005625204.localdomain podman[290844]: 2026-02-20 09:42:13.814549565 +0000 UTC m=+0.115162834 container remove b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_stonebraker, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7)
Feb 20 09:42:13 np0005625204.localdomain systemd[1]: libpod-conmon-b8ad3e509c07348294bc6700464cde08e875ca8443603486b43a28bdaf5fd4c1.scope: Deactivated successfully.
Feb 20 09:42:13 np0005625204.localdomain sudo[290788]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:14 np0005625204.localdomain sudo[290861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:14 np0005625204.localdomain sudo[290861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:14 np0005625204.localdomain sudo[290861]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:14 np0005625204.localdomain sudo[290879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:14 np0005625204.localdomain sudo[290879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:42:14 np0005625204.localdomain podman[290913]: 2026-02-20 09:42:14.528684377 +0000 UTC m=+0.087619308 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Feb 20 09:42:14 np0005625204.localdomain podman[290913]: 2026-02-20 09:42:14.565029506 +0000 UTC m=+0.123964467 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:42:14 np0005625204.localdomain podman[290912]: 2026-02-20 09:42:14.581299874 +0000 UTC m=+0.140025029 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 2026-02-20 09:42:14.65120048 +0000 UTC m=+0.184276326 container create 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.42.2, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 20 09:42:14 np0005625204.localdomain podman[290912]: 2026-02-20 09:42:14.696067116 +0000 UTC m=+0.254792231 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: Started libpod-conmon-5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441.scope.
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 2026-02-20 09:42:14.614212121 +0000 UTC m=+0.147287977 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 2026-02-20 09:42:14.731047245 +0000 UTC m=+0.264123091 container init 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, distribution-scope=public, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=)
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 2026-02-20 09:42:14.740803767 +0000 UTC m=+0.273879613 container start 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, release=1770267347, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, version=7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_BRANCH=main, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 2026-02-20 09:42:14.741088645 +0000 UTC m=+0.274164491 container attach 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, architecture=x86_64, build-date=2026-02-09T10:25:24Z, vcs-type=git, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, release=1770267347, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:42:14 np0005625204.localdomain sad_saha[290973]: 167 167
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: libpod-5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441.scope: Deactivated successfully.
Feb 20 09:42:14 np0005625204.localdomain podman[290926]: 2026-02-20 09:42:14.745597531 +0000 UTC m=+0.278673417 container died 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, version=7, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d91c343a0d58b4f0a87f394aaf88111a173056b50935e0e0697880220cf8ee5e-merged.mount: Deactivated successfully.
Feb 20 09:42:14 np0005625204.localdomain podman[290978]: 2026-02-20 09:42:14.846782035 +0000 UTC m=+0.088629239 container remove 5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_saha, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, release=1770267347, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, version=7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:14 np0005625204.localdomain systemd[1]: libpod-conmon-5842967ecfe303784e7548bfc59ad70951bc1ad00f6f32a7fe0a3e2dc15d9441.scope: Deactivated successfully.
Feb 20 09:42:14 np0005625204.localdomain sudo[290879]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:15 np0005625204.localdomain sudo[290994]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:15 np0005625204.localdomain sudo[290994]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:15 np0005625204.localdomain sudo[290994]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:15.087 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:15 np0005625204.localdomain sudo[291012]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:15 np0005625204.localdomain sudo[291012]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 2026-02-20 09:42:15.60338032 +0000 UTC m=+0.078955488 container create 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main)
Feb 20 09:42:15 np0005625204.localdomain systemd[1]: Started libpod-conmon-3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc.scope.
Feb 20 09:42:15 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 2026-02-20 09:42:15.669272685 +0000 UTC m=+0.144847833 container init 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, GIT_CLEAN=True)
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 2026-02-20 09:42:15.571822684 +0000 UTC m=+0.047397882 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 2026-02-20 09:42:15.679002227 +0000 UTC m=+0.154577375 container start 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_CLEAN=True, ceph=True, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 2026-02-20 09:42:15.679220914 +0000 UTC m=+0.154796072 container attach 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, RELEASE=main, architecture=x86_64, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1770267347, name=rhceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Feb 20 09:42:15 np0005625204.localdomain magical_snyder[291063]: 167 167
Feb 20 09:42:15 np0005625204.localdomain systemd[1]: libpod-3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc.scope: Deactivated successfully.
Feb 20 09:42:15 np0005625204.localdomain podman[291048]: 2026-02-20 09:42:15.683157301 +0000 UTC m=+0.158732449 container died 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, ceph=True, version=7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_CLEAN=True, io.buildah.version=1.42.2, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:42:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:15 np0005625204.localdomain podman[291068]: 2026-02-20 09:42:15.777594613 +0000 UTC m=+0.085859456 container remove 3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=magical_snyder, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.expose-services=, release=1770267347, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Feb 20 09:42:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ea52fc9008f4e689bde064e32d6a152515a5859ef1e2fd6ab06df578c787ca01-merged.mount: Deactivated successfully.
Feb 20 09:42:15 np0005625204.localdomain systemd[1]: libpod-conmon-3c028b2dc9d20d50126e90e93a5b830cce6a625a6f84b10f459dc3d0ee2a7efc.scope: Deactivated successfully.
Feb 20 09:42:15 np0005625204.localdomain sudo[291012]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625204 (monmap changed)...
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:42:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:42:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:42:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:42:17 np0005625204.localdomain ceph-mon[288586]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:42:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18254 "" "Go-http-client/1.1"
Feb 20 09:42:18 np0005625204.localdomain sudo[291084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:42:18 np0005625204.localdomain sudo[291084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291084]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:42:18 np0005625204.localdomain sudo[291102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291102]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:42:18 np0005625204.localdomain sudo[291120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291120]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:18 np0005625204.localdomain sudo[291138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291138]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:42:18 np0005625204.localdomain sudo[291156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291156]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:42:18 np0005625204.localdomain sudo[291190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291190]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:42:18 np0005625204.localdomain sudo[291208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291208]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain sudo[291226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291226]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:42:18 np0005625204.localdomain sudo[291244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291244]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:42:18 np0005625204.localdomain sudo[291262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291262]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:42:18 np0005625204.localdomain sudo[291280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291280]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain sudo[291298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:18 np0005625204.localdomain sudo[291298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291298]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Removing np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Removing np0005625199.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Removing np0005625199.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:18 np0005625204.localdomain sudo[291316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:42:18 np0005625204.localdomain sudo[291316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:18 np0005625204.localdomain sudo[291316]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:19 np0005625204.localdomain sudo[291350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:42:19 np0005625204.localdomain sudo[291350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:19 np0005625204.localdomain sudo[291350]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:19 np0005625204.localdomain sudo[291368]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:42:19 np0005625204.localdomain sudo[291368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:19 np0005625204.localdomain sudo[291368]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:19 np0005625204.localdomain sudo[291386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:42:19 np0005625204.localdomain sudo[291386]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:19 np0005625204.localdomain sudo[291386]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:20.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: Removing daemon mgr.np0005625199.ileebh from np0005625199.localdomain -- ports [9283, 8765]
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='client.34184 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005625199.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: Added label _no_schedule to host np0005625199.localdomain
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:20 np0005625204.localdomain ceph-mon[288586]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625199.localdomain
Feb 20 09:42:21 np0005625204.localdomain ceph-mon[288586]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:22 np0005625204.localdomain sudo[291404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:42:22 np0005625204.localdomain sudo[291404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:22 np0005625204.localdomain sudo[291404]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: from='client.34223 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005625199.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: Removing key for mgr.np0005625199.ileebh
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth rm", "entity": "mgr.np0005625199.ileebh"} : dispatch
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005625199.ileebh"}]': finished
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:42:22 np0005625204.localdomain sshd[291422]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain"} : dispatch
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain"}]': finished
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:42:23 np0005625204.localdomain sudo[291424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:42:23 np0005625204.localdomain sudo[291424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:23 np0005625204.localdomain sudo[291424]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:23 np0005625204.localdomain sshd[291422]: Invalid user tommy from 188.166.218.64 port 52168
Feb 20 09:42:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:42:24 np0005625204.localdomain systemd[1]: tmp-crun.R2JFpO.mount: Deactivated successfully.
Feb 20 09:42:24 np0005625204.localdomain podman[291442]: 2026-02-20 09:42:24.095024977 +0000 UTC m=+0.090204345 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 20 09:42:24 np0005625204.localdomain podman[291442]: 2026-02-20 09:42:24.11011116 +0000 UTC m=+0.105290528 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:42:24 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:42:24 np0005625204.localdomain sshd[291422]: Received disconnect from 188.166.218.64 port 52168:11: Bye Bye [preauth]
Feb 20 09:42:24 np0005625204.localdomain sshd[291422]: Disconnected from invalid user tommy 188.166.218.64 port 52168 [preauth]
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: from='client.34233 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005625199.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: Removed host np0005625199.localdomain
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: host np0005625199.localdomain `cephadm ls` failed: Cannot decode JSON: 
                                                           Traceback (most recent call last):
                                                             File "/usr/share/ceph/mgr/cephadm/serve.py", line 1540, in _run_cephadm_json
                                                               return json.loads(''.join(out))
                                                             File "/lib64/python3.9/json/__init__.py", line 346, in loads
                                                               return _default_decoder.decode(s)
                                                             File "/lib64/python3.9/json/decoder.py", line 337, in decode
                                                               obj, end = self.raw_decode(s, idx=_w(s, 0).end())
                                                             File "/lib64/python3.9/json/decoder.py", line 355, in raw_decode
                                                               raise JSONDecodeError("Expecting value", s, err.value) from None
                                                           json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: executing refresh((['np0005625199.localdomain', 'np0005625200.localdomain', 'np0005625201.localdomain', 'np0005625202.localdomain', 'np0005625203.localdomain', 'np0005625204.localdomain'],)) failed.
                                                           Traceback (most recent call last):
                                                             File "/usr/share/ceph/mgr/cephadm/utils.py", line 94, in do_work
                                                               return f(*arg)
                                                             File "/usr/share/ceph/mgr/cephadm/serve.py", line 317, in refresh
                                                               and not self.mgr.inventory.has_label(host, SpecialHostLabels.NO_MEMORY_AUTOTUNE)
                                                             File "/usr/share/ceph/mgr/cephadm/inventory.py", line 253, in has_label
                                                               host = self._get_stored_name(host)
                                                             File "/usr/share/ceph/mgr/cephadm/inventory.py", line 181, in _get_stored_name
                                                               self.assert_host(host)
                                                             File "/usr/share/ceph/mgr/cephadm/inventory.py", line 209, in assert_host
                                                               raise OrchestratorError('host %s does not exist' % host)
                                                           orchestrator._interface.OrchestratorError: host np0005625199.localdomain does not exist
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)...
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:24 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain
Feb 20 09:42:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:25.091 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:25.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:25.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:42:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:25.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:25.130 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:25.131 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:42:25 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:42:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:42:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:42:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:42:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:42:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625200 (monmap changed)...
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625200 on np0005625200.localdomain
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:27 np0005625204.localdomain sshd[291462]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)...
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:42:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:28 np0005625204.localdomain sshd[291462]: Invalid user sol from 45.148.10.240 port 58852
Feb 20 09:42:28 np0005625204.localdomain sshd[291462]: Connection closed by invalid user sol 45.148.10.240 port 58852 [preauth]
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)...
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)...
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)...
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:42:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:30.132 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:30.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:30.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:42:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:30.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:30.134 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:30.137 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:30 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:42:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:42:32 np0005625204.localdomain podman[291464]: 2026-02-20 09:42:32.146520229 +0000 UTC m=+0.084937307 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:42:32 np0005625204.localdomain podman[291464]: 2026-02-20 09:42:32.154735715 +0000 UTC m=+0.093152783 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:42:32 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: from='client.27071 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:33 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc38000 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 09:42:33 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:42:33 np0005625204.localdomain ceph-mon[288586]: paxos.2).electionLogic(26) init, last seen epoch 26
Feb 20 09:42:33 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:33 np0005625204.localdomain sudo[291488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:42:33 np0005625204.localdomain sudo[291488]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:33 np0005625204.localdomain sudo[291488]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:35.137 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: paxos.2).electionLogic(27) init, last seen epoch 27, mid-election, bumping
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='client.27076 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005625202"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: Remove daemons mon.np0005625202
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: Safe to remove mon.np0005625202: new quorum should be ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203'] (from ['np0005625201', 'np0005625200', 'np0005625204', 'np0005625203'])
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: Removing monitor np0005625202 from monmap...
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: Removing daemon mon.np0005625202 from np0005625202.localdomain -- ports []
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)...
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625203 in quorum (ranks 0,1,3)
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204,np0005625203 in quorum (ranks 0,1,2,3)
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: monmap epoch 8
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:42:33.617921+0000
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: osdmap e85: 6 total, 6 up, 6 in
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: mgrmap e19: np0005625201.mtnyvu(active, since 52s), standbys: np0005625199.ileebh, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:42:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:42:39 np0005625204.localdomain systemd[1]: tmp-crun.VqS38Z.mount: Deactivated successfully.
Feb 20 09:42:39 np0005625204.localdomain podman[291506]: 2026-02-20 09:42:39.159171352 +0000 UTC m=+0.093909528 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:42:39 np0005625204.localdomain podman[291506]: 2026-02-20 09:42:39.198203391 +0000 UTC m=+0.132941607 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:42:39 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:40.139 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:40.141 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:40.141 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:42:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:40.141 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:40 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:40.177 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:40.178 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)...
Feb 20 09:42:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain
Feb 20 09:42:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:42:41 np0005625204.localdomain podman[291529]: 2026-02-20 09:42:41.145316932 +0000 UTC m=+0.082765763 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:42:41 np0005625204.localdomain podman[291529]: 2026-02-20 09:42:41.165048434 +0000 UTC m=+0.102497255 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., release=1770267347, vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc.)
Feb 20 09:42:41 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)...
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain
Feb 20 09:42:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)...
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:42 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:42:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:42:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:42:45 np0005625204.localdomain systemd[1]: tmp-crun.mdJ80Z.mount: Deactivated successfully.
Feb 20 09:42:45 np0005625204.localdomain podman[291550]: 2026-02-20 09:42:45.167258292 +0000 UTC m=+0.096279077 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.179 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.182 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.215 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.216 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:45 np0005625204.localdomain systemd[1]: tmp-crun.Kc9oeL.mount: Deactivated successfully.
Feb 20 09:42:45 np0005625204.localdomain podman[291551]: 2026-02-20 09:42:45.232136877 +0000 UTC m=+0.159004328 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 20 09:42:45 np0005625204.localdomain podman[291551]: 2026-02-20 09:42:45.269075635 +0000 UTC m=+0.195943106 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:42:45 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:42:45 np0005625204.localdomain podman[291550]: 2026-02-20 09:42:45.287529738 +0000 UTC m=+0.216550593 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:42:45 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.756 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.757 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.757 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.757 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:42:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:45.758 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3587761072' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.222 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.311 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.311 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.544 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.546 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11855MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.547 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.547 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.610 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.611 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.611 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:42:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:46.651 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/198306316' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/3587761072' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:46 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e8 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/724040001' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:47.098 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:42:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:47.105 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:42:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:47.126 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:42:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:47.128 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:42:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:47.129 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:42:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:42:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:42:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:42:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/2963859637' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/3696267818' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/724040001' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/4259480945' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:42:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:42:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18262 "" "Go-http-client/1.1"
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.129 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.130 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.130 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.130 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:48.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:42:48 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:42:48 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:42:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:42:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.720 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.720 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:42:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.970 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.971 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.971 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:42:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:49.972 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:42:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:50.216 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:50.219 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:50.293 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:42:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:50.309 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:42:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:50.309 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: from='client.27288 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005625202.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:42:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Feb 20 09:42:52 np0005625204.localdomain sudo[291638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:52 np0005625204.localdomain sudo[291638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:52 np0005625204.localdomain sudo[291638]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:52 np0005625204.localdomain sudo[291656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:52 np0005625204.localdomain sudo[291656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Feb 20 09:42:52 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc38160 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: paxos.2).electionLogic(32) init, last seen epoch 32
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 2026-02-20 09:42:52.793717759 +0000 UTC m=+0.080156535 container create 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:42:52 np0005625204.localdomain systemd[1]: Started libpod-conmon-9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1.scope.
Feb 20 09:42:52 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 2026-02-20 09:42:52.761281456 +0000 UTC m=+0.047720232 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 2026-02-20 09:42:52.870363537 +0000 UTC m=+0.156802313 container init 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, release=1770267347, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, version=7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 2026-02-20 09:42:52.884097928 +0000 UTC m=+0.170536684 container start 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, release=1770267347, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True)
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 2026-02-20 09:42:52.884306225 +0000 UTC m=+0.170745051 container attach 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, release=1770267347, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, vcs-type=git, io.buildah.version=1.42.2, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:42:52 np0005625204.localdomain wonderful_bartik[291705]: 167 167
Feb 20 09:42:52 np0005625204.localdomain podman[291690]: 2026-02-20 09:42:52.889987845 +0000 UTC m=+0.176426631 container died 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Feb 20 09:42:52 np0005625204.localdomain systemd[1]: libpod-9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1.scope: Deactivated successfully.
Feb 20 09:42:52 np0005625204.localdomain podman[291710]: 2026-02-20 09:42:52.983183149 +0000 UTC m=+0.081196716 container remove 9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wonderful_bartik, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, ceph=True, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, RELEASE=main, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z)
Feb 20 09:42:52 np0005625204.localdomain systemd[1]: libpod-conmon-9016318605e59cbac999f1ebce129056afa3b4d31e38f17e7c75e196178158c1.scope: Deactivated successfully.
Feb 20 09:42:53 np0005625204.localdomain sudo[291656]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-520da4a85081cd80df64788a7f69a185268d89edac0e4e6baa5016e415001ea9-merged.mount: Deactivated successfully.
Feb 20 09:42:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:42:55 np0005625204.localdomain podman[291730]: 2026-02-20 09:42:55.148074059 +0000 UTC m=+0.086046231 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 09:42:55 np0005625204.localdomain podman[291730]: 2026-02-20 09:42:55.185060388 +0000 UTC m=+0.123032560 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:42:55 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:42:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:55.220 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:42:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:55.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:55.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:42:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:55.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:55.223 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:42:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:42:55.225 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:42:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:42:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:42:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:42:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:42:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:42:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204 in quorum (ranks 0,1,2)
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: monmap epoch 9
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:42:52.462377+0000
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: osdmap e85: 6 total, 6 up, 6 in
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: mgrmap e19: np0005625201.mtnyvu(active, since 71s), standbys: np0005625199.ileebh, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: Health check failed: 2/5 mons down, quorum np0005625201,np0005625200,np0005625204 (MON_DOWN)
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: Health detail: HEALTH_WARN 2/5 mons down, quorum np0005625201,np0005625200,np0005625204
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: [WRN] MON_DOWN: 2/5 mons down, quorum np0005625201,np0005625200,np0005625204
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]:     mon.np0005625203 (rank 3) addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] is down (out of quorum)
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]:     mon.np0005625202 (rank 4) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:57 np0005625204.localdomain sudo[291749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:57 np0005625204.localdomain sudo[291749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:57 np0005625204.localdomain sudo[291749]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:57 np0005625204.localdomain sudo[291767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:57 np0005625204.localdomain sudo[291767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 2026-02-20 09:42:58.204458779 +0000 UTC m=+0.078249848 container create fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, com.redhat.component=rhceph-container, RELEASE=main, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, version=7, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:42:58 np0005625204.localdomain systemd[1]: Started libpod-conmon-fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a.scope.
Feb 20 09:42:58 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 2026-02-20 09:42:58.174285614 +0000 UTC m=+0.048076683 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 2026-02-20 09:42:58.282980343 +0000 UTC m=+0.156771412 container init fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 2026-02-20 09:42:58.292247381 +0000 UTC m=+0.166038410 container start fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, release=1770267347, GIT_BRANCH=main, io.buildah.version=1.42.2)
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 2026-02-20 09:42:58.292423996 +0000 UTC m=+0.166215055 container attach fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347)
Feb 20 09:42:58 np0005625204.localdomain nostalgic_blackburn[291816]: 167 167
Feb 20 09:42:58 np0005625204.localdomain systemd[1]: libpod-fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a.scope: Deactivated successfully.
Feb 20 09:42:58 np0005625204.localdomain podman[291801]: 2026-02-20 09:42:58.299143938 +0000 UTC m=+0.172935017 container died fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, version=7, release=1770267347, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:42:58 np0005625204.localdomain podman[291821]: 2026-02-20 09:42:58.396808946 +0000 UTC m=+0.086924527 container remove fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nostalgic_blackburn, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, build-date=2026-02-09T10:25:24Z, ceph=True, com.redhat.component=rhceph-container)
Feb 20 09:42:58 np0005625204.localdomain systemd[1]: libpod-conmon-fdcfcdfa81c34c405246c06ad028bd416a594490ca487a5ff82d3ba9336c203a.scope: Deactivated successfully.
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: paxos.2).electionLogic(35) init, last seen epoch 35, mid-election, bumping
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:42:58 np0005625204.localdomain sudo[291767]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625200 calling monitor election
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625200,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3,4)
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: monmap epoch 9
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:42:52.462377+0000
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625200
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: osdmap e85: 6 total, 6 up, 6 in
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: mgrmap e19: np0005625201.mtnyvu(active, since 72s), standbys: np0005625199.ileebh, np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: Health check cleared: MON_DOWN (was: 2/5 mons down, quorum np0005625201,np0005625200,np0005625204)
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: Cluster is now healthy
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:42:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:58 np0005625204.localdomain sudo[291845]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:58 np0005625204.localdomain sudo[291845]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:58 np0005625204.localdomain sudo[291845]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:58 np0005625204.localdomain sudo[291863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:58 np0005625204.localdomain sudo[291863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d772c0e9da178ffb07cddfe2e1af16f00670a6862566c26671fc13066a1c6e9b-merged.mount: Deactivated successfully.
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 2026-02-20 09:42:59.226312397 +0000 UTC m=+0.068702921 container create c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, architecture=x86_64, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:42:59 np0005625204.localdomain systemd[1]: Started libpod-conmon-c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf.scope.
Feb 20 09:42:59 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 2026-02-20 09:42:59.28946598 +0000 UTC m=+0.131856494 container init c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vcs-type=git, ceph=True, vendor=Red Hat, Inc., version=7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main)
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 2026-02-20 09:42:59.198868984 +0000 UTC m=+0.041259488 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:42:59 np0005625204.localdomain adoring_blackwell[291912]: 167 167
Feb 20 09:42:59 np0005625204.localdomain systemd[1]: libpod-c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf.scope: Deactivated successfully.
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 2026-02-20 09:42:59.303744669 +0000 UTC m=+0.146135193 container start c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vcs-type=git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 2026-02-20 09:42:59.303982527 +0000 UTC m=+0.146373081 container attach c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, distribution-scope=public, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, vcs-type=git, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:42:59 np0005625204.localdomain podman[291897]: 2026-02-20 09:42:59.306230763 +0000 UTC m=+0.148621337 container died c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, architecture=x86_64)
Feb 20 09:42:59 np0005625204.localdomain podman[291917]: 2026-02-20 09:42:59.389256193 +0000 UTC m=+0.074717272 container remove c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_blackwell, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Feb 20 09:42:59 np0005625204.localdomain systemd[1]: libpod-conmon-c45470d73a2aabbf4579cb42a8744370e205977abb5b70044501e7dae6b64dcf.scope: Deactivated successfully.
Feb 20 09:42:59 np0005625204.localdomain sudo[291863]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)...
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:42:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:42:59 np0005625204.localdomain sudo[291941]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:42:59 np0005625204.localdomain sudo[291941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:42:59 np0005625204.localdomain sudo[291941]: pam_unix(sudo:session): session closed for user root
Feb 20 09:42:59 np0005625204.localdomain sudo[291959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:42:59 np0005625204.localdomain sudo[291959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:00 np0005625204.localdomain systemd[1]: tmp-crun.CXnr9P.mount: Deactivated successfully.
Feb 20 09:43:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3888d7f4117587f36bc7a7537526994b01e2d40b6eca482e4f33aa91a3c8ec33-merged.mount: Deactivated successfully.
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 
Feb 20 09:43:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:00.227 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:00.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:00.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:00.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 2026-02-20 09:43:00.240275779 +0000 UTC m=+0.085944738 container create a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, version=7, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container)
Feb 20 09:43:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:00.261 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:00.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936.scope.
Feb 20 09:43:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 2026-02-20 09:43:00.200000421 +0000 UTC m=+0.045669410 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 2026-02-20 09:43:00.312707081 +0000 UTC m=+0.158376040 container init a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 2026-02-20 09:43:00.323045571 +0000 UTC m=+0.168714530 container start a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, io.buildah.version=1.42.2, RELEASE=main, GIT_BRANCH=main, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:43:00 np0005625204.localdomain lucid_hopper[292009]: 167 167
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 2026-02-20 09:43:00.323313958 +0000 UTC m=+0.168982917 container attach a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.openshift.expose-services=, name=rhceph)
Feb 20 09:43:00 np0005625204.localdomain systemd[1]: libpod-a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936.scope: Deactivated successfully.
Feb 20 09:43:00 np0005625204.localdomain podman[291994]: 2026-02-20 09:43:00.328965598 +0000 UTC m=+0.174634617 container died a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhceph, ceph=True, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main)
Feb 20 09:43:00 np0005625204.localdomain podman[292014]: 2026-02-20 09:43:00.415796321 +0000 UTC m=+0.077315798 container remove a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hopper, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:43:00 np0005625204.localdomain systemd[1]: libpod-conmon-a020b2e7fa554a15924fdc3b64f4ddc0df4ac094402af3bf0bf46bc75b25a936.scope: Deactivated successfully.
Feb 20 09:43:00 np0005625204.localdomain sudo[291959]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:00 np0005625204.localdomain sudo[292030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:00 np0005625204.localdomain sudo[292030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:00 np0005625204.localdomain sudo[292030]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:43:00 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:00 np0005625204.localdomain sudo[292048]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:00 np0005625204.localdomain sudo[292048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 2026-02-20 09:43:01.103665726 +0000 UTC m=+0.074450063 container create 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, io.openshift.expose-services=, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, version=7, RELEASE=main, distribution-scope=public, name=rhceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1770267347)
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: Started libpod-conmon-802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2.scope.
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 2026-02-20 09:43:01.167584603 +0000 UTC m=+0.138368890 container init 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public)
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 2026-02-20 09:43:01.07309878 +0000 UTC m=+0.043883097 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 2026-02-20 09:43:01.177715866 +0000 UTC m=+0.148500153 container start 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, GIT_CLEAN=True)
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 2026-02-20 09:43:01.178217352 +0000 UTC m=+0.149001669 container attach 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, version=7, RELEASE=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, GIT_BRANCH=main, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z)
Feb 20 09:43:01 np0005625204.localdomain brave_clarke[292097]: 167 167
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: libpod-802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2.scope: Deactivated successfully.
Feb 20 09:43:01 np0005625204.localdomain podman[292082]: 2026-02-20 09:43:01.181365346 +0000 UTC m=+0.152149663 container died 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, ceph=True, version=7, GIT_CLEAN=True, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3c4e09e03ff688918d040ca7f542af7a49eeb95eb68544f947d554e00306a41c-merged.mount: Deactivated successfully.
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: tmp-crun.dvpS09.mount: Deactivated successfully.
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f748b64a3207ee2b2c1e27a7d254cc878c50407a5d8a5ee1c916bad482448952-merged.mount: Deactivated successfully.
Feb 20 09:43:01 np0005625204.localdomain podman[292102]: 2026-02-20 09:43:01.295050985 +0000 UTC m=+0.105101952 container remove 802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_clarke, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, vcs-type=git, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Feb 20 09:43:01 np0005625204.localdomain systemd[1]: libpod-conmon-802e3d703fbe55331e82837a71496c0eabc9a69c72e7ea15b83c5e461698fbf2.scope: Deactivated successfully.
Feb 20 09:43:01 np0005625204.localdomain sudo[292048]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:01 np0005625204.localdomain sudo[292119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:01 np0005625204.localdomain sudo[292119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:01 np0005625204.localdomain sudo[292119]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:01 np0005625204.localdomain sudo[292137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:43:01 np0005625204.localdomain sudo[292137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:01 np0005625204.localdomain ceph-mon[288586]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:01 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:43:01 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:43:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:02 np0005625204.localdomain sudo[292137]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:43:03 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/2262134840' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:43:03 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/2262134840' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:43:03 np0005625204.localdomain podman[292186]: 2026-02-20 09:43:03.151405564 +0000 UTC m=+0.082819574 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:43:03 np0005625204.localdomain podman[292186]: 2026-02-20 09:43:03.162393064 +0000 UTC m=+0.093807094 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:43:03 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:43:04 np0005625204.localdomain sudo[292209]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:43:04 np0005625204.localdomain sudo[292209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292209]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain ceph-mon[288586]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:04 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/2943791233' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:43:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:43:04 np0005625204.localdomain sudo[292227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:43:04 np0005625204.localdomain sudo[292227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292227]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:04 np0005625204.localdomain sudo[292245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292245]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:04 np0005625204.localdomain sudo[292263]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292263]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292281]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:04 np0005625204.localdomain sudo[292281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292281]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:04 np0005625204.localdomain sudo[292315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292315]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292333]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:04 np0005625204.localdomain sudo[292333]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292333]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:43:04 np0005625204.localdomain sudo[292351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292351]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:04 np0005625204.localdomain sudo[292369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292369]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:04 np0005625204.localdomain sudo[292387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292387]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:04 np0005625204.localdomain sudo[292405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292405]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:04 np0005625204.localdomain sudo[292423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:04 np0005625204.localdomain sudo[292423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:04 np0005625204.localdomain sudo[292423]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:05 np0005625204.localdomain sudo[292441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:05 np0005625204.localdomain sudo[292441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:05 np0005625204.localdomain sudo[292441]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:05 np0005625204.localdomain sudo[292475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:05 np0005625204.localdomain sudo[292475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:05 np0005625204.localdomain sudo[292475]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:05.263 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:05.265 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:05.265 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:05.265 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:05.266 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:05.268 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:05 np0005625204.localdomain sudo[292493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:05 np0005625204.localdomain sudo[292493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:05 np0005625204.localdomain sudo[292493]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:05 np0005625204.localdomain sudo[292511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:05 np0005625204.localdomain sudo[292511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:05 np0005625204.localdomain sudo[292511]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:05 np0005625204.localdomain sudo[292529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:43:05 np0005625204.localdomain sudo[292529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:05 np0005625204.localdomain sudo[292529]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:43:06.006 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:43:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:43:06.006 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:43:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:43:06.007 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='client.44119 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: Reconfig service osd.default_drive_group
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:43:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 e86: 6 total, 6 up, 6 in
Feb 20 09:43:07 np0005625204.localdomain sshd[289362]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:43:07 np0005625204.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Feb 20 09:43:07 np0005625204.localdomain systemd[1]: session-65.scope: Consumed 18.516s CPU time.
Feb 20 09:43:07 np0005625204.localdomain systemd-logind[759]: Session 65 logged out. Waiting for processes to exit.
Feb 20 09:43:07 np0005625204.localdomain systemd-logind[759]: Removed session 65.
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)...
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 562 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' 
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.14196 172.18.0.105:0/2584856206' entity='mgr.np0005625201.mtnyvu' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/863103056' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: Activating manager daemon np0005625199.ileebh
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: osdmap e86: 6 total, 6 up, 6 in
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 20 09:43:07 np0005625204.localdomain ceph-mon[288586]: mgrmap e20: np0005625199.ileebh(active, starting, since 0.0630848s), standbys: np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:43:07 np0005625204.localdomain sshd[292547]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:43:07 np0005625204.localdomain sshd[292547]: error: kex_exchange_identification: banner line contains invalid characters
Feb 20 09:43:07 np0005625204.localdomain sshd[292547]: banner exchange: Connection from 91.238.181.96 port 65449: invalid format
Feb 20 09:43:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:43:10 np0005625204.localdomain podman[292548]: 2026-02-20 09:43:10.144369226 +0000 UTC m=+0.081594368 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:43:10 np0005625204.localdomain podman[292548]: 2026-02-20 09:43:10.157114808 +0000 UTC m=+0.094339940 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:43:10 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:43:10 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:10.298 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:10.300 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:10.301 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5031 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:10.301 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:10.302 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:10.305 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:43:12 np0005625204.localdomain systemd[1]: tmp-crun.SJzBBv.mount: Deactivated successfully.
Feb 20 09:43:12 np0005625204.localdomain podman[292570]: 2026-02-20 09:43:12.144286511 +0000 UTC m=+0.084015270 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, container_name=openstack_network_exporter, config_id=openstack_network_exporter, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Feb 20 09:43:12 np0005625204.localdomain podman[292570]: 2026-02-20 09:43:12.162061153 +0000 UTC m=+0.101789962 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1770267347, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:43:12 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:43:12 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625201.mtnyvu started
Feb 20 09:43:13 np0005625204.localdomain ceph-mon[288586]: mgrmap e21: np0005625199.ileebh(active, starting, since 5s), standbys: np0005625200.ypbkax, np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:15.303 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:43:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:43:16 np0005625204.localdomain systemd[1]: tmp-crun.JQQxd9.mount: Deactivated successfully.
Feb 20 09:43:16 np0005625204.localdomain podman[292591]: 2026-02-20 09:43:16.152875551 +0000 UTC m=+0.091013380 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 20 09:43:16 np0005625204.localdomain podman[292591]: 2026-02-20 09:43:16.185321744 +0000 UTC m=+0.123459553 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 20 09:43:16 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:43:16 np0005625204.localdomain podman[292590]: 2026-02-20 09:43:16.208486968 +0000 UTC m=+0.148428401 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:43:16 np0005625204.localdomain podman[292590]: 2026-02-20 09:43:16.3012843 +0000 UTC m=+0.241225763 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:43:16 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 1002...
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Activating special unit Exit the Session...
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Removed slice User Background Tasks Slice.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped target Main User Target.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped target Basic System.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped target Paths.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped target Sockets.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped target Timers.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Closed D-Bus User Message Bus Socket.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Stopped Create User's Volatile Files and Directories.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Removed slice User Application Slice.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Reached target Shutdown.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Finished Exit the Session.
Feb 20 09:43:17 np0005625204.localdomain systemd[26592]: Reached target Exit the Session.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: user@1002.service: Deactivated successfully.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 1002.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: user@1002.service: Consumed 11.619s CPU time, read 0B from disk, written 7.0K to disk.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1002...
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: run-user-1002.mount: Deactivated successfully.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: user-runtime-dir@1002.service: Deactivated successfully.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1002.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 1002.
Feb 20 09:43:17 np0005625204.localdomain systemd[1]: user-1002.slice: Consumed 3min 57.069s CPU time.
Feb 20 09:43:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:43:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:43:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:43:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:43:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:43:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18254 "" "Go-http-client/1.1"
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.206 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.213 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'baed3cf0-0519-4d64-b0aa-7fe254dec918', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.208001', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '95faf6d2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '9bc7dffff159f89e07b21258329444f84059afe8234c9e7f7e79ab50c09c1236'}]}, 'timestamp': '2026-02-20 09:43:18.214749', '_unique_id': '7b7f6041f4b44af5a89378186d8a5749'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.216 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.217 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.217 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '531b533e-887b-474f-a6f2-a92b598cbaa3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.217694', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '95fb814c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'daf4cce22be5cb3551d73e43beded9fbec350b1c232e1d21ea0428ae404830fe'}]}, 'timestamp': '2026-02-20 09:43:18.218207', '_unique_id': 'ab4d8c58e04d4103a12e198e2d1aea44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.219 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.250 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.251 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cfd1ca9-402e-4fe0-8855-0c6857f43787', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.220349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96008872-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '6e66dce6ac5ec77b182591fce5909c77df4bef9a8ba8f0a8c9d28ed1c6202527'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.220349', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9600992a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '2599f8b3db8133089deab6e687f6ac13104386e4edf293ff42dbaddb664ed1d0'}]}, 'timestamp': '2026-02-20 09:43:18.251553', '_unique_id': 'ccc99a28f5d64d8197269646b4594167'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.252 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.264 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.265 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6f2ab90-5687-49e3-ad36-4b550e174853', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.253820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9602b6f6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '4c3cb180e0ed8d2f813d1ca4227f42b75a11f869380d8d3bf49bb960d942bd6a'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.253820', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9602c8d0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': 'e37565ac37d0cfe0b88f66d8fb08899ddb4bb155f3d9a359235818a5331397f0'}]}, 'timestamp': '2026-02-20 09:43:18.265922', '_unique_id': '64266122b76948b2bb269fd430622654'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.266 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0af25826-ac38-4447-ba89-ed9cbffb4c1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.268120', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960331da-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '0fb4daf7647d6c53e0ccc3a095f2d43bf1c5f674ae1ba8f1206fc0a73608f363'}]}, 'timestamp': '2026-02-20 09:43:18.268595', '_unique_id': '55513b72ba8a4e4db4da9f63ba513803'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.285 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 13990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8ff7006-6faa-4148-a2d5-8cd224e07557', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13990000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:43:18.270759', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '9605c6a2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.524129167, 'message_signature': 'f29f28b657a17f4b9c0e4a7b4c72a4555eb2cd02d32d2248dd95d034fb0811dd'}]}, 'timestamp': '2026-02-20 09:43:18.285500', '_unique_id': '15fe47e4c8434cd08383dd042cbe2710'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.286 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.287 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.287 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.288 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfe34221-283f-40e8-8573-e089067c8acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.287994', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960639f2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'c30f8ec4dcba8988f68786e481ae67826ca2ef12b021ea6a08391ed2abcc3921'}]}, 'timestamp': '2026-02-20 09:43:18.288464', '_unique_id': '949625cbd2b2428683fd15605ae9d63c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.289 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.290 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b59db32-4a96-450a-8926-02cfb7d2ac57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.290578', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '96069ff0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '3f3a59cf4275d2204616a86eff7edfde05a9df9d0e0e6138f9a261a315628a33'}]}, 'timestamp': '2026-02-20 09:43:18.291077', '_unique_id': '411f0307068e4ca59c639b3ac21d483b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.293 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.293 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8574e3e8-5f11-412f-9017-612e9d0643df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.293404', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '96070ce2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'd661d49f779d9581bad5ff0086481ebc3f757406f3db35560a56ac1a2bf4a20a'}]}, 'timestamp': '2026-02-20 09:43:18.293919', '_unique_id': '320e8414064f47bfa7d9079b7a203325'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.296 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e6dd7e7-a2b5-4e27-bbd8-96dd0898b602', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.295992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960771c8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '310202cb0cc557f12deab0d6169c439c2d5222de2a9636e60647bc45eae09a66'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.295992', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960781c2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '1fd1b530f7d3431f91290ca64ec1fddabcf8e6b1d8353fcb78802673e18638ec'}]}, 'timestamp': '2026-02-20 09:43:18.296857', '_unique_id': '83994accae994c69b7eca1d217b2a876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.297 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5fecb5b6-97d1-4f6d-93dd-2c96588c139b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.298995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9607e720-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'a62cc5dcbc2a430e4a2e8ddb9af8115102175a51aeed1a0c191a8752e7aa914e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.298995', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9607f710-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '62f0a3f86e34d365aa75455558dea6a38a328159390bd57d019d660142f0f4ac'}]}, 'timestamp': '2026-02-20 09:43:18.299857', '_unique_id': 'ff6324fa509f416d9b08851ac4d387ff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c64ace2b-fd18-4967-99f5-4a91c1227c80', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.302043', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '96085e62-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '2e751a97eda1c247112db5464b5520ae432b8006cbb313ca156f3738a7bff5c6'}]}, 'timestamp': '2026-02-20 09:43:18.302502', '_unique_id': '5df6b4c8d7f549d68cc8ee6bcfa31ad1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.305 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.305 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20eda941-73b1-4129-a662-576a87825324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.305030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '9608d4f0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'da9cdc032db6075d704c3e33a9de75ef62d5aba9aeb6a7a983fe00e71b8070ae'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.305030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '9608e6f2-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '30b0c4fcd9019ee7962ac6141d4a79d7545603d6e58ad7a9a41469c214678b36'}]}, 'timestamp': '2026-02-20 09:43:18.306001', '_unique_id': '7779f90e5acb42e5bef2b74f3fe96385'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.308 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26a5b4b9-e545-4c73-bc7b-3ec82b08f296', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.308132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '96094bec-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '18ca41596036b9fc7772f00f7a8bbfab56f3204611eb65936daa1cb01c398aba'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.308132', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '96095d26-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '8ff85905451a98fdc361163ad5151d76596e58f213949234e7eec928358f4c89'}]}, 'timestamp': '2026-02-20 09:43:18.308997', '_unique_id': 'ff66cdabedb54dfda9d7840d7fd560a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.310 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e8223d4-b32d-4607-954d-e9ef49338a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.311104', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '9609c054-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '9a30cd04ba3588994ec10fd8eb2d747bdcf7e342901af91502bca811fb2c8030'}]}, 'timestamp': '2026-02-20 09:43:18.311559', '_unique_id': 'd0b4c744fd954bb782fb186fffc45728'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba3964db-bc0c-49c6-a858-a43dce90c384', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.313593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960a227e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '6cc696e2a4b96d668cf28c3d9a47f4f88298a0e322bb0212c1ccb858fbbe6b06'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.313593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960a3386-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.493049175, 'message_signature': '4f16917163a53b1520a0476f0a8e0789bc1fc0ec8acdd1d0e9ebb6ab1bf4e200'}]}, 'timestamp': '2026-02-20 09:43:18.314482', '_unique_id': 'e0d02c335ecd4f8b8e1706310121bc98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1ae96e96-7c1c-4b2e-a048-342642bfc82e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.316773', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960a9dee-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': '0c478b48081fcd1eb084878c10b4cbb9d3e4d851b0b06132809ada16f8be3720'}]}, 'timestamp': '2026-02-20 09:43:18.317237', '_unique_id': '2461ce13138347acaf519a75b6097833'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7d38276-7115-489b-a2f9-5dfc62c1d94e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:43:18.319397', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '960b077a-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.524129167, 'message_signature': '74afc0aee02c01a1c5ba85c4166cfcc25472d5573be97e2b805ea44d4d090277'}]}, 'timestamp': '2026-02-20 09:43:18.319923', '_unique_id': 'acb40f8f85d94391958ccf35899231eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a657617e-073b-4b07-be7a-34ee5a4b744b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.322041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960b6bc0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '0a5557cc5d40fd7e6919fb30581323acbfb3b3d224bb4199c8fcb994b6f18613'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.322041', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960b7d7c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': '62bc397318ef4fb14721e959c46d463a7080214db6c4f7bfad22508bbd629a64'}]}, 'timestamp': '2026-02-20 09:43:18.322951', '_unique_id': 'f0d42480dbe14d7783d5b200e22c94b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5994a3db-31d9-4eaa-b973-0287c1184aaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:43:18.325529', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '960bf7c0-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.447198101, 'message_signature': 'dff897eee1be37e94535e593ac5bed20306a04c1464cfd6fefbac5b2ca4889cb'}]}, 'timestamp': '2026-02-20 09:43:18.326200', '_unique_id': '87168adcec314bd3a4e2a9ddbe02ccc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24e05402-250e-4aba-a7ce-e70af4e8f413', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:43:18.327818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '960c49fa-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'bb6f72b465852fac262dfc752f74f0c8aede9667b90ca9864b5966095ec0da43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:43:18.327818', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '960c547c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11157.459550251, 'message_signature': 'daef8fcd6fdbad86cfdf17002cda0bbc08a011ffd85c47f5ab209893764ffc9a'}]}, 'timestamp': '2026-02-20 09:43:18.328360', '_unique_id': '75afe92aca104fb0b39a9cba55eec963'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:43:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:43:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:43:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:20.306 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:21 np0005625204.localdomain sshd[292634]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:43:21 np0005625204.localdomain sshd[292634]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:43:24 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 20 09:43:24 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:24.961275) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:43:24 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 20 09:43:24 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580604961362, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 13927, "num_deletes": 767, "total_data_size": 23964475, "memory_usage": 24959952, "flush_reason": "Manual Compaction"}
Feb 20 09:43:24 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605029379, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 15075047, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 13932, "table_properties": {"data_size": 15012630, "index_size": 34791, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 275335, "raw_average_key_size": 25, "raw_value_size": 14829963, "raw_average_value_size": 1396, "num_data_blocks": 1344, "num_entries": 10618, "num_filter_entries": 10618, "num_deletions": 765, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 1771580480, "file_creation_time": 1771580604, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 68153 microseconds, and 32482 cpu microseconds.
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.029433) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 15075047 bytes OK
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.029455) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.031515) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.031529) EVENT_LOG_v1 {"time_micros": 1771580605031525, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.031544) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 23876971, prev total WAL file size 23876971, number of live WAL files 2.
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.035420) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(14MB) 8(1887B)]
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605035529, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 15076934, "oldest_snapshot_seqno": -1}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 9856 keys, 15063228 bytes, temperature: kUnknown
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605107313, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 15063228, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15002809, "index_size": 34718, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24645, "raw_key_size": 262283, "raw_average_key_size": 26, "raw_value_size": 14830021, "raw_average_value_size": 1504, "num_data_blocks": 1342, "num_entries": 9856, "num_filter_entries": 9856, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580605, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.107972) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 15063228 bytes
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.109902) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 209.4 rd, 209.2 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(14.4, 0.0 +0.0 blob) out(14.4 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 10623, records dropped: 767 output_compression: NoCompression
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.109941) EVENT_LOG_v1 {"time_micros": 1771580605109922, "job": 4, "event": "compaction_finished", "compaction_time_micros": 72012, "compaction_time_cpu_micros": 40311, "output_level": 6, "num_output_files": 1, "total_output_size": 15063228, "num_input_records": 10623, "num_output_records": 9856, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605113205, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580605113308, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:43:25.035286) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:43:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:25.308 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:25.310 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:25.311 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:25.311 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:25.353 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:25.353 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:43:26 np0005625204.localdomain podman[292636]: 2026-02-20 09:43:26.142168883 +0000 UTC m=+0.081155105 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:43:26 np0005625204.localdomain podman[292636]: 2026-02-20 09:43:26.158519493 +0000 UTC m=+0.097505755 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:43:26 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:43:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:43:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:43:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:43:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:43:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:43:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:43:30 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:30.355 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:30.357 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:30.357 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:30.357 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:30.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:30.361 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:43:34 np0005625204.localdomain podman[292656]: 2026-02-20 09:43:34.159963179 +0000 UTC m=+0.087057661 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:43:34 np0005625204.localdomain podman[292656]: 2026-02-20 09:43:34.171100423 +0000 UTC m=+0.098194915 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:43:34 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:43:35 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:35.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:39 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 e87: 6 total, 6 up, 6 in
Feb 20 09:43:40 np0005625204.localdomain ceph-mon[288586]: Activating manager daemon np0005625200.ypbkax
Feb 20 09:43:40 np0005625204.localdomain ceph-mon[288586]: Manager daemon np0005625199.ileebh is unresponsive, replacing it with standby daemon np0005625200.ypbkax
Feb 20 09:43:40 np0005625204.localdomain ceph-mon[288586]: osdmap e87: 6 total, 6 up, 6 in
Feb 20 09:43:40 np0005625204.localdomain ceph-mon[288586]: mgrmap e22: np0005625200.ypbkax(active, starting, since 0.0464926s), standbys: np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:40 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:40.363 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:40.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:40.366 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:40.366 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:40.399 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:40.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:40 np0005625204.localdomain sshd[292679]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:43:40 np0005625204.localdomain sshd[292679]: Accepted publickey for ceph-admin from 192.168.122.104 port 50688 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 1002.
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Feb 20 09:43:40 np0005625204.localdomain systemd-logind[759]: New session 66 of user ceph-admin.
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Starting User Manager for UID 1002...
Feb 20 09:43:40 np0005625204.localdomain podman[292681]: 2026-02-20 09:43:40.599424195 +0000 UTC m=+0.100422892 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:43:40 np0005625204.localdomain podman[292681]: 2026-02-20 09:43:40.61428363 +0000 UTC m=+0.115282367 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Queued start job for default target Main User Target.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Created slice User Application Slice.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Reached target Paths.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Reached target Timers.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Starting D-Bus User Message Bus Socket...
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Starting Create User's Volatile Files and Directories...
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Listening on D-Bus User Message Bus Socket.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Reached target Sockets.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Finished Create User's Volatile Files and Directories.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Reached target Basic System.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Reached target Main User Target.
Feb 20 09:43:40 np0005625204.localdomain systemd[292696]: Startup finished in 144ms.
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Started User Manager for UID 1002.
Feb 20 09:43:40 np0005625204.localdomain systemd[1]: Started Session 66 of User ceph-admin.
Feb 20 09:43:40 np0005625204.localdomain sshd[292679]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:43:40 np0005625204.localdomain sudo[292722]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:40 np0005625204.localdomain sudo[292722]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:40 np0005625204.localdomain sudo[292722]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:41 np0005625204.localdomain sudo[292740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 09:43:41 np0005625204.localdomain sudo[292740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr metadata", "who": "np0005625200.ypbkax", "id": "np0005625200.ypbkax"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: Manager daemon np0005625200.ypbkax is now available
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: removing stray HostCache host record np0005625199.localdomain.devices.0
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"}]': finished
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625199.localdomain.devices.0"}]': finished
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/mirror_snapshot_schedule"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/mirror_snapshot_schedule"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/trash_purge_schedule"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625200.ypbkax/trash_purge_schedule"} : dispatch
Feb 20 09:43:41 np0005625204.localdomain sudo[292740]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:41 np0005625204.localdomain sudo[292779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:41 np0005625204.localdomain sudo[292779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:41 np0005625204.localdomain sudo[292779]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:41 np0005625204.localdomain sudo[292797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:43:41 np0005625204.localdomain sudo[292797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: mgrmap e23: np0005625200.ypbkax(active, since 1.12428s), standbys: np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='client.34262 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:42 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Bus STARTING
Feb 20 09:43:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:43:42 np0005625204.localdomain podman[292857]: 2026-02-20 09:43:42.556729291 +0000 UTC m=+0.085556726 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, version=9.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:43:42 np0005625204.localdomain podman[292857]: 2026-02-20 09:43:42.641145742 +0000 UTC m=+0.169973147 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, managed_by=edpm_ansible, version=9.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Feb 20 09:43:42 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:43:42 np0005625204.localdomain systemd[1]: tmp-crun.4x2imb.mount: Deactivated successfully.
Feb 20 09:43:42 np0005625204.localdomain podman[292906]: 2026-02-20 09:43:42.786547472 +0000 UTC m=+0.100934488 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:43:42 np0005625204.localdomain podman[292906]: 2026-02-20 09:43:42.898061425 +0000 UTC m=+0.212448511 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Serving on https://172.18.0.104:7150
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Client ('172.18.0.104', 40458) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Serving on http://172.18.0.104:8765
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:41] ENGINE Bus STARTED
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: mgrmap e24: np0005625200.ypbkax(active, since 2s), standbys: np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:43 np0005625204.localdomain sudo[292797]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:43 np0005625204.localdomain sudo[293028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:43 np0005625204.localdomain sudo[293028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:43 np0005625204.localdomain sudo[293028]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:43 np0005625204.localdomain sudo[293046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:43:43 np0005625204.localdomain sudo[293046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:44 np0005625204.localdomain sudo[293046]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:44 np0005625204.localdomain sudo[293096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:44 np0005625204.localdomain sudo[293096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:44 np0005625204.localdomain sudo[293096]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:44 np0005625204.localdomain sudo[293114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:43:44 np0005625204.localdomain sudo[293114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:44 np0005625204.localdomain sudo[293114]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:43:45 np0005625204.localdomain sudo[293151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293151]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:43:45 np0005625204.localdomain sudo[293169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293169]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:45 np0005625204.localdomain sudo[293187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:45 np0005625204.localdomain sudo[293187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293187]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:45 np0005625204.localdomain sudo[293205]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293205]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:45 np0005625204.localdomain sudo[293223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293223]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.443 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5044 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.445 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.449 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: mgrmap e25: np0005625200.ypbkax(active, since 4s), standbys: np0005625202.arwxwo, np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:43:45 np0005625204.localdomain sudo[293257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:45 np0005625204.localdomain sudo[293257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293257]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:45 np0005625204.localdomain sudo[293275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293275]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:43:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:45.739 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:43:45 np0005625204.localdomain sudo[293293]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:43:45 np0005625204.localdomain sudo[293293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293293]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:45 np0005625204.localdomain sudo[293312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293312]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:45 np0005625204.localdomain sudo[293330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293330]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:45 np0005625204.localdomain sudo[293367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:45 np0005625204.localdomain sudo[293367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:45 np0005625204.localdomain sudo[293367]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain sudo[293385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:46 np0005625204.localdomain sudo[293385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293385]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain sudo[293403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:46 np0005625204.localdomain sudo[293403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293403]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/117158667' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.210 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.269 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.270 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:43:46 np0005625204.localdomain sudo[293439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:43:46 np0005625204.localdomain sudo[293439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293439]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain sudo[293464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:46 np0005625204.localdomain sudo[293464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:43:46 np0005625204.localdomain sudo[293464]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain podman[293456]: 2026-02-20 09:43:46.421746207 +0000 UTC m=+0.113535835 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:43:46 np0005625204.localdomain podman[293456]: 2026-02-20 09:43:46.463095407 +0000 UTC m=+0.154885035 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:43:46 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:43:46 np0005625204.localdomain sudo[293499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain sudo[293499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293499]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.538 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.540 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11867MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.540 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.542 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:43:46 np0005625204.localdomain podman[293493]: 2026-02-20 09:43:46.558790186 +0000 UTC m=+0.132244106 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: from='client.27196 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625202", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:46 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/117158667' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:46 np0005625204.localdomain sudo[293522]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:43:46 np0005625204.localdomain sudo[293522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293522]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain podman[293493]: 2026-02-20 09:43:46.637991021 +0000 UTC m=+0.211444951 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:43:46 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.657 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.657 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:43:46 np0005625204.localdomain sudo[293552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:43:46 np0005625204.localdomain sudo[293552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293552]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:46.707 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:43:46 np0005625204.localdomain sudo[293572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:46 np0005625204.localdomain sudo[293572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293572]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain sudo[293591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:46 np0005625204.localdomain sudo[293591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293591]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:46 np0005625204.localdomain sudo[293626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:46 np0005625204.localdomain sudo[293626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:46 np0005625204.localdomain sudo[293626]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:47 np0005625204.localdomain sudo[293662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293662]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293680]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:47 np0005625204.localdomain sudo[293680]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293680]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1470667621' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:47 np0005625204.localdomain sudo[293698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293698]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:47.189 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:43:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:47.197 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:43:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:47.220 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:43:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:47.223 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:43:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:47.224 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:43:47 np0005625204.localdomain sudo[293718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:47 np0005625204.localdomain sudo[293718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293718]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:47 np0005625204.localdomain sudo[293736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293736]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:43:47 np0005625204.localdomain sudo[293754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293754]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:47 np0005625204.localdomain sudo[293772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293772]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:43:47 np0005625204.localdomain sudo[293790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293790]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/1470667621' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:47 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/3259045040' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:43:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:43:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:43:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:43:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:43:47 np0005625204.localdomain sudo[293824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:43:47 np0005625204.localdomain sudo[293824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293824]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:43:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18258 "" "Go-http-client/1.1"
Feb 20 09:43:47 np0005625204.localdomain sudo[293842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:43:47 np0005625204.localdomain sudo[293842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293842]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:47 np0005625204.localdomain sudo[293860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:43:47 np0005625204.localdomain sudo[293860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:47 np0005625204.localdomain sudo[293860]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:48.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:48.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:48 np0005625204.localdomain sudo[293878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:43:48 np0005625204.localdomain sudo[293878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:48 np0005625204.localdomain sudo[293878]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:48.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:48.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/1060514335' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/2312330214' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/2137806559' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:48 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/4196256663' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:43:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:49.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:49.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:43:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:49.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625200 (monmap changed)...
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625200 on np0005625200.localdomain
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625200.ypbkax", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:43:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.028 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.028 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.029 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.029 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.433 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.445 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.449 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.452 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.452 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.453 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.454 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:50.454 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625200.ypbkax (monmap changed)...
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625200.ypbkax on np0005625200.localdomain
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:43:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:51.449 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:51.450 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)...
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:43:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)...
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/3972118785' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:43:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)...
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' 
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 ' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.24104 172.18.0.104:0/2092071211' entity='mgr.np0005625200.ypbkax' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 e88: 6 total, 6 up, 6 in
Feb 20 09:43:54 np0005625204.localdomain sshd[292679]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:43:54 np0005625204.localdomain systemd[1]: session-66.scope: Deactivated successfully.
Feb 20 09:43:54 np0005625204.localdomain systemd[1]: session-66.scope: Consumed 6.710s CPU time.
Feb 20 09:43:54 np0005625204.localdomain systemd-logind[759]: Session 66 logged out. Waiting for processes to exit.
Feb 20 09:43:54 np0005625204.localdomain systemd-logind[759]: Removed session 66.
Feb 20 09:43:54 np0005625204.localdomain sshd[293896]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:43:54 np0005625204.localdomain sshd[293896]: Accepted publickey for ceph-admin from 192.168.122.106 port 46868 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 09:43:54 np0005625204.localdomain systemd-logind[759]: New session 68 of user ceph-admin.
Feb 20 09:43:54 np0005625204.localdomain systemd[1]: Started Session 68 of User ceph-admin.
Feb 20 09:43:54 np0005625204.localdomain sshd[293896]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:43:54 np0005625204.localdomain sudo[293900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:54 np0005625204.localdomain sudo[293900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:54 np0005625204.localdomain sudo[293900]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:54 np0005625204.localdomain sudo[293918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:43:54 np0005625204.localdomain sudo[293918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/3880794004' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: Activating manager daemon np0005625202.arwxwo
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: osdmap e88: 6 total, 6 up, 6 in
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/3880794004' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: mgrmap e26: np0005625202.arwxwo(active, starting, since 0.0432458s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625200"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: Manager daemon np0005625202.arwxwo is now available
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/mirror_snapshot_schedule"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/mirror_snapshot_schedule"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/trash_purge_schedule"} : dispatch
Feb 20 09:43:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/trash_purge_schedule"} : dispatch
Feb 20 09:43:55 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:43:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:55.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:55.480 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:43:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:55.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:43:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:55.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:55.483 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:43:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:43:55.483 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:43:55 np0005625204.localdomain systemd[1]: tmp-crun.8BfIHi.mount: Deactivated successfully.
Feb 20 09:43:55 np0005625204.localdomain podman[294004]: 2026-02-20 09:43:55.628188687 +0000 UTC m=+0.096686570 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, name=rhceph, version=7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, ceph=True, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64)
Feb 20 09:43:55 np0005625204.localdomain podman[294004]: 2026-02-20 09:43:55.758160614 +0000 UTC m=+0.226658487 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1770267347, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, RELEASE=main, io.buildah.version=1.42.2, vcs-type=git, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:43:56 np0005625204.localdomain ceph-mon[288586]: mgrmap e27: np0005625202.arwxwo(active, since 1.05623s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:56 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Bus STARTING
Feb 20 09:43:56 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Serving on http://172.18.0.106:8765
Feb 20 09:43:56 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Serving on https://172.18.0.106:7150
Feb 20 09:43:56 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Bus STARTED
Feb 20 09:43:56 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:43:55] ENGINE Client ('172.18.0.106', 42946) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:43:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:43:56 np0005625204.localdomain podman[294092]: 2026-02-20 09:43:56.298242828 +0000 UTC m=+0.094670050 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:43:56 np0005625204.localdomain podman[294092]: 2026-02-20 09:43:56.30800384 +0000 UTC m=+0.104431092 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:43:56 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:43:56 np0005625204.localdomain sudo[293918]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:56 np0005625204.localdomain sudo[294141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:56 np0005625204.localdomain sudo[294141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:56 np0005625204.localdomain sudo[294141]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:43:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:43:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:43:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:43:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:43:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:43:56 np0005625204.localdomain sudo[294159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:43:56 np0005625204.localdomain sudo[294159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: mgrmap e28: np0005625202.arwxwo(active, since 2s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:57 np0005625204.localdomain sudo[294159]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:57 np0005625204.localdomain sudo[294208]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:43:57 np0005625204.localdomain sudo[294208]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:57 np0005625204.localdomain sudo[294208]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:57 np0005625204.localdomain sudo[294226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:43:57 np0005625204.localdomain sudo[294226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:57 np0005625204.localdomain sudo[294226]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:57 np0005625204.localdomain sshd[294262]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:43:58 np0005625204.localdomain sudo[294264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:43:58 np0005625204.localdomain sudo[294264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294264]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:43:58 np0005625204.localdomain sudo[294282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294282]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sshd[294262]: Invalid user titu from 18.221.252.160 port 56388
Feb 20 09:43:58 np0005625204.localdomain sshd[294262]: Received disconnect from 18.221.252.160 port 56388:11: Bye Bye [preauth]
Feb 20 09:43:58 np0005625204.localdomain sshd[294262]: Disconnected from invalid user titu 18.221.252.160 port 56388 [preauth]
Feb 20 09:43:58 np0005625204.localdomain sudo[294300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:58 np0005625204.localdomain sudo[294300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294300]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:58 np0005625204.localdomain sudo[294318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294318]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:58 np0005625204.localdomain sudo[294336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294336]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:58 np0005625204.localdomain sudo[294370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294370]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd/host:np0005625200", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: mgrmap e29: np0005625202.arwxwo(active, since 4s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:43:58 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain sudo[294388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:43:58 np0005625204.localdomain sudo[294388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294388]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:43:58 np0005625204.localdomain sudo[294406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294406]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:58 np0005625204.localdomain sudo[294424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294424]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:43:58 np0005625204.localdomain sudo[294442]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294442]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:58 np0005625204.localdomain sudo[294460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:58 np0005625204.localdomain sudo[294460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:58 np0005625204.localdomain sudo[294460]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294478]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:59 np0005625204.localdomain sudo[294478]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294478]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:59 np0005625204.localdomain sudo[294496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294496]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:59 np0005625204.localdomain sudo[294530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294530]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294548]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:43:59 np0005625204.localdomain sudo[294548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294548]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:43:59 np0005625204.localdomain sudo[294566]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294566]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:43:59 np0005625204.localdomain sudo[294584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294584]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:43:59 np0005625204.localdomain sudo[294602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294602]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:59 np0005625204.localdomain sudo[294620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294620]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:43:59 np0005625204.localdomain sudo[294638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294638]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294656]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:59 np0005625204.localdomain sudo[294656]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294656]: pam_unix(sudo:session): session closed for user root
Feb 20 09:43:59 np0005625204.localdomain sudo[294690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:43:59 np0005625204.localdomain sudo[294690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:43:59 np0005625204.localdomain sudo[294690]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625200.ypbkax started
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain sudo[294708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:44:00 np0005625204.localdomain sudo[294708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294708]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain sudo[294726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain sudo[294726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294726]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:00 np0005625204.localdomain sudo[294744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:44:00 np0005625204.localdomain sudo[294744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294744]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain sudo[294762]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:44:00 np0005625204.localdomain sudo[294762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294762]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain sudo[294780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:44:00 np0005625204.localdomain sudo[294780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294780]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain sudo[294798]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:00 np0005625204.localdomain sudo[294798]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294798]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:00.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:00.486 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:00.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:00.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:00 np0005625204.localdomain sudo[294816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:44:00 np0005625204.localdomain sudo[294816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294816]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:00.538 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:00.539 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:00 np0005625204.localdomain sudo[294850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:44:00 np0005625204.localdomain sudo[294850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294850]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain sudo[294868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:44:00 np0005625204.localdomain sudo[294868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294868]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:00 np0005625204.localdomain sudo[294886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:00 np0005625204.localdomain sudo[294886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:00 np0005625204.localdomain sudo[294886]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: Updating np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: mgrmap e30: np0005625202.arwxwo(active, since 6s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625200.ypbkax", "id": "np0005625200.ypbkax"} : dispatch
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:44:01 np0005625204.localdomain sudo[294905]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:44:01 np0005625204.localdomain sudo[294905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:01 np0005625204.localdomain sudo[294905]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/2943728084' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:44:02 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/2943728084' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:03 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:04.997236) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580644997355, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1806, "num_deletes": 254, "total_data_size": 9515829, "memory_usage": 10070552, "flush_reason": "Manual Compaction"}
Feb 20 09:44:04 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645029221, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5847284, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13939, "largest_seqno": 15738, "table_properties": {"data_size": 5839303, "index_size": 4614, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 20862, "raw_average_key_size": 22, "raw_value_size": 5822041, "raw_average_value_size": 6383, "num_data_blocks": 197, "num_entries": 912, "num_filter_entries": 912, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580605, "oldest_key_time": 1771580605, "file_creation_time": 1771580644, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 32048 microseconds, and 11488 cpu microseconds.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.029296) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5847284 bytes OK
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.029336) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.031159) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.031182) EVENT_LOG_v1 {"time_micros": 1771580645031176, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.031214) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 9506701, prev total WAL file size 9514805, number of live WAL files 2.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.033035) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353036' seq:72057594037927935, type:22 .. '6D6772737461740033373537' seq:0, type:0; will stop at (end)
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5710KB)], [15(14MB)]
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645033115, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 20910512, "oldest_snapshot_seqno": -1}
Feb 20 09:44:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10235 keys, 18649997 bytes, temperature: kUnknown
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645121184, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 18649997, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18590071, "index_size": 33265, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25605, "raw_key_size": 271622, "raw_average_key_size": 26, "raw_value_size": 18413740, "raw_average_value_size": 1799, "num_data_blocks": 1289, "num_entries": 10235, "num_filter_entries": 10235, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.121536) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 18649997 bytes
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.123036) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.1 rd, 211.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.6, 14.4 +0.0 blob) out(17.8 +0.0 blob), read-write-amplify(6.8) write-amplify(3.2) OK, records in: 10768, records dropped: 533 output_compression: NoCompression
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.123063) EVENT_LOG_v1 {"time_micros": 1771580645123052, "job": 6, "event": "compaction_finished", "compaction_time_micros": 88184, "compaction_time_cpu_micros": 49158, "output_level": 6, "num_output_files": 1, "total_output_size": 18649997, "num_input_records": 10768, "num_output_records": 10235, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645123947, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645126158, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.032874) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126233) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126245) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126248) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.126851) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645126974, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 275, "num_deletes": 264, "total_data_size": 20879, "memory_usage": 28472, "flush_reason": "Manual Compaction"}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645129475, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 13496, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15740, "largest_seqno": 16013, "table_properties": {"data_size": 11691, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4192, "raw_average_key_size": 15, "raw_value_size": 8092, "raw_average_value_size": 29, "num_data_blocks": 2, "num_entries": 274, "num_filter_entries": 274, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580645, "oldest_key_time": 1771580645, "file_creation_time": 1771580645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 2635 microseconds, and 1199 cpu microseconds.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.129517) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 13496 bytes OK
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.129540) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131089) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131105) EVENT_LOG_v1 {"time_micros": 1771580645131101, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131133) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 18714, prev total WAL file size 18714, number of live WAL files 2.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131736) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303039' seq:72057594037927935, type:22 .. '6B760031323734' seq:0, type:0; will stop at (end)
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(13KB)], [18(17MB)]
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645131778, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18663493, "oldest_snapshot_seqno": -1}
Feb 20 09:44:05 np0005625204.localdomain podman[294923]: 2026-02-20 09:44:05.150336912 +0000 UTC m=+0.087465153 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:44:05 np0005625204.localdomain podman[294923]: 2026-02-20 09:44:05.188118125 +0000 UTC m=+0.125246376 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 9972 keys, 17685845 bytes, temperature: kUnknown
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645203821, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17685845, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17629019, "index_size": 30805, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 267681, "raw_average_key_size": 26, "raw_value_size": 17458464, "raw_average_value_size": 1750, "num_data_blocks": 1165, "num_entries": 9972, "num_filter_entries": 9972, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580645, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.204200) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17685845 bytes
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.207188) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 258.5 rd, 244.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 17.8 +0.0 blob) out(16.9 +0.0 blob), read-write-amplify(2693.3) write-amplify(1310.5) OK, records in: 10509, records dropped: 537 output_compression: NoCompression
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.207209) EVENT_LOG_v1 {"time_micros": 1771580645207199, "job": 8, "event": "compaction_finished", "compaction_time_micros": 72205, "compaction_time_cpu_micros": 37141, "output_level": 6, "num_output_files": 1, "total_output_size": 17685845, "num_input_records": 10509, "num_output_records": 9972, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645207350, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580645209149, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.131660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209246) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209256) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:05.209269) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:05 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:05 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:44:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:05.538 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:44:06.007 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:44:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:44:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:44:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:44:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:44:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: from='client.34378 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:08 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:44:08 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:44:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:44:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='client.27331 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625200", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:44:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:10 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:10 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 09:44:10 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@2(peon) e10  my rank is now 1 (was 2)
Feb 20 09:44:10 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44000 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Feb 20 09:44:10 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44160 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Feb 20 09:44:10 np0005625204.localdomain ceph-osd[32226]: --2- [v2:172.18.0.108:6800/2098983975,v1:172.18.0.108:6801/2098983975] >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x55bf91f9d800 0x55bf8ee57180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Feb 20 09:44:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:10.541 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:10.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:10.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:10.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:10.585 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:10.586 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:44:11 np0005625204.localdomain podman[294946]: 2026-02-20 09:44:11.128725476 +0000 UTC m=+0.069238420 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:44:11 np0005625204.localdomain podman[294946]: 2026-02-20 09:44:11.166102804 +0000 UTC m=+0.106615738 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:44:11 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: paxos.1).electionLogic(38) init, last seen epoch 38
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='client.27341 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005625200"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: Remove daemons mon.np0005625200
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: Safe to remove mon.np0005625200: new quorum should be ['np0005625201', 'np0005625204', 'np0005625203', 'np0005625202'] (from ['np0005625201', 'np0005625204', 'np0005625203', 'np0005625202'])
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: Removing monitor np0005625200 from monmap...
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon rm", "name": "np0005625200"} : dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: Removing daemon mon.np0005625200 from np0005625200.localdomain -- ports []
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625203,np0005625202 in quorum (ranks 0,1,2,3)
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: monmap epoch 10
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:44:10.215299+0000
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: osdmap e88: 6 total, 6 up, 6 in
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: mgrmap e30: np0005625202.arwxwo(active, since 18s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:44:13 np0005625204.localdomain podman[294969]: 2026-02-20 09:44:13.144583523 +0000 UTC m=+0.084000998 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.7, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:44:13 np0005625204.localdomain podman[294969]: 2026-02-20 09:44:13.158900188 +0000 UTC m=+0.098317623 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, version=9.7, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter)
Feb 20 09:44:13 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:44:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:15 np0005625204.localdomain sudo[294989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:44:15 np0005625204.localdomain sudo[294989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:15 np0005625204.localdomain sudo[294989]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625203 (monmap changed)...
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:15 np0005625204.localdomain sudo[295007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:15 np0005625204.localdomain sudo[295007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:15.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:15.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:15.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:15.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:15.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:15.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:15 np0005625204.localdomain podman[295039]: 
Feb 20 09:44:15 np0005625204.localdomain podman[295039]: 2026-02-20 09:44:15.924793323 +0000 UTC m=+0.079724317 container create ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, distribution-scope=public, version=7, build-date=2026-02-09T10:25:24Z, RELEASE=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:44:15 np0005625204.localdomain systemd[1]: Started libpod-conmon-ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997.scope.
Feb 20 09:44:15 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:44:15 np0005625204.localdomain podman[295039]: 2026-02-20 09:44:15.89217896 +0000 UTC m=+0.047109984 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:44:16 np0005625204.localdomain podman[295039]: 2026-02-20 09:44:16.002138331 +0000 UTC m=+0.157069315 container init ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, release=1770267347, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, RELEASE=main)
Feb 20 09:44:16 np0005625204.localdomain podman[295039]: 2026-02-20 09:44:16.013446101 +0000 UTC m=+0.168377085 container start ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.buildah.version=1.42.2, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 20 09:44:16 np0005625204.localdomain podman[295039]: 2026-02-20 09:44:16.013702628 +0000 UTC m=+0.168633642 container attach ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container)
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: libpod-ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997.scope: Deactivated successfully.
Feb 20 09:44:16 np0005625204.localdomain reverent_noether[295054]: 167 167
Feb 20 09:44:16 np0005625204.localdomain podman[295039]: 2026-02-20 09:44:16.0190705 +0000 UTC m=+0.174001494 container died ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, release=1770267347, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:44:16 np0005625204.localdomain podman[295059]: 2026-02-20 09:44:16.123867755 +0000 UTC m=+0.091437278 container remove ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_noether, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, io.openshift.expose-services=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z)
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: libpod-conmon-ba20205c6d30a5994410c0dc88b8325ebc22c669f3b596107d27aa2be5933997.scope: Deactivated successfully.
Feb 20 09:44:16 np0005625204.localdomain sudo[295007]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:16 np0005625204.localdomain sudo[295075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:44:16 np0005625204.localdomain sudo[295075]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:16 np0005625204.localdomain sudo[295075]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:16 np0005625204.localdomain sudo[295093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:16 np0005625204.localdomain sudo[295093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: from='client.27452 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625200.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: Removed label mon from host np0005625200.localdomain
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:44:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 
Feb 20 09:44:16 np0005625204.localdomain podman[295127]: 2026-02-20 09:44:16.836768661 +0000 UTC m=+0.091548081 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 2026-02-20 09:44:16.847580937 +0000 UTC m=+0.081622600 container create 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.42.2, ceph=True, build-date=2026-02-09T10:25:24Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Feb 20 09:44:16 np0005625204.localdomain podman[295127]: 2026-02-20 09:44:16.868358075 +0000 UTC m=+0.123137515 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: Started libpod-conmon-3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114.scope.
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:44:16 np0005625204.localdomain podman[295128]: 2026-02-20 09:44:16.894942257 +0000 UTC m=+0.149597703 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 2026-02-20 09:44:16.918805922 +0000 UTC m=+0.152847525 container init 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, architecture=x86_64, build-date=2026-02-09T10:25:24Z, version=7, GIT_CLEAN=True, CEPH_POINT_RELEASE=, release=1770267347, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc.)
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 2026-02-20 09:44:16.819922695 +0000 UTC m=+0.053964308 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 2026-02-20 09:44:16.929127254 +0000 UTC m=+0.163168877 container start 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, GIT_CLEAN=True, GIT_BRANCH=main, release=1770267347, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 2026-02-20 09:44:16.930356248 +0000 UTC m=+0.164397841 container attach 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.component=rhceph-container, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1770267347, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:44:16 np0005625204.localdomain competent_mirzakhani[295183]: 167 167
Feb 20 09:44:16 np0005625204.localdomain podman[295128]: 2026-02-20 09:44:16.930485753 +0000 UTC m=+0.185141169 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2aeb1a08d248c8a968aaccb4f356f1ba0c1d0611474ef7f1749fd8ddc73e3f34-merged.mount: Deactivated successfully.
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: libpod-3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114.scope: Deactivated successfully.
Feb 20 09:44:16 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:44:16 np0005625204.localdomain podman[295141]: 2026-02-20 09:44:16.985970592 +0000 UTC m=+0.220012225 container died 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, release=1770267347, version=7, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.expose-services=, ceph=True, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:44:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-6b01395a0477de3600c6caafe1127fb6de453160cbc61868ec55cd50b5eea17d-merged.mount: Deactivated successfully.
Feb 20 09:44:17 np0005625204.localdomain podman[295188]: 2026-02-20 09:44:17.021929389 +0000 UTC m=+0.081685151 container remove 3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=competent_mirzakhani, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, architecture=x86_64, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, CEPH_POINT_RELEASE=, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph)
Feb 20 09:44:17 np0005625204.localdomain systemd[1]: libpod-conmon-3be258306d905e670972c477cb51a8757f188dcd161cc2a119176e4bb519c114.scope: Deactivated successfully.
Feb 20 09:44:17 np0005625204.localdomain sudo[295093]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:17 np0005625204.localdomain sudo[295212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:44:17 np0005625204.localdomain sudo[295212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:17 np0005625204.localdomain sudo[295212]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:17 np0005625204.localdomain sudo[295230]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:17 np0005625204.localdomain sudo[295230]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.0 (monmap changed)...
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='client.44283 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625200.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: Removed label mgr from host np0005625200.localdomain
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:44:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:44:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:44:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:44:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:44:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:44:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18262 "" "Go-http-client/1.1"
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 2026-02-20 09:44:17.880442276 +0000 UTC m=+0.064196627 container create 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, GIT_BRANCH=main, CEPH_POINT_RELEASE=, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, distribution-scope=public, RELEASE=main, ceph=True, build-date=2026-02-09T10:25:24Z)
Feb 20 09:44:17 np0005625204.localdomain systemd[1]: Started libpod-conmon-4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68.scope.
Feb 20 09:44:17 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 2026-02-20 09:44:17.942853701 +0000 UTC m=+0.126608052 container init 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, ceph=True, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 2026-02-20 09:44:17.850286163 +0000 UTC m=+0.034040554 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 2026-02-20 09:44:17.952777662 +0000 UTC m=+0.136532013 container start 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64)
Feb 20 09:44:17 np0005625204.localdomain upbeat_curran[295276]: 167 167
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 2026-02-20 09:44:17.953004999 +0000 UTC m=+0.136759350 container attach 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 20 09:44:17 np0005625204.localdomain systemd[1]: libpod-4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68.scope: Deactivated successfully.
Feb 20 09:44:17 np0005625204.localdomain podman[295261]: 2026-02-20 09:44:17.958858514 +0000 UTC m=+0.142612865 container died 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-type=git, build-date=2026-02-09T10:25:24Z, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:44:18 np0005625204.localdomain systemd[1]: tmp-crun.0ZvC2d.mount: Deactivated successfully.
Feb 20 09:44:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-7915fbe5bfbe5ba878e439366042180654b88659784cc214a7e4c22c72ebb1dc-merged.mount: Deactivated successfully.
Feb 20 09:44:18 np0005625204.localdomain podman[295281]: 2026-02-20 09:44:18.055117047 +0000 UTC m=+0.089167354 container remove 4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_curran, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, release=1770267347, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph)
Feb 20 09:44:18 np0005625204.localdomain systemd[1]: libpod-conmon-4214b5d6b15737b880a4c2b691a24be505cd6fa23ddef66d5c65dc8bd8e06a68.scope: Deactivated successfully.
Feb 20 09:44:18 np0005625204.localdomain sudo[295230]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:18 np0005625204.localdomain sudo[295304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:44:18 np0005625204.localdomain sudo[295304]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:18 np0005625204.localdomain sudo[295304]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:18 np0005625204.localdomain sudo[295322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:18 np0005625204.localdomain sudo[295322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)...
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 2026-02-20 09:44:18.893843463 +0000 UTC m=+0.077318508 container create b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.42.2, release=1770267347, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git)
Feb 20 09:44:18 np0005625204.localdomain systemd[1]: Started libpod-conmon-b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5.scope.
Feb 20 09:44:18 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 2026-02-20 09:44:18.959952904 +0000 UTC m=+0.143427949 container init b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, RELEASE=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.42.2, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 2026-02-20 09:44:18.86545841 +0000 UTC m=+0.048933485 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 2026-02-20 09:44:18.968881506 +0000 UTC m=+0.152356561 container start b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., release=1770267347, GIT_CLEAN=True, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z)
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 2026-02-20 09:44:18.969347709 +0000 UTC m=+0.152822784 container attach b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64)
Feb 20 09:44:18 np0005625204.localdomain dazzling_diffie[295373]: 167 167
Feb 20 09:44:18 np0005625204.localdomain systemd[1]: libpod-b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5.scope: Deactivated successfully.
Feb 20 09:44:18 np0005625204.localdomain podman[295358]: 2026-02-20 09:44:18.972331433 +0000 UTC m=+0.155806508 container died b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:44:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-337adc02fe9baf23e554155ed2c678d7a7aef48b39255cae15d0e3b5a45b57c6-merged.mount: Deactivated successfully.
Feb 20 09:44:19 np0005625204.localdomain podman[295378]: 2026-02-20 09:44:19.064082409 +0000 UTC m=+0.084799399 container remove b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=dazzling_diffie, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347)
Feb 20 09:44:19 np0005625204.localdomain systemd[1]: libpod-conmon-b05429dc938dc957368bbd998a857cfd494b9346e91e9370ba948f651daf7ff5.scope: Deactivated successfully.
Feb 20 09:44:19 np0005625204.localdomain sudo[295322]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:19 np0005625204.localdomain sudo[295395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:44:19 np0005625204.localdomain sudo[295395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:19 np0005625204.localdomain sudo[295395]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:19 np0005625204.localdomain sudo[295413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:19 np0005625204.localdomain sudo[295413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='client.27464 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625200.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: Removed label _admin from host np0005625200.localdomain
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 2026-02-20 09:44:19.768895238 +0000 UTC m=+0.074791857 container create f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, distribution-scope=public, CEPH_POINT_RELEASE=, version=7, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:44:19 np0005625204.localdomain systemd[1]: Started libpod-conmon-f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4.scope.
Feb 20 09:44:19 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 2026-02-20 09:44:19.828937996 +0000 UTC m=+0.134834615 container init f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, io.buildah.version=1.42.2, GIT_BRANCH=main, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 2026-02-20 09:44:19.738849688 +0000 UTC m=+0.044746377 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 2026-02-20 09:44:19.838213618 +0000 UTC m=+0.144110257 container start f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, release=1770267347, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, version=7, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 2026-02-20 09:44:19.838499897 +0000 UTC m=+0.144396566 container attach f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:44:19 np0005625204.localdomain suspicious_sanderson[295463]: 167 167
Feb 20 09:44:19 np0005625204.localdomain systemd[1]: libpod-f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4.scope: Deactivated successfully.
Feb 20 09:44:19 np0005625204.localdomain podman[295448]: 2026-02-20 09:44:19.841820841 +0000 UTC m=+0.147717490 container died f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 20 09:44:19 np0005625204.localdomain sshd[295466]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:44:19 np0005625204.localdomain podman[295469]: 2026-02-20 09:44:19.936216721 +0000 UTC m=+0.081864597 container remove f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_sanderson, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, architecture=x86_64, release=1770267347, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, RELEASE=main, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:44:19 np0005625204.localdomain systemd[1]: libpod-conmon-f88027c6426e514a30c8546d10d8b8294396e043e962fd6f2fbc6904386af7f4.scope: Deactivated successfully.
Feb 20 09:44:20 np0005625204.localdomain sudo[295413]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:20 np0005625204.localdomain sshd[295466]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:44:20 np0005625204.localdomain sudo[295486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:44:20 np0005625204.localdomain sudo[295486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:20 np0005625204.localdomain sudo[295486]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:20 np0005625204.localdomain sudo[295504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:20 np0005625204.localdomain sudo[295504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:44:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 
Feb 20 09:44:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:20.626 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:20.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:20.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:20.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:20.666 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:20.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 2026-02-20 09:44:20.686052893 +0000 UTC m=+0.121306143 container create ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, version=7, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, distribution-scope=public, release=1770267347, architecture=x86_64, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=)
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 2026-02-20 09:44:20.603906399 +0000 UTC m=+0.039159719 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:44:20 np0005625204.localdomain systemd[1]: Started libpod-conmon-ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f.scope.
Feb 20 09:44:20 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 2026-02-20 09:44:20.749169379 +0000 UTC m=+0.184422659 container init ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_CLEAN=True, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 2026-02-20 09:44:20.75877024 +0000 UTC m=+0.194023490 container start ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., release=1770267347, com.redhat.component=rhceph-container, version=7, ceph=True)
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 2026-02-20 09:44:20.758979226 +0000 UTC m=+0.194232546 container attach ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:44:20 np0005625204.localdomain systemd[1]: tmp-crun.YWRtJV.mount: Deactivated successfully.
Feb 20 09:44:20 np0005625204.localdomain gifted_mendel[295555]: 167 167
Feb 20 09:44:20 np0005625204.localdomain systemd[1]: libpod-ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f.scope: Deactivated successfully.
Feb 20 09:44:20 np0005625204.localdomain podman[295539]: 2026-02-20 09:44:20.763736911 +0000 UTC m=+0.198990261 container died ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, release=1770267347, vcs-type=git, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True)
Feb 20 09:44:20 np0005625204.localdomain podman[295560]: 2026-02-20 09:44:20.856266538 +0000 UTC m=+0.080266312 container remove ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gifted_mendel, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, RELEASE=main, version=7, io.openshift.expose-services=, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2)
Feb 20 09:44:20 np0005625204.localdomain systemd[1]: libpod-conmon-ba4a2f4998f77b66980bf36c305ff93f24de731e045cc330dca0fa5654dd124f.scope: Deactivated successfully.
Feb 20 09:44:20 np0005625204.localdomain sudo[295504]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:20 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ffe47c171a5ec023f1a2d56eb315ba19babf288d1155f2f24243248737177bbf-merged.mount: Deactivated successfully.
Feb 20 09:44:21 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625204 (monmap changed)...
Feb 20 09:44:21 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:44:21 np0005625204.localdomain ceph-mon[288586]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:22 np0005625204.localdomain sudo[295576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:44:22 np0005625204.localdomain sudo[295576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:22 np0005625204.localdomain sudo[295576]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:22 np0005625204.localdomain sudo[295594]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:44:22 np0005625204.localdomain sudo[295594]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:22 np0005625204.localdomain sudo[295594]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:22 np0005625204.localdomain sudo[295612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:22 np0005625204.localdomain sudo[295612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:22 np0005625204.localdomain sudo[295612]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:22 np0005625204.localdomain sudo[295630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:22 np0005625204.localdomain sudo[295630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:22 np0005625204.localdomain sudo[295630]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:22 np0005625204.localdomain sudo[295648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:22 np0005625204.localdomain sudo[295648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:22 np0005625204.localdomain sudo[295648]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:23 np0005625204.localdomain sudo[295682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295682]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:23 np0005625204.localdomain sudo[295700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295700]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain sudo[295718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295718]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:44:23 np0005625204.localdomain sudo[295736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295736]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:44:23 np0005625204.localdomain sudo[295754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295754]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:23 np0005625204.localdomain sudo[295772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295772]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: Removing np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:23 np0005625204.localdomain sudo[295790]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:23 np0005625204.localdomain sudo[295790]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295790]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:23 np0005625204.localdomain sudo[295808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295808]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:23 np0005625204.localdomain sudo[295842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295842]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295860]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:23 np0005625204.localdomain sudo[295860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295860]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:23 np0005625204.localdomain sudo[295878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:23 np0005625204.localdomain sudo[295878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:23 np0005625204.localdomain sudo[295878]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: Removing np0005625200.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: Removing np0005625200.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:25 np0005625204.localdomain ceph-mon[288586]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:25 np0005625204.localdomain ceph-mon[288586]: Removing daemon mgr.np0005625200.ypbkax from np0005625200.localdomain -- ports [8765]
Feb 20 09:44:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:25.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:25.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:25.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:25.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:25.703 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:25.704 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:44:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:44:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:44:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:44:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:44:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:44:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:44:27 np0005625204.localdomain podman[295896]: 2026-02-20 09:44:27.138783873 +0000 UTC m=+0.073906242 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:44:27 np0005625204.localdomain podman[295896]: 2026-02-20 09:44:27.14998426 +0000 UTC m=+0.085106659 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Feb 20 09:44:27 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:44:27 np0005625204.localdomain sudo[295915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:44:27 np0005625204.localdomain sudo[295915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:27 np0005625204.localdomain sudo[295915]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "mgr.np0005625200.ypbkax"} : dispatch
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "mgr.np0005625200.ypbkax"} : dispatch
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005625200.ypbkax"}]': finished
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:44:28 np0005625204.localdomain ceph-mon[288586]: Removing key for mgr.np0005625200.ypbkax
Feb 20 09:44:29 np0005625204.localdomain sudo[295933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:44:29 np0005625204.localdomain sudo[295933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:29 np0005625204.localdomain sudo[295933]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625200", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.043437) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670043514, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1289, "num_deletes": 251, "total_data_size": 2103769, "memory_usage": 2147472, "flush_reason": "Manual Compaction"}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670052155, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1203162, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16018, "largest_seqno": 17302, "table_properties": {"data_size": 1197410, "index_size": 2903, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15654, "raw_average_key_size": 22, "raw_value_size": 1184691, "raw_average_value_size": 1692, "num_data_blocks": 125, "num_entries": 700, "num_filter_entries": 700, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580645, "oldest_key_time": 1771580645, "file_creation_time": 1771580670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 8756 microseconds, and 4192 cpu microseconds.
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.052203) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1203162 bytes OK
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.052226) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056049) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056076) EVENT_LOG_v1 {"time_micros": 1771580670056071, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056099) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2097075, prev total WAL file size 2097075, number of live WAL files 2.
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056797) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1174KB)], [21(16MB)]
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670056841, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 18889007, "oldest_snapshot_seqno": -1}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10134 keys, 15189048 bytes, temperature: kUnknown
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670125784, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 15189048, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15132495, "index_size": 30148, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25349, "raw_key_size": 272356, "raw_average_key_size": 26, "raw_value_size": 14960414, "raw_average_value_size": 1476, "num_data_blocks": 1137, "num_entries": 10134, "num_filter_entries": 10134, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580670, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.126101) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 15189048 bytes
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.127712) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 273.5 rd, 219.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 16.9 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(28.3) write-amplify(12.6) OK, records in: 10672, records dropped: 538 output_compression: NoCompression
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.127740) EVENT_LOG_v1 {"time_micros": 1771580670127727, "job": 10, "event": "compaction_finished", "compaction_time_micros": 69067, "compaction_time_cpu_micros": 41206, "output_level": 6, "num_output_files": 1, "total_output_size": 15189048, "num_input_records": 10672, "num_output_records": 10134, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670128029, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580670130744, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.056708) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130813) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:44:30.130828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625200 (monmap changed)...
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625200 on np0005625200.localdomain
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: from='client.27472 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005625200.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: Added label _no_schedule to host np0005625200.localdomain
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625200.localdomain
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:44:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:30.705 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:30.708 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)...
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='client.27484 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005625200.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)...
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)...
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='client.34398 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005625200.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain"} : dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain"} : dispatch
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain"}]': finished
Feb 20 09:44:32 np0005625204.localdomain ceph-mon[288586]: Removed host np0005625200.localdomain
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:44:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:35 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:35.707 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:36 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:36 np0005625204.localdomain podman[295951]: 2026-02-20 09:44:36.141322561 +0000 UTC m=+0.080263612 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:44:36 np0005625204.localdomain podman[295951]: 2026-02-20 09:44:36.154075073 +0000 UTC m=+0.093016134 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:44:36 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:44:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:44:39 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:40 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:44:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:40.709 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:40.711 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:40.711 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:40.712 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:40.758 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:40.759 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='client.27496 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:41 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:44:41 np0005625204.localdomain sudo[295976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:44:41 np0005625204.localdomain sudo[295976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:44:41 np0005625204.localdomain sudo[295976]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:42 np0005625204.localdomain systemd[1]: tmp-crun.M9G0kb.mount: Deactivated successfully.
Feb 20 09:44:42 np0005625204.localdomain podman[295994]: 2026-02-20 09:44:42.096729354 +0000 UTC m=+0.091760977 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:44:42 np0005625204.localdomain podman[295994]: 2026-02-20 09:44:42.131062455 +0000 UTC m=+0.126094078 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:44:42 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:44:42 np0005625204.localdomain ceph-mon[288586]: from='client.27504 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625203", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:44:42 np0005625204.localdomain ceph-mon[288586]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:43 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc442c0 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Feb 20 09:44:43 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:44:43 np0005625204.localdomain ceph-mon[288586]: paxos.1).electionLogic(40) init, last seen epoch 40
Feb 20 09:44:43 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:43 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:44:44 np0005625204.localdomain podman[296017]: 2026-02-20 09:44:44.140144318 +0000 UTC m=+0.079410957 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:44:44 np0005625204.localdomain podman[296017]: 2026-02-20 09:44:44.183123605 +0000 UTC m=+0.122390204 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, version=9.7, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git)
Feb 20 09:44:44 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.747 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.747 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.748 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.761 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.764 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.764 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.764 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.790 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:45.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:44:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:44:47 np0005625204.localdomain systemd[1]: tmp-crun.xd8oAh.mount: Deactivated successfully.
Feb 20 09:44:47 np0005625204.localdomain podman[296046]: 2026-02-20 09:44:47.149514541 +0000 UTC m=+0.086741595 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Feb 20 09:44:47 np0005625204.localdomain podman[296047]: 2026-02-20 09:44:47.201350747 +0000 UTC m=+0.136305897 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:44:47 np0005625204.localdomain podman[296047]: 2026-02-20 09:44:47.210237538 +0000 UTC m=+0.145192688 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 20 09:44:47 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:44:47 np0005625204.localdomain podman[296046]: 2026-02-20 09:44:47.260780187 +0000 UTC m=+0.198007241 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 09:44:47 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:44:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:44:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:44:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:44:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:44:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:44:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18259 "" "Go-http-client/1.1"
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: paxos.1).electionLogic(43) init, last seen epoch 43, mid-election, bumping
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e11 handle_timecheck drop unexpected msg
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:48 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:44:48 np0005625204.localdomain sudo[296088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:44:48 np0005625204.localdomain sudo[296088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:48 np0005625204.localdomain sudo[296088]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:48 np0005625204.localdomain sudo[296106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:44:48 np0005625204.localdomain sudo[296106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:48 np0005625204.localdomain sudo[296106]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:48 np0005625204.localdomain sudo[296124]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:48 np0005625204.localdomain sudo[296124]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:48 np0005625204.localdomain sudo[296124]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:48 np0005625204.localdomain sudo[296142]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:48 np0005625204.localdomain sudo[296142]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:48 np0005625204.localdomain sudo[296142]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:48 np0005625204.localdomain sudo[296160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:48 np0005625204.localdomain sudo[296160]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:48 np0005625204.localdomain sudo[296160]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:48 np0005625204.localdomain sudo[296203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:48 np0005625204.localdomain sudo[296203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:48 np0005625204.localdomain sudo[296203]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain sudo[296221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:44:49 np0005625204.localdomain sudo[296221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296221]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain sudo[296239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:44:49 np0005625204.localdomain sudo[296239]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296239]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain sudo[296257]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:44:49 np0005625204.localdomain sudo[296257]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296257]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3982866868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.277 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:44:49 np0005625204.localdomain sudo[296275]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:44:49 np0005625204.localdomain sudo[296275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296275]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.357 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.358 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:44:49 np0005625204.localdomain sudo[296295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:49 np0005625204.localdomain sudo[296295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296295]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Remove daemons mon.np0005625203
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Safe to remove mon.np0005625203: new quorum should be ['np0005625201', 'np0005625204', 'np0005625202'] (from ['np0005625201', 'np0005625204', 'np0005625202'])
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Removing monitor np0005625203 from monmap...
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon rm", "name": "np0005625203"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Removing daemon mon.np0005625203 from np0005625203.localdomain -- ports []
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: monmap epoch 11
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:44:43.337910+0000
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: osdmap e88: 6 total, 6 up, 6 in
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mgrmap e30: np0005625202.arwxwo(active, since 54s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Health check failed: 1/3 mons down, quorum np0005625201,np0005625204 (MON_DOWN)
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625202 in quorum (ranks 0,1,2)
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: monmap epoch 11
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:44:43.337910+0000
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: osdmap e88: 6 total, 6 up, 6 in
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: mgrmap e30: np0005625202.arwxwo(active, since 54s), standbys: np0005625203.lonygy, np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625201,np0005625204)
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Cluster is now healthy
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/1442551253' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/1469195323' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/3982866868' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:49 np0005625204.localdomain sudo[296313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:44:49 np0005625204.localdomain sudo[296313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296313]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain sudo[296331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:49 np0005625204.localdomain sudo[296331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296331]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.590 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.592 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11789MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.593 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.593 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.659 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:44:49 np0005625204.localdomain sudo[296365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:49 np0005625204.localdomain sudo[296365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296365]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:49.697 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:44:49 np0005625204.localdomain sudo[296383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:44:49 np0005625204.localdomain sudo[296383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296383]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:49 np0005625204.localdomain sudo[296402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:49 np0005625204.localdomain sudo[296402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:49 np0005625204.localdomain sudo[296402]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1776286500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.117 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.123 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.140 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.143 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.143 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.550s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:44:50 np0005625204.localdomain sudo[296441]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:44:50 np0005625204.localdomain sudo[296441]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:44:50 np0005625204.localdomain sudo[296441]: pam_unix(sudo:session): session closed for user root
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/4195120872' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/2014991144' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/1776286500' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625201.mtnyvu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.826 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:50.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625201.mtnyvu (monmap changed)...
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625201.mtnyvu on np0005625201.localdomain
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625201", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.144 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.144 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.145 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.145 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.397 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.398 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.398 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.399 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625201 (monmap changed)...
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625201 on np0005625201.localdomain
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:52 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.754 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.774 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.775 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.775 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.776 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.777 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.777 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:44:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:52.778 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:44:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:44:53 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:44:53 np0005625204.localdomain ceph-mon[288586]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:53 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:44:54 np0005625204.localdomain ceph-mon[288586]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:55 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:44:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:55.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:55.829 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:44:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:55.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:44:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:55.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:55.864 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:44:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:44:55.865 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:44:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:44:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:44:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:44:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:44:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:44:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='client.44339 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005625203.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mon.np0005625203 on np0005625203.localdomain
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:44:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:44:57 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:44:58 np0005625204.localdomain podman[296459]: 2026-02-20 09:44:58.150045402 +0000 UTC m=+0.083918884 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:44:58 np0005625204.localdomain podman[296459]: 2026-02-20 09:44:58.158596925 +0000 UTC m=+0.092470377 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute)
Feb 20 09:44:58 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:44:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:44:58 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:44:58 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:44:58 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:44:58 np0005625204.localdomain ceph-mon[288586]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:44:58 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:44:59 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:00 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:00.865 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:00.866 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:00.867 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:00.867 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:00.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:00.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:01 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/4145115626' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/4145115626' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:45:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:03 np0005625204.localdomain sudo[296480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:03 np0005625204.localdomain sudo[296480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:03 np0005625204.localdomain sudo[296480]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:03 np0005625204.localdomain sudo[296498]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:03 np0005625204.localdomain sudo[296498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:03 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 2026-02-20 09:45:03.651742568 +0000 UTC m=+0.068195250 container create c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, release=1770267347, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:45:03 np0005625204.localdomain systemd[1]: Started libpod-conmon-c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c.scope.
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 2026-02-20 09:45:03.621741599 +0000 UTC m=+0.038194341 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:03 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 2026-02-20 09:45:03.742261559 +0000 UTC m=+0.158714271 container init c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=1770267347, version=7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7)
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 2026-02-20 09:45:03.753512667 +0000 UTC m=+0.169965379 container start c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.42.2, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, CEPH_POINT_RELEASE=, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347)
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 2026-02-20 09:45:03.753874757 +0000 UTC m=+0.170327529 container attach c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., RELEASE=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, ceph=True, io.openshift.tags=rhceph ceph, version=7, release=1770267347)
Feb 20 09:45:03 np0005625204.localdomain zealous_napier[296547]: 167 167
Feb 20 09:45:03 np0005625204.localdomain systemd[1]: libpod-c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c.scope: Deactivated successfully.
Feb 20 09:45:03 np0005625204.localdomain podman[296532]: 2026-02-20 09:45:03.758405285 +0000 UTC m=+0.174858027 container died c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, version=7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:45:03 np0005625204.localdomain podman[296553]: 2026-02-20 09:45:03.864229799 +0000 UTC m=+0.093515646 container remove c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_napier, io.buildah.version=1.42.2, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vcs-type=git, distribution-scope=public, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, release=1770267347, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7)
Feb 20 09:45:03 np0005625204.localdomain systemd[1]: libpod-conmon-c50206042ac2a2baf0ae098f66b5aa9ccbc03bdf7ee1d5dac32d0caa5027448c.scope: Deactivated successfully.
Feb 20 09:45:03 np0005625204.localdomain sudo[296498]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:04 np0005625204.localdomain sudo[296570]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:04 np0005625204.localdomain sudo[296570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:04 np0005625204.localdomain sudo[296570]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:04 np0005625204.localdomain sudo[296588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:04 np0005625204.localdomain sudo[296588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:45:04 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 2026-02-20 09:45:04.611788316 +0000 UTC m=+0.082580376 container create 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.42.2, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347)
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: Started libpod-conmon-81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485.scope.
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-243ff6385feb9fe0152935974108027b4116b4fd6887fd9fc30f44497ac86910-merged.mount: Deactivated successfully.
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 2026-02-20 09:45:04.580070069 +0000 UTC m=+0.050862169 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 2026-02-20 09:45:04.687538429 +0000 UTC m=+0.158330489 container init 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public)
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: tmp-crun.IJ4vwV.mount: Deactivated successfully.
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 2026-02-20 09:45:04.700821975 +0000 UTC m=+0.171614045 container start 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, release=1770267347, io.openshift.tags=rhceph ceph)
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 2026-02-20 09:45:04.701171735 +0000 UTC m=+0.171963835 container attach 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, version=7, RELEASE=main)
Feb 20 09:45:04 np0005625204.localdomain recursing_meninsky[296637]: 167 167
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: libpod-81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485.scope: Deactivated successfully.
Feb 20 09:45:04 np0005625204.localdomain podman[296622]: 2026-02-20 09:45:04.704943101 +0000 UTC m=+0.175735201 container died 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.buildah.version=1.42.2, RELEASE=main, GIT_BRANCH=main, version=7, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f3047779d42050ad8bd16eda4af33a2351292539bb0a987173088378b2d9e83e-merged.mount: Deactivated successfully.
Feb 20 09:45:04 np0005625204.localdomain podman[296642]: 2026-02-20 09:45:04.805692721 +0000 UTC m=+0.086882158 container remove 81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_meninsky, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:45:04 np0005625204.localdomain systemd[1]: libpod-conmon-81af0f3c2252b843a7358c5c2b3f9b34d9c3bca28987f6266a65d8fc8e2d4485.scope: Deactivated successfully.
Feb 20 09:45:05 np0005625204.localdomain sudo[296588]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:05 np0005625204.localdomain sudo[296665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:05 np0005625204.localdomain sudo[296665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:05 np0005625204.localdomain sudo[296665]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:05 np0005625204.localdomain sudo[296683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:05 np0005625204.localdomain sudo[296683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.0 (monmap changed)...
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 2026-02-20 09:45:05.697575412 +0000 UTC m=+0.083464762 container create a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, vendor=Red Hat, Inc., vcs-type=git, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, name=rhceph)
Feb 20 09:45:05 np0005625204.localdomain systemd[1]: Started libpod-conmon-a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78.scope.
Feb 20 09:45:05 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 2026-02-20 09:45:05.665053662 +0000 UTC m=+0.050943072 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 2026-02-20 09:45:05.774330593 +0000 UTC m=+0.160219933 container init a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, GIT_BRANCH=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1770267347, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.buildah.version=1.42.2, name=rhceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True)
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 2026-02-20 09:45:05.78942172 +0000 UTC m=+0.175311070 container start a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_CLEAN=True)
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 2026-02-20 09:45:05.789776121 +0000 UTC m=+0.175665471 container attach a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, architecture=x86_64, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.expose-services=, distribution-scope=public, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:45:05 np0005625204.localdomain crazy_cerf[296732]: 167 167
Feb 20 09:45:05 np0005625204.localdomain systemd[1]: libpod-a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78.scope: Deactivated successfully.
Feb 20 09:45:05 np0005625204.localdomain podman[296717]: 2026-02-20 09:45:05.792608731 +0000 UTC m=+0.178498081 container died a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.42.2, release=1770267347, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:45:05 np0005625204.localdomain sshd[296749]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:45:05 np0005625204.localdomain sshd[296751]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:45:05 np0005625204.localdomain podman[296737]: 2026-02-20 09:45:05.882226986 +0000 UTC m=+0.082206347 container remove a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=crazy_cerf, build-date=2026-02-09T10:25:24Z, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, ceph=True, release=1770267347, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:45:05 np0005625204.localdomain systemd[1]: libpod-conmon-a5b15936fbcab2612aeeef0dc27c6a738325c367cfb251443cd8f9954d9a8e78.scope: Deactivated successfully.
Feb 20 09:45:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:05.906 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:05.910 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:05.910 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:05.910 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:05.950 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:05.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:45:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:45:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:45:06.008 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:45:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:45:06.011 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:45:06 np0005625204.localdomain sudo[296683]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:06 np0005625204.localdomain sudo[296763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:06 np0005625204.localdomain sudo[296763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:45:06 np0005625204.localdomain sudo[296763]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:06 np0005625204.localdomain sshd[296801]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:45:06 np0005625204.localdomain sudo[296782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:06 np0005625204.localdomain sudo[296782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:06 np0005625204.localdomain podman[296781]: 2026-02-20 09:45:06.333613885 +0000 UTC m=+0.089244816 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:45:06 np0005625204.localdomain podman[296781]: 2026-02-20 09:45:06.344615396 +0000 UTC m=+0.100246347 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.3 (monmap changed)...
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:45:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:06 np0005625204.localdomain sshd[296751]: Invalid user laurent from 57.128.218.144 port 36496
Feb 20 09:45:06 np0005625204.localdomain sshd[296751]: Received disconnect from 57.128.218.144 port 36496:11: Bye Bye [preauth]
Feb 20 09:45:06 np0005625204.localdomain sshd[296751]: Disconnected from invalid user laurent 57.128.218.144 port 36496 [preauth]
Feb 20 09:45:06 np0005625204.localdomain sshd[296801]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: tmp-crun.NWRdsv.mount: Deactivated successfully.
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c28de7092b6016f41a23dbffffb648be93dfe59aed1724101236e677c0d4345b-merged.mount: Deactivated successfully.
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 2026-02-20 09:45:06.833566528 +0000 UTC m=+0.075126877 container create d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_BRANCH=main, GIT_CLEAN=True, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, build-date=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: Started libpod-conmon-d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309.scope.
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 2026-02-20 09:45:06.907317475 +0000 UTC m=+0.148877824 container init d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_CLEAN=True)
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 2026-02-20 09:45:06.808789887 +0000 UTC m=+0.050350276 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 2026-02-20 09:45:06.918904112 +0000 UTC m=+0.160464471 container start d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 2026-02-20 09:45:06.919155549 +0000 UTC m=+0.160715918 container attach d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_CLEAN=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, version=7, ceph=True, io.openshift.tags=rhceph ceph)
Feb 20 09:45:06 np0005625204.localdomain interesting_diffie[296854]: 167 167
Feb 20 09:45:06 np0005625204.localdomain systemd[1]: libpod-d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309.scope: Deactivated successfully.
Feb 20 09:45:06 np0005625204.localdomain podman[296839]: 2026-02-20 09:45:06.924430899 +0000 UTC m=+0.165991308 container died d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1770267347, name=rhceph, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2)
Feb 20 09:45:07 np0005625204.localdomain podman[296859]: 2026-02-20 09:45:07.056467114 +0000 UTC m=+0.123397302 container remove d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_diffie, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., distribution-scope=public, version=7, GIT_CLEAN=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, RELEASE=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:45:07 np0005625204.localdomain systemd[1]: libpod-conmon-d5aec0b979e89c6b2c0f7f41e32501c4b260db36798978155e8fdd4ccfafa309.scope: Deactivated successfully.
Feb 20 09:45:07 np0005625204.localdomain sudo[296782]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:07 np0005625204.localdomain sshd[296749]: Invalid user sol from 45.148.10.240 port 43454
Feb 20 09:45:07 np0005625204.localdomain sudo[296877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:07 np0005625204.localdomain sudo[296877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:07 np0005625204.localdomain sudo[296877]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:07 np0005625204.localdomain sudo[296895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:07 np0005625204.localdomain sudo[296895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:07 np0005625204.localdomain sshd[296749]: Connection closed by invalid user sol 45.148.10.240 port 43454 [preauth]
Feb 20 09:45:07 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-a4179095fe9a7ad288376c95fe3a3aabd501a0c0f28ed0d3d3a5a9a945c1eb6c-merged.mount: Deactivated successfully.
Feb 20 09:45:07 np0005625204.localdomain podman[296929]: 
Feb 20 09:45:07 np0005625204.localdomain podman[296929]: 2026-02-20 09:45:07.819093847 +0000 UTC m=+0.069570499 container create 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, vcs-type=git, name=rhceph, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:45:07 np0005625204.localdomain systemd[1]: Started libpod-conmon-13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879.scope.
Feb 20 09:45:07 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:07 np0005625204.localdomain podman[296929]: 2026-02-20 09:45:07.881722428 +0000 UTC m=+0.132199040 container init 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, CEPH_POINT_RELEASE=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2026-02-09T10:25:24Z, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1770267347, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, distribution-scope=public)
Feb 20 09:45:07 np0005625204.localdomain podman[296929]: 2026-02-20 09:45:07.892441332 +0000 UTC m=+0.142917954 container start 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Feb 20 09:45:07 np0005625204.localdomain podman[296929]: 2026-02-20 09:45:07.892594356 +0000 UTC m=+0.143070968 container attach 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, distribution-scope=public, version=7, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, name=rhceph, ceph=True, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, release=1770267347)
Feb 20 09:45:07 np0005625204.localdomain podman[296929]: 2026-02-20 09:45:07.793287777 +0000 UTC m=+0.043764459 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:07 np0005625204.localdomain affectionate_johnson[296944]: 167 167
Feb 20 09:45:07 np0005625204.localdomain systemd[1]: libpod-13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879.scope: Deactivated successfully.
Feb 20 09:45:07 np0005625204.localdomain podman[296949]: 2026-02-20 09:45:07.964312405 +0000 UTC m=+0.056836658 container died 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.42.2, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:45:07 np0005625204.localdomain podman[296949]: 2026-02-20 09:45:07.999690405 +0000 UTC m=+0.092214598 container remove 13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=affectionate_johnson, RELEASE=main, distribution-scope=public, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:45:08 np0005625204.localdomain systemd[1]: libpod-conmon-13c7a176f797696285215208030804c1aa0601bc79737ae46692b5058db47879.scope: Deactivated successfully.
Feb 20 09:45:08 np0005625204.localdomain sudo[296895]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:08 np0005625204.localdomain sudo[296965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:08 np0005625204.localdomain sudo[296965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:08 np0005625204.localdomain sudo[296965]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:08 np0005625204.localdomain sudo[296983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:45:08 np0005625204.localdomain sudo[296983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:08 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:45:08 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:45:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:08 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:08 np0005625204.localdomain systemd[1]: tmp-crun.ZfCamy.mount: Deactivated successfully.
Feb 20 09:45:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f324a384ad3aaa47e636805019c1f363abeaa08ce5bdbfa356d3180ce18ba22f-merged.mount: Deactivated successfully.
Feb 20 09:45:08 np0005625204.localdomain sudo[296983]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:09 np0005625204.localdomain ceph-mon[288586]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/2738563585' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:45:10 np0005625204.localdomain sudo[297032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:45:10 np0005625204.localdomain sudo[297032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:10.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:10.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:10.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:10.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:10 np0005625204.localdomain sudo[297032]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:10.988 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:10.989 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:11 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:11 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:45:13 np0005625204.localdomain ceph-mon[288586]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:13 np0005625204.localdomain podman[297050]: 2026-02-20 09:45:13.16520293 +0000 UTC m=+0.099515776 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:45:13 np0005625204.localdomain podman[297050]: 2026-02-20 09:45:13.177170929 +0000 UTC m=+0.111483805 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:45:13 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:45:13 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:15 np0005625204.localdomain podman[297074]: 2026-02-20 09:45:15.151801319 +0000 UTC m=+0.089879534 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:45:15 np0005625204.localdomain podman[297074]: 2026-02-20 09:45:15.196274687 +0000 UTC m=+0.134352932 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, 
architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, version=9.7, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Feb 20 09:45:15 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:15.990 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:15.992 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:15.992 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:15.993 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:16.029 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:16.030 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:16 np0005625204.localdomain ceph-mon[288586]: from='client.27552 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:45:16 np0005625204.localdomain ceph-mon[288586]: Reconfig service osd.default_drive_group
Feb 20 09:45:16 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:16 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 e89: 6 total, 6 up, 6 in
Feb 20 09:45:16 np0005625204.localdomain sshd[293896]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:45:16 np0005625204.localdomain systemd[1]: session-68.scope: Deactivated successfully.
Feb 20 09:45:16 np0005625204.localdomain systemd[1]: session-68.scope: Consumed 18.697s CPU time.
Feb 20 09:45:16 np0005625204.localdomain systemd-logind[759]: Session 68 logged out. Waiting for processes to exit.
Feb 20 09:45:16 np0005625204.localdomain systemd-logind[759]: Removed session 68.
Feb 20 09:45:16 np0005625204.localdomain sshd[297095]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:45:17 np0005625204.localdomain sshd[297095]: Accepted publickey for ceph-admin from 192.168.122.107 port 51380 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 09:45:17 np0005625204.localdomain systemd-logind[759]: New session 69 of user ceph-admin.
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Feb 20 09:45:17 np0005625204.localdomain sshd[297095]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 ' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26977 172.18.0.106:0/872747270' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/2448153276' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: Activating manager daemon np0005625203.lonygy
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/2448153276' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: mgrmap e31: np0005625203.lonygy(active, starting, since 0.0383473s), standbys: np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr metadata", "who": "np0005625200.ypbkax", "id": "np0005625200.ypbkax"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: Manager daemon np0005625203.lonygy is now available
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: removing stray HostCache host record np0005625200.localdomain.devices.0
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"}]': finished
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625200.localdomain.devices.0"}]': finished
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/mirror_snapshot_schedule"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/mirror_snapshot_schedule"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/trash_purge_schedule"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625203.lonygy/trash_purge_schedule"} : dispatch
Feb 20 09:45:17 np0005625204.localdomain sudo[297099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:17 np0005625204.localdomain sudo[297099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:45:17 np0005625204.localdomain sudo[297099]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:45:17 np0005625204.localdomain sudo[297119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:45:17 np0005625204.localdomain sudo[297119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: tmp-crun.ntW3B7.mount: Deactivated successfully.
Feb 20 09:45:17 np0005625204.localdomain podman[297118]: 2026-02-20 09:45:17.394886363 +0000 UTC m=+0.099872546 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: tmp-crun.9YT7Kd.mount: Deactivated successfully.
Feb 20 09:45:17 np0005625204.localdomain podman[297117]: 2026-02-20 09:45:17.432024364 +0000 UTC m=+0.140725663 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:45:17 np0005625204.localdomain podman[297117]: 2026-02-20 09:45:17.437082996 +0000 UTC m=+0.145784315 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:45:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:17 np0005625204.localdomain podman[297118]: 2026-02-20 09:45:17.478592081 +0000 UTC m=+0.183578264 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 20 09:45:17 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:45:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:45:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:45:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:45:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:45:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:45:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18260 "" "Go-http-client/1.1"
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.207 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.238 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.239 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '367fc62c-6498-45e7-afb6-ab87b48faf43', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.208821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd854e26-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'a8db71b9a2c7b8c85e64334e074e205fd69d673cb5a1a2621c68e8e8593ad0fd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.208821', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd856532-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'adba15823ed78bc62f98a5c63d35da7222e5babbebe49867e6edd58042daeb6e'}]}, 'timestamp': '2026-02-20 09:45:18.240108', '_unique_id': 'eede449e3288439d8c22de8ae1937dcd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.241 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.243 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.243 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.243 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.244 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '877cf184-b1df-4a7b-94f1-b4da196b9856', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.243490', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd85ff38-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '766cc0fc2bf26313c9a081f016dd903bba7d3b0e61ae893e8f1af7527d1dc825'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.243490', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd861284-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '7584f571f0678c5073e2245d53d4c987f2df830dd6bca80902aeb16ccd23c1d0'}]}, 'timestamp': '2026-02-20 09:45:18.244558', '_unique_id': '5d6b25a127d042ce966b7d55ce70f54a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.246 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.247 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f714f747-e9c9-43cf-8af9-52da11b86a68', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.246810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd867cce-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'da1800765ee8e32361ba6d48928a26b45eb26adb729232c7f356975c4606a07e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.246810', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd868d0e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'f50052b7e7b1ee4a2351d78711809bbbe87a10b8210e64ffc4fe2af4de3e4597'}]}, 'timestamp': '2026-02-20 09:45:18.247738', '_unique_id': 'c59516deecee4e2cbcbc1f3eb286fffc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.248 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.250 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbba8e16-2b67-4de1-9e29-9e1d81295e7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:45:18.250202', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'dd89d194-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.507337117, 'message_signature': '5d3ac0d1646d56e6246a044f6124cd30f94341be8db9ccad3edb448315e06710'}]}, 'timestamp': '2026-02-20 09:45:18.269138', '_unique_id': '2ad34cd50c554ff89983eee30cb12979'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.270 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a531ba0c-39be-4d1e-a90d-d724949dd0e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.272076', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8b42d6-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '2f8f3e92190b89e4c69537318ce23ee564c808579a8136cb949b9ecc820a7f10'}]}, 'timestamp': '2026-02-20 09:45:18.278571', '_unique_id': 'ec560b0fc1654ba6a85e655679936fb1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a946216a-e10a-4a98-ba0d-c5dc7feab83d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.280862', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8bb248-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'c631bf36fec03d7e485d40cdfdd6d47ef81120b8b3fd38b450e0f460f4a39460'}]}, 'timestamp': '2026-02-20 09:45:18.281489', '_unique_id': 'bb6af1931abe4813a95d5c3063d15573'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.283 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd209d69e-9114-424a-af25-b206f06cd3fc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.283947', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8c27c8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'c1bd8821ac47dc7a315d5263e72bd74c627eb3f756ba0220b4a2a8b298501021'}]}, 'timestamp': '2026-02-20 09:45:18.284419', '_unique_id': '61546c3517ae4e2e9d445807c9ce916b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.286 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47232825-9ee3-41c5-b62c-87e4c8a3c5b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.286575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd8c8fba-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '5a384b0eba87ea4ebd4e4208a83bba05198965125897b0305ceabd925b120c04'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.286575', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd8ca004-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'dd3e2889dac1bc0acf7e55a58c81a6cc721d8c991b91052e9f2802973b080a03'}]}, 'timestamp': '2026-02-20 09:45:18.287463', '_unique_id': '4f29714967894f1a913d9e4907f33d03'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.290 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85f05462-ec54-45ee-a335-ce207bf07852', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.289959', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8d1250-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'f08c5f611addf5412580c4d3b5aa2d92d549071ba0237dbbb4de07d048f1c1f2'}]}, 'timestamp': '2026-02-20 09:45:18.290417', '_unique_id': 'bbccbad1ec9445d1a49db069a424dde3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af5ccc52-2891-4a52-b387-3391180a0bdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.292559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd8eedbe-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '98a6d5d8c244c564537f259518bba9f5a1f85538a019b3f18afd49ac92bfa07b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.292559', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd8effac-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '6fe982bce55c8dcc990deaa1ab71a49fe3066fd6b45c12bcd4b4d595dcab7269'}]}, 'timestamp': '2026-02-20 09:45:18.303020', '_unique_id': '33f481cb1c494fbdb38bd8605145112a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.305 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.305 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '971062ab-6e5b-4128-9c15-6ec56f10ae3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.305406', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8f6f96-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '8408b71cf6e14d29a0579e1803832c7275e75e815c93a20a140a448fd48365d0'}]}, 'timestamp': '2026-02-20 09:45:18.305922', '_unique_id': '1536b7e96eaf4342bc742bdb98ac0242'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8f2469ec-08c4-44e2-a127-2a90a88b6262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.308172', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd8fda26-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '46666ff1bf222c867dcf556f4abebb9a2c64f556855d8dfd0d2ae6dc4fbe6697'}]}, 'timestamp': '2026-02-20 09:45:18.308680', '_unique_id': '9eb1536c9ad344f895143db909a522d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 14610000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ad704ad-1afe-4c3c-b4d1-bd18632bcca2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 14610000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:45:18.311144', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'dd905050-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.507337117, 'message_signature': '040282d18e07eb00485c4f3f32850a2b306ef76718a49b0d0e8ec5b214fe945d'}]}, 'timestamp': '2026-02-20 09:45:18.311782', '_unique_id': 'c82f0e93b1b243529ef77b0f3b6da6a2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec6e6830-38e8-4580-ba05-0f27f3c0b949', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.314080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd90c918-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': 'f639e7d6ce34d62c5280bbce8a1415619665da19dd5b8f206c40c0c0cd8a2190'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.314080', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd90dc82-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '58ff575fa5897a37a063c91a2bf7d6c933cc904c7ad755f7911405291bc00bd0'}]}, 'timestamp': '2026-02-20 09:45:18.315228', '_unique_id': '63a2bcad244a4b5a8a67bd2646bc5c6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.316 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'aa7f4b46-6f1d-462a-87bc-11c21c251517', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.317559', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd914a50-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '039002ba55de6a585ccad887c9c63c85a23cbffd8c7ef456ea6d74051a5ffef8'}]}, 'timestamp': '2026-02-20 09:45:18.318065', '_unique_id': '92a801b93531412285d7eb6ba7de6c29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6897e52-ff21-4892-b271-8f82f4a41c04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.320195', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd91af5e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '9317671077ef1ac99fcbb069349a4ab7d41222a6d0b798002731f8e8dad05c29'}]}, 'timestamp': '2026-02-20 09:45:18.320696', '_unique_id': 'dbc4ade550e049268db9e4f5e3f39faa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.321 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '16be9827-dddc-46c0-abe0-5bfa72d9fdb4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.322924', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd9219f8-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': 'f081ef4b7dec0cb0a2b30aaabe39b3a85d8fff7152114317b15e70e507a831d4'}]}, 'timestamp': '2026-02-20 09:45:18.323380', '_unique_id': '01e703f9e23c44fc93ebdff522e1861a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b691dc05-73d3-4f7f-bf5f-0e5aa92e3617', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.325663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd928564-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'a439bcbf775aab2478f8a367fc2fe72dba63dd255552e9e206c5bdfc53eff876'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.325663', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd929590-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '75d28e24803291058f53a9ace8488fbaaf92c72c2ac8a08e0e49d844e4b7c6c3'}]}, 'timestamp': '2026-02-20 09:45:18.326518', '_unique_id': 'b1bfe8c553744d4d8e7104e4c55bdd04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain podman[297247]: 2026-02-20 09:45:18.329287296 +0000 UTC m=+0.128295971 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, distribution-scope=public, ceph=True, architecture=x86_64, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, release=1770267347, CEPH_POINT_RELEASE=)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '26959704-3b5a-4ca8-a915-f0c3e1599bf4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.328556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd92f3be-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '2eb9d1a7432b4abac7245da59afb90b0e0c72c41ce2ee8ff395ccc60e033bf70'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.328556', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd92fd6e-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.531793799, 'message_signature': '9dba557492af9f0267f31df94764b703e9f88f0382b5ed02da695bb450487a21'}]}, 'timestamp': '2026-02-20 09:45:18.329097', '_unique_id': 'da8ccdfb5ff64f97b04c7151b0cc57d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3108811e-4e75-4327-b4aa-4d93cc8f7a32', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:45:18.330389', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'dd933a2c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': '6bb3a3280ba5cd0f3fd3e97a42b3298204dff36b487cefeaaf65a30a7ff11a05'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:45:18.330389', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'dd9344cc-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.448033619, 'message_signature': 'f6921ebb47d94ede52369ee953f41062723a46b034311e04350df62e853ac3ee'}]}, 'timestamp': '2026-02-20 09:45:18.330926', '_unique_id': 'ded9b9d8de2e4147b885f551fbc92f96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.332 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5326f92-3236-405d-b57b-d9d65f551b18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:45:18.332232', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'dd93825c-0e40-11f1-9294-fa163ef029e2', 'monotonic_time': 11277.511288508, 'message_signature': '1eddd0a9ed2ad04b5393972659187f7c3bb2bfe4677cd72ef611bc90a3bd8680'}]}, 'timestamp': '2026-02-20 09:45:18.332516', '_unique_id': 'be9e124a95d94ffdaafa39610df62120'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:45:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:45:18.333 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:45:18 np0005625204.localdomain podman[297247]: 2026-02-20 09:45:18.433523704 +0000 UTC m=+0.232532389 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph)
Feb 20 09:45:18 np0005625204.localdomain ceph-mon[288586]: mgrmap e32: np0005625203.lonygy(active, since 1.05473s), standbys: np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:45:18 np0005625204.localdomain ceph-mon[288586]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:19 np0005625204.localdomain sudo[297119]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:19 np0005625204.localdomain sudo[297366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:19 np0005625204.localdomain sudo[297366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:19 np0005625204.localdomain sudo[297366]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:19 np0005625204.localdomain sudo[297384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:45:19 np0005625204.localdomain sudo[297384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Bus STARTING
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Serving on http://172.18.0.107:8765
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Serving on https://172.18.0.107:7150
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Bus STARTED
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: [20/Feb/2026:09:45:18] ENGINE Client ('172.18.0.107', 58052) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: mgrmap e33: np0005625203.lonygy(active, since 2s), standbys: np0005625204.exgrzx, np0005625200.ypbkax, np0005625201.mtnyvu
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:19 np0005625204.localdomain sudo[297384]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:20 np0005625204.localdomain sudo[297433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:20 np0005625204.localdomain sudo[297433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:20 np0005625204.localdomain sudo[297433]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:20 np0005625204.localdomain sudo[297451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:45:20 np0005625204.localdomain sudo[297451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:20 np0005625204.localdomain sudo[297451]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:20 np0005625204.localdomain sudo[297487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:45:20 np0005625204.localdomain sudo[297487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:20 np0005625204.localdomain sudo[297487]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:20 np0005625204.localdomain sudo[297505]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:45:20 np0005625204.localdomain sudo[297505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:20 np0005625204.localdomain sudo[297505]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain sudo[297523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:21 np0005625204.localdomain sudo[297523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297523]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:21.030 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:21.033 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:21.033 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:21.034 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:21.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:21.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:21 np0005625204.localdomain sudo[297541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:21 np0005625204.localdomain sudo[297541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297541]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: mgrmap e34: np0005625203.lonygy(active, since 3s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd/host:np0005625201", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:21 np0005625204.localdomain sudo[297559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:21 np0005625204.localdomain sudo[297559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297559]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain sudo[297593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:21 np0005625204.localdomain sudo[297593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297593]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain sudo[297611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:21 np0005625204.localdomain sudo[297611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297611]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain sudo[297629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:45:21 np0005625204.localdomain sudo[297629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297629]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:21 np0005625204.localdomain sudo[297647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:45:21 np0005625204.localdomain sudo[297647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297647]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:21 np0005625204.localdomain sudo[297665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:45:21 np0005625204.localdomain sudo[297665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297665]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:21 np0005625204.localdomain sudo[297683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:21 np0005625204.localdomain sudo[297683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:21 np0005625204.localdomain sudo[297683]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: mgrmap e35: np0005625203.lonygy(active, since 4s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: Standby manager daemon np0005625202.arwxwo started
Feb 20 09:45:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:22 np0005625204.localdomain sudo[297701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:22 np0005625204.localdomain sudo[297701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297701]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain sudo[297719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:22 np0005625204.localdomain sudo[297719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297719]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain sudo[297753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:22 np0005625204.localdomain sudo[297753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297753]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain sudo[297771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:22 np0005625204.localdomain sudo[297771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297771]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain sudo[297789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:22 np0005625204.localdomain sudo[297789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297789]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain sudo[297807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:45:22 np0005625204.localdomain sudo[297807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297807]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:22 np0005625204.localdomain sudo[297825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:45:22 np0005625204.localdomain sudo[297825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:22 np0005625204.localdomain sudo[297825]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[297843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[297843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297843]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[297861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:23 np0005625204.localdomain sudo[297861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297861]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[297879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[297879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297879]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[297913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[297913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297913]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[297931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[297931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297931]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[297949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain sudo[297949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297949]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: mgrmap e36: np0005625203.lonygy(active, since 5s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:45:23 np0005625204.localdomain sudo[297967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:45:23 np0005625204.localdomain sudo[297967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297967]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:23 np0005625204.localdomain sudo[297985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:45:23 np0005625204.localdomain sudo[297985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[297985]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[298003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[298003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[298003]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[298021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:23 np0005625204.localdomain sudo[298021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[298021]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[298039]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[298039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[298039]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:23 np0005625204.localdomain sudo[298073]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:45:23 np0005625204.localdomain sudo[298073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:23 np0005625204.localdomain sudo[298073]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:24 np0005625204.localdomain sudo[298091]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:45:24 np0005625204.localdomain sudo[298091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:24 np0005625204.localdomain sudo[298091]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:24 np0005625204.localdomain sudo[298109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:45:24 np0005625204.localdomain sudo[298109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:24 np0005625204.localdomain sudo[298109]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:24 np0005625204.localdomain sudo[298127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:45:24 np0005625204.localdomain sudo[298127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:24 np0005625204.localdomain sudo[298127]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:45:24 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:25 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:26.067 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:26.068 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:26.068 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:26.069 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:26.101 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:26.101 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:26 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:45:26 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:26 np0005625204.localdomain ceph-mon[288586]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 0 B/s wr, 16 op/s
Feb 20 09:45:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:45:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:45:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:45:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:45:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:45:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:27 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:45:28 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:45:29 np0005625204.localdomain podman[298146]: 2026-02-20 09:45:29.157776715 +0000 UTC m=+0.087470845 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 09:45:29 np0005625204.localdomain podman[298146]: 2026-02-20 09:45:29.19718996 +0000 UTC m=+0.126884060 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3)
Feb 20 09:45:29 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='client.44411 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:45:29 np0005625204.localdomain sudo[298165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:29 np0005625204.localdomain sudo[298165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:29 np0005625204.localdomain sudo[298165]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:29 np0005625204.localdomain sudo[298183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:29 np0005625204.localdomain sudo[298183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:29 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 2026-02-20 09:45:30.089468122 +0000 UTC m=+0.074519190 container create 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.expose-services=, io.buildah.version=1.42.2)
Feb 20 09:45:30 np0005625204.localdomain systemd[1]: Started libpod-conmon-99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5.scope.
Feb 20 09:45:30 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 2026-02-20 09:45:30.059875124 +0000 UTC m=+0.044926222 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 2026-02-20 09:45:30.161591042 +0000 UTC m=+0.146642110 container init 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, distribution-scope=public, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, io.openshift.expose-services=, release=1770267347, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 2026-02-20 09:45:30.17281067 +0000 UTC m=+0.157861728 container start 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, ceph=True, io.buildah.version=1.42.2, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, RELEASE=main, CEPH_POINT_RELEASE=)
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 2026-02-20 09:45:30.173142909 +0000 UTC m=+0.158193977 container attach 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Feb 20 09:45:30 np0005625204.localdomain strange_kowalevski[298233]: 167 167
Feb 20 09:45:30 np0005625204.localdomain systemd[1]: libpod-99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5.scope: Deactivated successfully.
Feb 20 09:45:30 np0005625204.localdomain podman[298217]: 2026-02-20 09:45:30.176474423 +0000 UTC m=+0.161525521 container died 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, RELEASE=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1770267347, com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.42.2, name=rhceph, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:30 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e26cf461a5b57e070504f6098c3d69cb97fbde0e08a3c5b6e0107a6ec968115-merged.mount: Deactivated successfully.
Feb 20 09:45:30 np0005625204.localdomain podman[298238]: 2026-02-20 09:45:30.283516541 +0000 UTC m=+0.095295337 container remove 99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=strange_kowalevski, architecture=x86_64, release=1770267347, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, ceph=True, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:45:30 np0005625204.localdomain systemd[1]: libpod-conmon-99d4b225cfb86a8b5d5f3b0635ac448e4708be2ac718db6fbbfeab47699224a5.scope: Deactivated successfully.
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:45:30 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:30 np0005625204.localdomain sudo[298183]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:30 np0005625204.localdomain sudo[298261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:30 np0005625204.localdomain sudo[298261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:30 np0005625204.localdomain sudo[298261]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:30 np0005625204.localdomain sudo[298279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:30 np0005625204.localdomain sudo[298279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:31.102 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:31.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:31.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:31.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:31.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:31.134 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 2026-02-20 09:45:31.214797755 +0000 UTC m=+0.062801587 container create 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:45:31 np0005625204.localdomain systemd[1]: Started libpod-conmon-3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5.scope.
Feb 20 09:45:31 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 2026-02-20 09:45:31.269177744 +0000 UTC m=+0.117181576 container init 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, ceph=True, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 2026-02-20 09:45:31.277597352 +0000 UTC m=+0.125601184 container start 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 2026-02-20 09:45:31.277832139 +0000 UTC m=+0.125835981 container attach 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, release=1770267347, io.buildah.version=1.42.2, io.openshift.expose-services=, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7)
Feb 20 09:45:31 np0005625204.localdomain musing_murdock[298328]: 167 167
Feb 20 09:45:31 np0005625204.localdomain systemd[1]: libpod-3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5.scope: Deactivated successfully.
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 2026-02-20 09:45:31.282057398 +0000 UTC m=+0.130061310 container died 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, vendor=Red Hat, Inc., release=1770267347, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, ceph=True, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:45:31 np0005625204.localdomain podman[298313]: 2026-02-20 09:45:31.193996217 +0000 UTC m=+0.042000039 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:31 np0005625204.localdomain podman[298333]: 2026-02-20 09:45:31.373015621 +0000 UTC m=+0.081848416 container remove 3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_murdock, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph)
Feb 20 09:45:31 np0005625204.localdomain systemd[1]: libpod-conmon-3f75a7596c91c6e1b5afce14129710e285947bd705ab12e4b5da881771336ce5.scope: Deactivated successfully.
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='client.27612 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:31 np0005625204.localdomain sudo[298279]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:31 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:31 np0005625204.localdomain sudo[298356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:45:31 np0005625204.localdomain sudo[298356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:31 np0005625204.localdomain sudo[298356]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:31 np0005625204.localdomain sudo[298374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:31 np0005625204.localdomain sudo[298374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 2026-02-20 09:45:32.187839482 +0000 UTC m=+0.075569999 container create aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, GIT_CLEAN=True, release=1770267347, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, io.openshift.expose-services=, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:45:32 np0005625204.localdomain systemd[1]: Started libpod-conmon-aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e.scope.
Feb 20 09:45:32 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3e84cb3051dbbf5b8837dc91ec5eea1c25d8cb305defe352c3a2a0e0e26fe4c6-merged.mount: Deactivated successfully.
Feb 20 09:45:32 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 2026-02-20 09:45:32.157680889 +0000 UTC m=+0.045411396 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 2026-02-20 09:45:32.257453412 +0000 UTC m=+0.145183919 container init aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 2026-02-20 09:45:32.266776275 +0000 UTC m=+0.154506802 container start aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, ceph=True, version=7, architecture=x86_64)
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 2026-02-20 09:45:32.267099584 +0000 UTC m=+0.154830161 container attach aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=1770267347, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, build-date=2026-02-09T10:25:24Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:45:32 np0005625204.localdomain recursing_ramanujan[298424]: 167 167
Feb 20 09:45:32 np0005625204.localdomain systemd[1]: libpod-aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e.scope: Deactivated successfully.
Feb 20 09:45:32 np0005625204.localdomain podman[298409]: 2026-02-20 09:45:32.270709196 +0000 UTC m=+0.158439733 container died aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.openshift.expose-services=, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container)
Feb 20 09:45:32 np0005625204.localdomain podman[298429]: 2026-02-20 09:45:32.3729882 +0000 UTC m=+0.090836211 container remove aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_ramanujan, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, ceph=True)
Feb 20 09:45:32 np0005625204.localdomain systemd[1]: libpod-conmon-aa1d9f593a6ea88e32dd56b1897020aafbbd1503f43988d39661096e53f3573e.scope: Deactivated successfully.
Feb 20 09:45:32 np0005625204.localdomain sudo[298374]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625204 (monmap changed)...
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.537510) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732537567, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2849, "num_deletes": 256, "total_data_size": 8905305, "memory_usage": 9185624, "flush_reason": "Manual Compaction"}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732559009, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 5327455, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17307, "largest_seqno": 20151, "table_properties": {"data_size": 5315499, "index_size": 7501, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 30618, "raw_average_key_size": 22, "raw_value_size": 5289399, "raw_average_value_size": 3920, "num_data_blocks": 326, "num_entries": 1349, "num_filter_entries": 1349, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580670, "oldest_key_time": 1771580670, "file_creation_time": 1771580732, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 21557 microseconds, and 11212 cpu microseconds.
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.559064) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 5327455 bytes OK
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.559090) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.561386) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.561408) EVENT_LOG_v1 {"time_micros": 1771580732561401, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.561434) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 8891394, prev total WAL file size 8891686, number of live WAL files 2.
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.563555) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(5202KB)], [24(14MB)]
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732563631, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 20516503, "oldest_snapshot_seqno": -1}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10934 keys, 18161002 bytes, temperature: kUnknown
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732643726, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 18161002, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18098403, "index_size": 34146, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27397, "raw_key_size": 291577, "raw_average_key_size": 26, "raw_value_size": 17911752, "raw_average_value_size": 1638, "num_data_blocks": 1308, "num_entries": 10934, "num_filter_entries": 10934, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580732, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.644299) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 18161002 bytes
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.646133) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 256.0 rd, 226.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.1, 14.5 +0.0 blob) out(17.3 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 11483, records dropped: 549 output_compression: NoCompression
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.646164) EVENT_LOG_v1 {"time_micros": 1771580732646151, "job": 12, "event": "compaction_finished", "compaction_time_micros": 80140, "compaction_time_cpu_micros": 46405, "output_level": 6, "num_output_files": 1, "total_output_size": 18161002, "num_input_records": 11483, "num_output_records": 10934, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732647338, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580732650083, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.563434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650199) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:45:32 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:45:32.650208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:45:32 np0005625204.localdomain sudo[298445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:45:32 np0005625204.localdomain sudo[298445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:32 np0005625204.localdomain sudo[298445]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:33 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-089952a015ce864469b7c1e49d7055a1e46e3372bdbe6f559830c6fae9a5ec3f-merged.mount: Deactivated successfully.
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='client.27618 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625203", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:33 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625201 (monmap changed)...
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.200:0/191592331' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:45:34 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:35 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:36.135 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:36.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:36.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:36.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:36.187 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:36.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:36 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:45:37 np0005625204.localdomain podman[298463]: 2026-02-20 09:45:37.166282447 +0000 UTC m=+0.099953758 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:45:37 np0005625204.localdomain podman[298463]: 2026-02-20 09:45:37.176887688 +0000 UTC m=+0.110558979 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:45:37 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:45:37 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 20 09:45:37 np0005625204.localdomain ceph-mon[288586]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:37 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:38 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:39 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44580 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Feb 20 09:45:39 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:45:39 np0005625204.localdomain ceph-mon[288586]: paxos.1).electionLogic(46) init, last seen epoch 46
Feb 20 09:45:39 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:41.189 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:41.190 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:42.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:42.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 09:45:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:45:44 np0005625204.localdomain systemd[292696]: Starting Mark boot as successful...
Feb 20 09:45:44 np0005625204.localdomain podman[298486]: 2026-02-20 09:45:44.148626169 +0000 UTC m=+0.090167071 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:45:44 np0005625204.localdomain systemd[292696]: Finished Mark boot as successful.
Feb 20 09:45:44 np0005625204.localdomain podman[298486]: 2026-02-20 09:45:44.162982706 +0000 UTC m=+0.104523568 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:45:44 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2,3)
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: monmap epoch 12
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:45:39.346453+0000
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: mgrmap e36: np0005625203.lonygy(active, since 27s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:45:44 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:45:45 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:45 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:45 np0005625204.localdomain ceph-mon[288586]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:45 np0005625204.localdomain ceph-mon[288586]: from='client.27628 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625201", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:45:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:45:46 np0005625204.localdomain podman[298513]: 2026-02-20 09:45:46.132515752 +0000 UTC m=+0.069791185 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, managed_by=edpm_ansible, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7)
Feb 20 09:45:46 np0005625204.localdomain podman[298513]: 2026-02-20 09:45:46.146073765 +0000 UTC m=+0.083349228 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, io.openshift.tags=minimal rhel9, release=1770267347)
Feb 20 09:45:46 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.193 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.193 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.247 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.248 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:46 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44420 mon_map magic: 0 from mon.0 v2:172.18.0.105:3300/0
Feb 20 09:45:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@1(peon) e13  my rank is now 0 (was 1)
Feb 20 09:45:46 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 20 09:45:46 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 20 09:45:46 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 20 09:45:46 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:45:46 np0005625204.localdomain ceph-mon[288586]: paxos.0).electionLogic(50) init, last seen epoch 50
Feb 20 09:45:46 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.733 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.761 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.761 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.762 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.762 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:45:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:46.762 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:45:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:45:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:45:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:45:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:45:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:45:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18269 "" "Go-http-client/1.1"
Feb 20 09:45:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:45:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:45:48 np0005625204.localdomain podman[298546]: 2026-02-20 09:45:48.128139126 +0000 UTC m=+0.067463420 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:45:48 np0005625204.localdomain podman[298546]: 2026-02-20 09:45:48.160982745 +0000 UTC m=+0.100307029 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 09:45:48 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:45:48 np0005625204.localdomain podman[298545]: 2026-02-20 09:45:48.114771337 +0000 UTC m=+0.055573093 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:45:48 np0005625204.localdomain podman[298545]: 2026-02-20 09:45:48.246501694 +0000 UTC m=+0.187303480 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:45:48 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:45:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id
Feb 20 09:45:49 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id
Feb 20 09:45:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id
Feb 20 09:45:50 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e13 handle_auth_request failed to assign global_id
Feb 20 09:45:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:51.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:51.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:51.253 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:51.253 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:51.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:51.278 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:51 np0005625204.localdomain sshd[298588]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202 in quorum (ranks 0,1)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 13
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:45:46.327222+0000
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 34s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005625204,np0005625202 (MON_DOWN)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1/3 mons down, quorum np0005625204,np0005625202
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005625204,np0005625202
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [WRN] :     mon.np0005625203 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: Remove daemons mon.np0005625201
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203'])
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: Removing monitor np0005625201 from monmap...
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625201"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports []
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202 in quorum (ranks 0,1)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: monmap epoch 13
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:45:46.327222+0000
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: mgrmap e36: np0005625203.lonygy(active, since 34s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: Health check failed: 1/3 mons down, quorum np0005625204,np0005625202 (MON_DOWN)
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005625204,np0005625202
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005625204,np0005625202
Feb 20 09:45:51 np0005625204.localdomain ceph-mon[288586]:     mon.np0005625203 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Feb 20 09:45:51 np0005625204.localdomain sudo[298590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:45:51 np0005625204.localdomain sudo[298590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:51 np0005625204.localdomain sudo[298590]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:51 np0005625204.localdomain sudo[298608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:45:51 np0005625204.localdomain sudo[298608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:51 np0005625204.localdomain sudo[298608]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:51 np0005625204.localdomain sudo[298626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:51 np0005625204.localdomain sudo[298626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:51 np0005625204.localdomain sshd[298588]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:45:51 np0005625204.localdomain sudo[298626]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:51 np0005625204.localdomain sudo[298644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:51 np0005625204.localdomain sudo[298644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:51 np0005625204.localdomain sudo[298644]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:51 np0005625204.localdomain sudo[298662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:51 np0005625204.localdomain sudo[298662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:51 np0005625204.localdomain sudo[298662]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:51 np0005625204.localdomain sudo[298696]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:51 np0005625204.localdomain sudo[298696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:51 np0005625204.localdomain sudo[298696]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:52 np0005625204.localdomain sudo[298714]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:45:52 np0005625204.localdomain sudo[298714]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:52 np0005625204.localdomain sudo[298714]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:52 np0005625204.localdomain sudo[298732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:45:52 np0005625204.localdomain sudo[298732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:52 np0005625204.localdomain sudo[298732]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:52 np0005625204.localdomain sudo[298750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:45:52 np0005625204.localdomain sudo[298750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:52 np0005625204.localdomain sudo[298750]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:52 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:45:52 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1126955229' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:52 np0005625204.localdomain sudo[298768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:45:52 np0005625204.localdomain sudo[298768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:52 np0005625204.localdomain sudo[298768]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:52 np0005625204.localdomain sudo[298786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:52 np0005625204.localdomain sudo[298786]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:52 np0005625204.localdomain sudo[298786]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:53 np0005625204.localdomain sudo[298804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:53 np0005625204.localdomain sudo[298804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:53 np0005625204.localdomain sudo[298804]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: paxos.0).electionLogic(53) init, last seen epoch 53, mid-election, bumping
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2)
Feb 20 09:45:53 np0005625204.localdomain sudo[298822]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:53 np0005625204.localdomain sudo[298822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/1126955229' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/3661915845' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:53 np0005625204.localdomain sudo[298822]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 13
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:45:46.327222+0000
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 37s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625204,np0005625202)
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : Cluster is now healthy
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0)
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:45:53 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0)
Feb 20 09:45:54 np0005625204.localdomain sudo[298863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:54 np0005625204.localdomain sudo[298863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain sudo[298863]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain sudo[298883]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:45:54 np0005625204.localdomain sudo[298883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:54 np0005625204.localdomain sudo[298883]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:54 np0005625204.localdomain sudo[298901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:54 np0005625204.localdomain sudo[298901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:54 np0005625204.localdomain sudo[298901]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3402059240' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.335 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 7.573s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.401 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.402 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: monmap epoch 13
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:45:46.327222+0000
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: mgrmap e36: np0005625203.lonygy(active, since 37s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625204,np0005625202)
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: Cluster is now healthy
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.106:0/2284102821' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.107:0/3624088180' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:54 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/3402059240' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.610 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.612 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11742MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.612 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.613 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.715 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.715 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.716 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.781 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.856 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.856 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.879 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.899 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:45:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:54.943 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1060203723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.402 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.407 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.426 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.427 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.428 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.815s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.428 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.428 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.445 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 09:45:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:55.446 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: Deploying daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: from='client.44464 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625201.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: Removed label mon from host np0005625201.localdomain
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.108:0/1060203723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 20 09:45:55 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:56.279 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:56.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:45:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:56.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:45:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:56.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:56.314 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:45:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:56.315 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:45:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:45:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:45:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:45:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:45:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:45:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:45:56 np0005625204.localdomain ceph-mon[288586]: from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625201.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:45:56 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:56 np0005625204.localdomain ceph-mon[288586]: Removed label mgr from host np0005625201.localdomain
Feb 20 09:45:56 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 20 09:45:56 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0)
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0)
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:45:57 np0005625204.localdomain sudo[298943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:45:57 np0005625204.localdomain sudo[298943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:45:57 np0005625204.localdomain sudo[298943]: pam_unix(sudo:session): session closed for user root
Feb 20 09:45:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:57.448 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:57.450 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:57.471 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:57.472 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:45:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:57.472 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e13  adding peer [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to list of hints
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader).monmap v13 adding/updating np0005625201 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] to monitor cluster
Feb 20 09:45:57 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fc44160 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: paxos.0).electionLogic(56) init, last seen epoch 56
Feb 20 09:45:57 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.106 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.106 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.106 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.107 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.467 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.482 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.482 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.483 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.483 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.484 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.484 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:45:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:45:58.485 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:46:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:46:00 np0005625204.localdomain systemd[1]: tmp-crun.0BZQmR.mount: Deactivated successfully.
Feb 20 09:46:00 np0005625204.localdomain podman[298961]: 2026-02-20 09:46:00.143589359 +0000 UTC m=+0.078807271 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 09:46:00 np0005625204.localdomain podman[298961]: 2026-02-20 09:46:00.177519028 +0000 UTC m=+0.112736960 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:46:00 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:46:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:01.316 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:01.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:01.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:01.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:01.351 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:01.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: paxos.0).electionLogic(57) init, last seen epoch 57, mid-election, bumping
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203,np0005625201 in quorum (ranks 0,1,2,3)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 14
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:45:57.556107+0000
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 46s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='client.44479 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625201.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: Removed label _admin from host np0005625201.localdomain
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625201 calling monitor election
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203,np0005625201 in quorum (ranks 0,1,2,3)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: monmap epoch 14
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:45:57.556107+0000
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mgrmap e36: np0005625203.lonygy(active, since 46s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:02 np0005625204.localdomain sudo[298981]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:46:02 np0005625204.localdomain sudo[298981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:02 np0005625204.localdomain sudo[298981]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:02 np0005625204.localdomain sudo[298999]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:46:02 np0005625204.localdomain sudo[298999]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:02 np0005625204.localdomain sudo[298999]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:02 np0005625204.localdomain sudo[299017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:02 np0005625204.localdomain sudo[299017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:02 np0005625204.localdomain sudo[299017]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0)
Feb 20 09:46:02 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0)
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:03 np0005625204.localdomain sudo[299035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:03 np0005625204.localdomain sudo[299035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299035]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:03 np0005625204.localdomain sudo[299053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299053]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299087]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:03 np0005625204.localdomain sudo[299087]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299087]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:03 np0005625204.localdomain sudo[299105]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299105]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:46:03 np0005625204.localdomain sudo[299123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299123]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:03 np0005625204.localdomain sudo[299141]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299141]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:03 np0005625204.localdomain sudo[299159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299159]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/3151353263' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='client.? 172.18.0.32:0/3151353263' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: Removing np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:03 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:03 np0005625204.localdomain sudo[299177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:03 np0005625204.localdomain sudo[299177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299177]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:03 np0005625204.localdomain sudo[299195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:03 np0005625204.localdomain sudo[299195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:03 np0005625204.localdomain sudo[299195]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:04 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:04 np0005625204.localdomain sudo[299213]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:04 np0005625204.localdomain sudo[299213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:04 np0005625204.localdomain sudo[299213]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:04 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:04 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:04 np0005625204.localdomain sshd[299231]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:46:04 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:04 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain sudo[299249]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:05 np0005625204.localdomain sudo[299249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:05 np0005625204.localdomain sudo[299249]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:05 np0005625204.localdomain sudo[299267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:05 np0005625204.localdomain sudo[299267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:05 np0005625204.localdomain sudo[299267]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:05 np0005625204.localdomain sudo[299285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:05 np0005625204.localdomain sudo[299285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:05 np0005625204.localdomain sudo[299285]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 20 09:46:05 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:05.743 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:05.765 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 20 09:46:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:05.766 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:46:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:05.767 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:46:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:05.789 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.022s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:46:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:46:06.009 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:46:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:46:06.009 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:46:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:46:06.010 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:46:06 np0005625204.localdomain sshd[299231]: Received disconnect from 188.166.218.64 port 35428:11: Bye Bye [preauth]
Feb 20 09:46:06 np0005625204.localdomain sshd[299231]: Disconnected from authenticating user root 188.166.218.64 port 35428 [preauth]
Feb 20 09:46:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:06.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:06.355 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:06 np0005625204.localdomain ceph-mon[288586]: Removing daemon mgr.np0005625201.mtnyvu from np0005625201.localdomain -- ports [8765]
Feb 20 09:46:07 np0005625204.localdomain ceph-mon[288586]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command({"prefix": "auth rm", "entity": "mgr.np0005625201.mtnyvu"} v 0)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth rm", "entity": "mgr.np0005625201.mtnyvu"} : dispatch
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix": "auth rm", "entity": "mgr.np0005625201.mtnyvu"}]': finished
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mgr}] v 0)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:08 np0005625204.localdomain systemd[1]: tmp-crun.tgU8Om.mount: Deactivated successfully.
Feb 20 09:46:08 np0005625204.localdomain podman[299303]: 2026-02-20 09:46:08.170613232 +0000 UTC m=+0.097814587 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e14 handle_command mon_command({"prefix": "mon rm", "name": "np0005625201"} v 0)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625201"} : dispatch
Feb 20 09:46:08 np0005625204.localdomain podman[299303]: 2026-02-20 09:46:08.17794928 +0000 UTC m=+0.105150625 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:46:08 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fd600 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 20 09:46:08 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: paxos.0).electionLogic(60) init, last seen epoch 60
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : monmap epoch 15
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : last_changed 2026-02-20T09:46:08.177805+0000
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [DBG] : mgrmap e36: np0005625203.lonygy(active, since 51s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(cluster) log [INF] : overall HEALTH_OK
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 20 09:46:08 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: Removing key for mgr.np0005625201.mtnyvu
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203'])
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: Removing monitor np0005625201 from monmap...
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports []
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 calling monitor election
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625203 calling monitor election
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625202 calling monitor election
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2)
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: monmap epoch 15
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: last_changed 2026-02-20T09:46:08.177805+0000
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: min_mon_release 18 (reef)
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: election_strategy: 1
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mgrmap e36: np0005625203.lonygy(active, since 51s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: overall HEALTH_OK
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005625201.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:46:09 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:46:10 np0005625204.localdomain sudo[299324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:46:10 np0005625204.localdomain sudo[299324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:10 np0005625204.localdomain sudo[299324]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.257847) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770257950, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1216, "num_deletes": 255, "total_data_size": 1196283, "memory_usage": 1228576, "flush_reason": "Manual Compaction"}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770266532, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 943693, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20156, "largest_seqno": 21367, "table_properties": {"data_size": 938360, "index_size": 2484, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 15099, "raw_average_key_size": 21, "raw_value_size": 925920, "raw_average_value_size": 1309, "num_data_blocks": 103, "num_entries": 707, "num_filter_entries": 707, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580732, "oldest_key_time": 1771580732, "file_creation_time": 1771580770, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 8739 microseconds, and 4967 cpu microseconds.
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.266597) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 943693 bytes OK
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.266626) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268284) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268304) EVENT_LOG_v1 {"time_micros": 1771580770268298, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268325) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1189976, prev total WAL file size 1189976, number of live WAL files 2.
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268930) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323733' seq:72057594037927935, type:22 .. '6B760031353239' seq:0, type:0; will stop at (end)
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(921KB)], [27(17MB)]
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770268969, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 19104695, "oldest_snapshot_seqno": -1}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11094 keys, 18043053 bytes, temperature: kUnknown
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770342331, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 18043053, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17980147, "index_size": 34069, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 297635, "raw_average_key_size": 26, "raw_value_size": 17791113, "raw_average_value_size": 1603, "num_data_blocks": 1284, "num_entries": 11094, "num_filter_entries": 11094, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580480, "oldest_key_time": 0, "file_creation_time": 1771580770, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "ff5418ad-30e3-42a0-9ea4-01185f113ffa", "db_session_id": "RDMWWACFW9Z8Q9K53AN8", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.342751) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 18043053 bytes
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.344251) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 260.0 rd, 245.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 17.3 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(39.4) write-amplify(19.1) OK, records in: 11641, records dropped: 547 output_compression: NoCompression
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.344283) EVENT_LOG_v1 {"time_micros": 1771580770344267, "job": 14, "event": "compaction_finished", "compaction_time_micros": 73483, "compaction_time_cpu_micros": 38850, "output_level": 6, "num_output_files": 1, "total_output_size": 18043053, "num_input_records": 11641, "num_output_records": 11094, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770344592, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580770347157, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.268863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347221) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347234) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: rocksdb: (Original Log Time 2026/02/20-09:46:10.347237) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: Added label _no_schedule to host np0005625201.localdomain
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625201.localdomain
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:10 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:46:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:11.356 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:11.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:11.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:11.359 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:11.390 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:11.391 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain.devices.0}] v 0)
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625201.localdomain}] v 0)
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:11 np0005625204.localdomain sudo[299342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:46:11 np0005625204.localdomain sudo[299342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:11 np0005625204.localdomain sudo[299342]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:11 np0005625204.localdomain sudo[299360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:46:11 np0005625204.localdomain sudo[299360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:11 np0005625204.localdomain sudo[299360]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:11 np0005625204.localdomain sudo[299378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:11 np0005625204.localdomain sudo[299378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:11 np0005625204.localdomain sudo[299378]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:11 np0005625204.localdomain sudo[299396]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:11 np0005625204.localdomain sudo[299396]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:11 np0005625204.localdomain sudo[299396]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: from='client.44497 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005625201.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:11 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:11 np0005625204.localdomain sudo[299414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:11 np0005625204.localdomain sudo[299414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:11 np0005625204.localdomain sudo[299414]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} v 0)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"}]': finished
Feb 20 09:46:12 np0005625204.localdomain sudo[299448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:12 np0005625204.localdomain sudo[299448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299448]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:12 np0005625204.localdomain sudo[299466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299466]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:46:12 np0005625204.localdomain sudo[299484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299484]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:12 np0005625204.localdomain sudo[299502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299502]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:12 np0005625204.localdomain sudo[299520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299520]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:12 np0005625204.localdomain sudo[299538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299538]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299556]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:12 np0005625204.localdomain sudo[299556]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299556]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299574]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:12 np0005625204.localdomain sudo[299574]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299574]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain sudo[299608]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:12 np0005625204.localdomain sudo[299608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299608]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain sudo[299626]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:12 np0005625204.localdomain sudo[299626]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain sudo[299626]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"}]': finished
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:12 np0005625204.localdomain sudo[299644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:12 np0005625204.localdomain sudo[299644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain sudo[299644]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:46:12 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain sudo[299662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:46:13 np0005625204.localdomain sudo[299662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:13 np0005625204.localdomain sudo[299662]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='client.44503 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005625201.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: Removed host np0005625201.localdomain
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:13 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:14 np0005625204.localdomain sshd[299680]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:46:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:46:14 np0005625204.localdomain sshd[299680]: Accepted publickey for tripleo-admin from 192.168.122.11 port 59108 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:46:14 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:14 np0005625204.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 20 09:46:14 np0005625204.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 20 09:46:14 np0005625204.localdomain systemd-logind[759]: New session 70 of user tripleo-admin.
Feb 20 09:46:14 np0005625204.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 20 09:46:15 np0005625204.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 20 09:46:15 np0005625204.localdomain podman[299682]: 2026-02-20 09:46:15.039343061 +0000 UTC m=+0.113265265 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 20 09:46:15 np0005625204.localdomain podman[299682]: 2026-02-20 09:46:15.076232415 +0000 UTC m=+0.150154609 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:46:15 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Queued start job for default target Main User Target.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Created slice User Application Slice.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Started Daily Cleanup of User's Temporary Directories.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Reached target Paths.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Reached target Timers.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Starting D-Bus User Message Bus Socket...
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Starting Create User's Volatile Files and Directories...
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Finished Create User's Volatile Files and Directories.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Listening on D-Bus User Message Bus Socket.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Reached target Sockets.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Reached target Basic System.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Reached target Main User Target.
Feb 20 09:46:15 np0005625204.localdomain systemd[299695]: Startup finished in 150ms.
Feb 20 09:46:15 np0005625204.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 20 09:46:15 np0005625204.localdomain systemd[1]: Started Session 70 of User tripleo-admin.
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:46:15 np0005625204.localdomain sshd[299680]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:15 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:46:16 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:16 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:16 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:16 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:16 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 20 09:46:16 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:16 np0005625204.localdomain sudo[299847]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfnhbqtuvomblqtnkyswjizjchbggpie ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771580775.590827-64468-31049326405600/AnsiballZ_lineinfile.py
Feb 20 09:46:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:16.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:16.431 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:16 np0005625204.localdomain sudo[299847]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 09:46:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:16.431 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:46:16 np0005625204.localdomain systemd[1]: tmp-crun.hH9G2T.mount: Deactivated successfully.
Feb 20 09:46:16 np0005625204.localdomain podman[299850]: 2026-02-20 09:46:16.538955473 +0000 UTC m=+0.093472326 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:46:16 np0005625204.localdomain podman[299850]: 2026-02-20 09:46:16.549961124 +0000 UTC m=+0.104477967 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., architecture=x86_64, release=1770267347, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, version=9.7)
Feb 20 09:46:16 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:46:16 np0005625204.localdomain python3[299849]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.105/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 20 09:46:16 np0005625204.localdomain sudo[299847]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:17 np0005625204.localdomain sudo[300012]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wytmkvmumwosvshiqnvcoydtjqshkhqp ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771580776.8434157-64484-73297161628491/AnsiballZ_command.py
Feb 20 09:46:17 np0005625204.localdomain sudo[300012]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 09:46:17 np0005625204.localdomain python3[300014]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.105/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:46:17 np0005625204.localdomain sudo[300012]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:46:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:46:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:46:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:46:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:46:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18270 "" "Go-http-client/1.1"
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:17 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:18 np0005625204.localdomain sudo[300157]: tripleo-admin : TTY=pts/1 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfgmzdsyshisqhpjxyfhfsveullafrnt ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1771580777.5996478-64495-252540225700991/AnsiballZ_command.py
Feb 20 09:46:18 np0005625204.localdomain sudo[300157]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 20 09:46:18 np0005625204.localdomain python3[300159]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.105 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 20 09:46:18 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:46:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:46:19 np0005625204.localdomain systemd[1]: tmp-crun.0ssSMT.mount: Deactivated successfully.
Feb 20 09:46:19 np0005625204.localdomain podman[300162]: 2026-02-20 09:46:19.156197881 +0000 UTC m=+0.089261377 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 20 09:46:19 np0005625204.localdomain podman[300162]: 2026-02-20 09:46:19.189315988 +0000 UTC m=+0.122379524 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:46:19 np0005625204.localdomain podman[300161]: 2026-02-20 09:46:19.203711785 +0000 UTC m=+0.139213260 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:46:19 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:46:19 np0005625204.localdomain podman[300161]: 2026-02-20 09:46:19.244070726 +0000 UTC m=+0.179572171 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 20 09:46:19 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:46:19 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:46:20 np0005625204.localdomain sudo[300157]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:46:20 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:21.433 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:21.435 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:21.435 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:21.435 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:21.436 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:21.438 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:21 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='client.54131 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: Saving service mon spec with placement label:mon
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:22 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:23 np0005625204.localdomain sudo[300219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:23 np0005625204.localdomain sudo[300219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:23 np0005625204.localdomain sudo[300219]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:23 np0005625204.localdomain sudo[300237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:23 np0005625204.localdomain sudo[300237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:23 np0005625204.localdomain ceph-mon[288586]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 2026-02-20 09:46:24.151922625 +0000 UTC m=+0.079582452 container create fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, ceph=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., release=1770267347, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, version=7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:46:24 np0005625204.localdomain systemd[1]: Started libpod-conmon-fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221.scope.
Feb 20 09:46:24 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 2026-02-20 09:46:24.118688234 +0000 UTC m=+0.046348091 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 2026-02-20 09:46:24.231123995 +0000 UTC m=+0.158783822 container init fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, io.buildah.version=1.42.2, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:46:24 np0005625204.localdomain systemd[1]: tmp-crun.S9p2ca.mount: Deactivated successfully.
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 2026-02-20 09:46:24.243900257 +0000 UTC m=+0.171560054 container start fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True)
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 2026-02-20 09:46:24.244164034 +0000 UTC m=+0.171823861 container attach fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_BRANCH=main, GIT_CLEAN=True, version=7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., release=1770267347, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main)
Feb 20 09:46:24 np0005625204.localdomain zealous_nobel[300287]: 167 167
Feb 20 09:46:24 np0005625204.localdomain systemd[1]: libpod-fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221.scope: Deactivated successfully.
Feb 20 09:46:24 np0005625204.localdomain podman[300272]: 2026-02-20 09:46:24.249484775 +0000 UTC m=+0.177144602 container died fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, distribution-scope=public, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, name=rhceph, version=7, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph)
Feb 20 09:46:24 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e15 handle_command mon_command({"prefix": "mon rm", "name": "np0005625204"} v 0)
Feb 20 09:46:24 np0005625204.localdomain ceph-mon[288586]: log_channel(audit) log [INF] : from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625204"} : dispatch
Feb 20 09:46:24 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fe78000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 20 09:46:24 np0005625204.localdomain podman[300292]: 2026-02-20 09:46:24.373196804 +0000 UTC m=+0.113996985 container remove fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_nobel, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, name=rhceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph)
Feb 20 09:46:24 np0005625204.localdomain ceph-mon[288586]: mon.np0005625204@0(leader) e16  removed from monmap, suicide.
Feb 20 09:46:24 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Feb 20 09:46:24 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Feb 20 09:46:24 np0005625204.localdomain systemd[1]: libpod-conmon-fdcc78a4ea01798b1a26760c7a714d69b363938541165fbd87b9fba5dfabd221.scope: Deactivated successfully.
Feb 20 09:46:24 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e261fcf20 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 09:46:24 np0005625204.localdomain sudo[300237]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:24 np0005625204.localdomain sudo[300307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:24 np0005625204.localdomain sudo[300307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:24 np0005625204.localdomain sudo[300307]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:24 np0005625204.localdomain podman[300317]: 2026-02-20 09:46:24.479172842 +0000 UTC m=+0.063827187 container died 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_BRANCH=main)
Feb 20 09:46:24 np0005625204.localdomain podman[300317]: 2026-02-20 09:46:24.512027402 +0000 UTC m=+0.096681747 container remove 4047f9576a636e20370f307ede025d867e1e29730d95093c341ef595b0272f04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=rhceph-container, release=1770267347, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:46:24 np0005625204.localdomain sudo[300359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8 --name mon.np0005625204 --force
Feb 20 09:46:24 np0005625204.localdomain sudo[300359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:24 np0005625204.localdomain sudo[300358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:24 np0005625204.localdomain sudo[300358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:24 np0005625204.localdomain sudo[300358]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:24 np0005625204.localdomain sudo[300394]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:24 np0005625204.localdomain sudo[300394]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: tmp-crun.PXEyS3.mount: Deactivated successfully.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-9c21ab3a300335fcf797a3afe00a80fbacfe0d7a6765fcee7b83fec0346a2ac7-merged.mount: Deactivated successfully.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c11b2bd800e85eafefbce1b7fbc7eb3070dcaa8315a480bcc94482489b7dba9d-merged.mount: Deactivated successfully.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8@mon.np0005625204.service: Deactivated successfully.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: Stopped Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8@mon.np0005625204.service: Consumed 11.306s CPU time.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:46:25 np0005625204.localdomain systemd-sysv-generator[300537]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:46:25 np0005625204.localdomain systemd-rc-local-generator[300532]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:25 np0005625204.localdomain sudo[300359]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:25 np0005625204.localdomain podman[300547]: 
Feb 20 09:46:26 np0005625204.localdomain podman[300547]: 2026-02-20 09:46:26.007276761 +0000 UTC m=+0.078583065 container create 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.openshift.expose-services=, release=1770267347, CEPH_POINT_RELEASE=, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:46:26 np0005625204.localdomain systemd[1]: Started libpod-conmon-0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6.scope.
Feb 20 09:46:26 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:26 np0005625204.localdomain podman[300547]: 2026-02-20 09:46:25.977493578 +0000 UTC m=+0.048799912 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:26 np0005625204.localdomain podman[300547]: 2026-02-20 09:46:26.09562789 +0000 UTC m=+0.166934204 container init 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, GIT_CLEAN=True, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, release=1770267347, io.openshift.expose-services=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph)
Feb 20 09:46:26 np0005625204.localdomain podman[300547]: 2026-02-20 09:46:26.108818393 +0000 UTC m=+0.180124697 container start 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, version=7, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container)
Feb 20 09:46:26 np0005625204.localdomain podman[300547]: 2026-02-20 09:46:26.109480851 +0000 UTC m=+0.180787165 container attach 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=1770267347, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, version=7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public)
Feb 20 09:46:26 np0005625204.localdomain blissful_haibt[300563]: 167 167
Feb 20 09:46:26 np0005625204.localdomain systemd[1]: libpod-0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6.scope: Deactivated successfully.
Feb 20 09:46:26 np0005625204.localdomain podman[300547]: 2026-02-20 09:46:26.114476413 +0000 UTC m=+0.185782727 container died 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, GIT_CLEAN=True, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:46:26 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-14e5931968eb84cd13e0c455b2e48f93a206c0cbcfd1915fd787e5cf140c87df-merged.mount: Deactivated successfully.
Feb 20 09:46:26 np0005625204.localdomain podman[300568]: 2026-02-20 09:46:26.222940882 +0000 UTC m=+0.100165685 container remove 0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_haibt, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:46:26 np0005625204.localdomain systemd[1]: libpod-conmon-0915e94891cf35b3c81b728d92a5c488181ebd7262f3f2e3112dd006804b93d6.scope: Deactivated successfully.
Feb 20 09:46:26 np0005625204.localdomain sudo[300394]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:26.439 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:26 np0005625204.localdomain sudo[300593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:26 np0005625204.localdomain sudo[300593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:26 np0005625204.localdomain sudo[300593]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:46:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:46:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:46:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:46:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:46:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:46:26 np0005625204.localdomain sudo[300611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:26 np0005625204.localdomain sudo[300611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 2026-02-20 09:46:27.089981509 +0000 UTC m=+0.078125682 container create 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:46:27 np0005625204.localdomain systemd[1]: Started libpod-conmon-7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406.scope.
Feb 20 09:46:27 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 2026-02-20 09:46:27.054846084 +0000 UTC m=+0.042990317 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 2026-02-20 09:46:27.157489978 +0000 UTC m=+0.145634161 container init 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, ceph=True, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=)
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 2026-02-20 09:46:27.166871063 +0000 UTC m=+0.155015266 container start 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 2026-02-20 09:46:27.167221354 +0000 UTC m=+0.155365547 container attach 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, version=7, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, release=1770267347, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:46:27 np0005625204.localdomain zealous_mendel[300662]: 167 167
Feb 20 09:46:27 np0005625204.localdomain systemd[1]: libpod-7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406.scope: Deactivated successfully.
Feb 20 09:46:27 np0005625204.localdomain podman[300646]: 2026-02-20 09:46:27.170687342 +0000 UTC m=+0.158831545 container died 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, distribution-scope=public, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True)
Feb 20 09:46:27 np0005625204.localdomain systemd[1]: tmp-crun.L49kiF.mount: Deactivated successfully.
Feb 20 09:46:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ba842494c5f1f056888c4128a39db4121e7231ae3394372f2bdd8b93740e679d-merged.mount: Deactivated successfully.
Feb 20 09:46:27 np0005625204.localdomain podman[300667]: 2026-02-20 09:46:27.280994692 +0000 UTC m=+0.095205974 container remove 7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=zealous_mendel, io.openshift.tags=rhceph ceph, ceph=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, io.buildah.version=1.42.2, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:46:27 np0005625204.localdomain systemd[1]: libpod-conmon-7c31ef7c9feb959e99aa558ed826d91098d5241d759a911742d0c64104b35406.scope: Deactivated successfully.
Feb 20 09:46:27 np0005625204.localdomain sudo[300611]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:27 np0005625204.localdomain sudo[300690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:27 np0005625204.localdomain sudo[300690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:27 np0005625204.localdomain sudo[300690]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:27 np0005625204.localdomain sudo[300708]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:27 np0005625204.localdomain sudo[300708]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 2026-02-20 09:46:28.110699153 +0000 UTC m=+0.076542276 container create 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, distribution-scope=public, release=1770267347, name=rhceph, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:46:28 np0005625204.localdomain systemd[1]: Started libpod-conmon-3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96.scope.
Feb 20 09:46:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 2026-02-20 09:46:28.172408129 +0000 UTC m=+0.138251242 container init 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, GIT_BRANCH=main)
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 2026-02-20 09:46:28.080358576 +0000 UTC m=+0.046201719 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 2026-02-20 09:46:28.184221414 +0000 UTC m=+0.150064497 container start 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, RELEASE=main, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, release=1770267347, io.openshift.expose-services=, io.buildah.version=1.42.2)
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 2026-02-20 09:46:28.184397749 +0000 UTC m=+0.150240902 container attach 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, release=1770267347, version=7, name=rhceph, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:46:28 np0005625204.localdomain friendly_khayyam[300757]: 167 167
Feb 20 09:46:28 np0005625204.localdomain systemd[1]: libpod-3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96.scope: Deactivated successfully.
Feb 20 09:46:28 np0005625204.localdomain podman[300742]: 2026-02-20 09:46:28.186543128 +0000 UTC m=+0.152386221 container died 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, GIT_CLEAN=True, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:46:28 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-53bd40fadbf0de131abf706941ba00e28724579547fca12c5978dfd939b2ed9e-merged.mount: Deactivated successfully.
Feb 20 09:46:28 np0005625204.localdomain podman[300762]: 2026-02-20 09:46:28.279744776 +0000 UTC m=+0.081466126 container remove 3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_khayyam, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, distribution-scope=public, RELEASE=main, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 20 09:46:28 np0005625204.localdomain systemd[1]: libpod-conmon-3a04d405280b89b425afd59097db0a06ca0281be56696f8f8d68471c2b76bd96.scope: Deactivated successfully.
Feb 20 09:46:28 np0005625204.localdomain sudo[300708]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:28 np0005625204.localdomain sudo[300779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:28 np0005625204.localdomain sudo[300779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:28 np0005625204.localdomain sudo[300779]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:28 np0005625204.localdomain sudo[300797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:28 np0005625204.localdomain sudo[300797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:28 np0005625204.localdomain podman[300831]: 
Feb 20 09:46:28 np0005625204.localdomain podman[300831]: 2026-02-20 09:46:28.965713421 +0000 UTC m=+0.080745525 container create 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, name=rhceph, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:46:29 np0005625204.localdomain systemd[1]: Started libpod-conmon-67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac.scope.
Feb 20 09:46:29 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:29 np0005625204.localdomain podman[300831]: 2026-02-20 09:46:29.02967363 +0000 UTC m=+0.144705734 container init 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, version=7, com.redhat.component=rhceph-container, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, architecture=x86_64, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:46:29 np0005625204.localdomain podman[300831]: 2026-02-20 09:46:28.93491515 +0000 UTC m=+0.049947304 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:29 np0005625204.localdomain podman[300831]: 2026-02-20 09:46:29.039005294 +0000 UTC m=+0.154037398 container start 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.42.2, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 20 09:46:29 np0005625204.localdomain podman[300831]: 2026-02-20 09:46:29.039251571 +0000 UTC m=+0.154283715 container attach 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:46:29 np0005625204.localdomain stoic_nightingale[300846]: 167 167
Feb 20 09:46:29 np0005625204.localdomain systemd[1]: libpod-67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac.scope: Deactivated successfully.
Feb 20 09:46:29 np0005625204.localdomain podman[300831]: 2026-02-20 09:46:29.041983448 +0000 UTC m=+0.157015582 container died 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container)
Feb 20 09:46:29 np0005625204.localdomain podman[300851]: 2026-02-20 09:46:29.135050541 +0000 UTC m=+0.085122569 container remove 67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_nightingale, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, release=1770267347, vendor=Red Hat, Inc., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, RELEASE=main, build-date=2026-02-09T10:25:24Z)
Feb 20 09:46:29 np0005625204.localdomain systemd[1]: libpod-conmon-67d67b5f68f47ad4ec677e4c0b229364dbc425c7c4de5203f3be449c4f7757ac.scope: Deactivated successfully.
Feb 20 09:46:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-15bbdd6125e1e7916d5dc8e574c690d7fb227c187df110f7f1dbe90376dc2093-merged.mount: Deactivated successfully.
Feb 20 09:46:29 np0005625204.localdomain sudo[300797]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:29 np0005625204.localdomain sudo[300868]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:29 np0005625204.localdomain sudo[300868]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:29 np0005625204.localdomain sudo[300868]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:29 np0005625204.localdomain sudo[300886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:46:29 np0005625204.localdomain sudo[300886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:46:30 np0005625204.localdomain podman[300978]: 2026-02-20 09:46:30.262979069 +0000 UTC m=+0.093024263 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, ceph=True, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:46:30 np0005625204.localdomain systemd[1]: tmp-crun.bEg5c3.mount: Deactivated successfully.
Feb 20 09:46:30 np0005625204.localdomain podman[300997]: 2026-02-20 09:46:30.346860622 +0000 UTC m=+0.078925965 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute)
Feb 20 09:46:30 np0005625204.localdomain podman[300997]: 2026-02-20 09:46:30.35846838 +0000 UTC m=+0.090533723 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:46:30 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:46:30 np0005625204.localdomain podman[300978]: 2026-02-20 09:46:30.395984071 +0000 UTC m=+0.226029255 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, release=1770267347, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 20 09:46:30 np0005625204.localdomain sudo[300886]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:31 np0005625204.localdomain sudo[301101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:31 np0005625204.localdomain sudo[301101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:31 np0005625204.localdomain sudo[301101]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:31 np0005625204.localdomain sudo[301119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:46:31 np0005625204.localdomain sudo[301119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:31.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:31.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:31.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:31.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:31.461 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:31.461 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:31 np0005625204.localdomain sudo[301119]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:31 np0005625204.localdomain sudo[301170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:46:31 np0005625204.localdomain sudo[301170]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:31 np0005625204.localdomain sudo[301170]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:31 np0005625204.localdomain sudo[301188]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:46:31 np0005625204.localdomain sudo[301188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:31 np0005625204.localdomain sudo[301188]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:32 np0005625204.localdomain sudo[301206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301206]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:32 np0005625204.localdomain sudo[301224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301224]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:32 np0005625204.localdomain sudo[301242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301242]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:32 np0005625204.localdomain sudo[301276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301276]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:32 np0005625204.localdomain sudo[301294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301294]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:46:32 np0005625204.localdomain sudo[301312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301312]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:32 np0005625204.localdomain sudo[301330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301330]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301348]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:32 np0005625204.localdomain sudo[301348]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301348]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:32 np0005625204.localdomain sudo[301366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301366]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:32 np0005625204.localdomain sudo[301384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301384]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:32 np0005625204.localdomain sudo[301402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:32 np0005625204.localdomain sudo[301402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:32 np0005625204.localdomain sudo[301402]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:33 np0005625204.localdomain sudo[301436]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:33 np0005625204.localdomain sudo[301436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:33 np0005625204.localdomain sudo[301436]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:33 np0005625204.localdomain sudo[301454]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:33 np0005625204.localdomain sudo[301454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:33 np0005625204.localdomain sudo[301454]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:33 np0005625204.localdomain sudo[301472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:33 np0005625204.localdomain sudo[301472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:33 np0005625204.localdomain sudo[301472]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:33 np0005625204.localdomain sudo[301490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:46:33 np0005625204.localdomain sudo[301490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:33 np0005625204.localdomain sudo[301490]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:33 np0005625204.localdomain sshd[301508]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:46:34 np0005625204.localdomain sshd[301508]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:46:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:36.461 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:36.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:36.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:36.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:36.500 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:36.501 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:37 np0005625204.localdomain sudo[301510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:37 np0005625204.localdomain sudo[301510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:37 np0005625204.localdomain sudo[301510]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:37 np0005625204.localdomain sudo[301528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:37 np0005625204.localdomain sudo[301528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 2026-02-20 09:46:38.196896738 +0000 UTC m=+0.064440754 container create 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, release=1770267347, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z)
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4.scope.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 2026-02-20 09:46:38.268724819 +0000 UTC m=+0.136268825 container init 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, name=rhceph, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph)
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 2026-02-20 09:46:38.171773197 +0000 UTC m=+0.039317213 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 2026-02-20 09:46:38.280513124 +0000 UTC m=+0.148057140 container start 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, GIT_CLEAN=True, vcs-type=git, release=1770267347, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, com.redhat.component=rhceph-container)
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 2026-02-20 09:46:38.280936995 +0000 UTC m=+0.148481041 container attach 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, release=1770267347, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=)
Feb 20 09:46:38 np0005625204.localdomain great_ramanujan[301604]: 167 167
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: libpod-92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4.scope: Deactivated successfully.
Feb 20 09:46:38 np0005625204.localdomain podman[301588]: 2026-02-20 09:46:38.285404631 +0000 UTC m=+0.152948667 container died 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7)
Feb 20 09:46:38 np0005625204.localdomain podman[301603]: 2026-02-20 09:46:38.343892956 +0000 UTC m=+0.107862042 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:46:38 np0005625204.localdomain podman[301618]: 2026-02-20 09:46:38.366190837 +0000 UTC m=+0.070779623 container remove 92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_ramanujan, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, RELEASE=main, io.openshift.tags=rhceph ceph, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: libpod-conmon-92372455b7ecd2f67e2070857af68712015615c43bc7291e71dabad59a6f7bf4.scope: Deactivated successfully.
Feb 20 09:46:38 np0005625204.localdomain podman[301603]: 2026-02-20 09:46:38.402926056 +0000 UTC m=+0.166895142 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 2026-02-20 09:46:38.440049197 +0000 UTC m=+0.044691806 container create 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, vcs-type=git)
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d.scope.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/895573c518763e1c4853baf8292c323f213e2b007d3114578144c11eb56f7ea1/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 2026-02-20 09:46:38.500186518 +0000 UTC m=+0.104829127 container init 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, RELEASE=main, io.buildah.version=1.42.2, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=)
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 2026-02-20 09:46:38.512601988 +0000 UTC m=+0.117244627 container start 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7)
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 2026-02-20 09:46:38.513815043 +0000 UTC m=+0.118457732 container attach 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347)
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 2026-02-20 09:46:38.422176841 +0000 UTC m=+0.026819470 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: libpod-62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d.scope: Deactivated successfully.
Feb 20 09:46:38 np0005625204.localdomain podman[301649]: 2026-02-20 09:46:38.566764661 +0000 UTC m=+0.171407290 container died 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, distribution-scope=public, io.openshift.expose-services=, GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, release=1770267347, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2)
Feb 20 09:46:38 np0005625204.localdomain podman[301691]: 2026-02-20 09:46:38.626793399 +0000 UTC m=+0.054647376 container remove 62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_diffie, ceph=True, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, version=7, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., RELEASE=main, build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: libpod-conmon-62c63c7ce937178d8c98364ee904a0f9ee6fe9deb51a84bd740872723448ad1d.scope: Deactivated successfully.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:46:38 np0005625204.localdomain systemd-sysv-generator[301736]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:46:38 np0005625204.localdomain systemd-rc-local-generator[301732]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: tmp-crun.z5KUlp.mount: Deactivated successfully.
Feb 20 09:46:38 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-877f099077b7ebd5ab419ce7e06f0d641f94be10dea400192a038ac02adb2f37-merged.mount: Deactivated successfully.
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: Reloading.
Feb 20 09:46:39 np0005625204.localdomain systemd-rc-local-generator[301771]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 20 09:46:39 np0005625204.localdomain systemd-sysv-generator[301777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: Starting Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8...
Feb 20 09:46:39 np0005625204.localdomain podman[301839]: 
Feb 20 09:46:39 np0005625204.localdomain podman[301839]: 2026-02-20 09:46:39.671009229 +0000 UTC m=+0.084203193 container create b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, CEPH_POINT_RELEASE=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, io.openshift.expose-services=, name=rhceph)
Feb 20 09:46:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ab379e22583f9c4eb77f0717103a939beb8b20cf7560cd3809c15db78012938/merged/var/lib/ceph/mon/ceph-np0005625204 supports timestamps until 2038 (0x7fffffff)
Feb 20 09:46:39 np0005625204.localdomain podman[301839]: 2026-02-20 09:46:39.728766783 +0000 UTC m=+0.141960747 container init b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, release=1770267347, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhceph, vcs-type=git, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:46:39 np0005625204.localdomain podman[301839]: 2026-02-20 09:46:39.634813965 +0000 UTC m=+0.048007969 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:39 np0005625204.localdomain podman[301839]: 2026-02-20 09:46:39.743938592 +0000 UTC m=+0.157132556 container start b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mon-np0005625204, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, release=1770267347, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:46:39 np0005625204.localdomain bash[301839]: b8341c3094c907ff52dc85cef8921faf70d656a19c0e961cb91363491d2b3a7c
Feb 20 09:46:39 np0005625204.localdomain systemd[1]: Started Ceph mon.np0005625204 for a8557ee9-b55d-5519-942c-cf8f6172f1d8.
Feb 20 09:46:39 np0005625204.localdomain sudo[301528]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: set uid:gid to 167:167 (ceph:ceph)
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mon, pid 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: pidfile_write: ignore empty --pid-file
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: load: jerasure load: lrc 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: RocksDB version: 7.9.2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Git sha 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Compile date 2026-02-06 00:00:00
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: DB SUMMARY
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: DB Session ID:  OMQD63SADIG5WJVO9ZZI
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: CURRENT file:  CURRENT
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: IDENTITY file:  IDENTITY
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005625204/store.db dir, Total Num: 0, files: 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005625204/store.db: 000004.log size: 636 ; 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                         Options.error_if_exists: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.create_if_missing: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                         Options.paranoid_checks: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                                     Options.env: 0x559a1d482a20
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                                Options.info_log: 0x559a1eac8d20
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.max_file_opening_threads: 16
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                              Options.statistics: (nil)
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                               Options.use_fsync: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.max_log_file_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                         Options.allow_fallocate: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                        Options.use_direct_reads: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.create_missing_column_families: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                              Options.db_log_dir: 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                                 Options.wal_dir: 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.advise_random_on_open: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                    Options.write_buffer_manager: 0x559a1ead9540
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                            Options.rate_limiter: (nil)
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.unordered_write: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                               Options.row_cache: None
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                              Options.wal_filter: None
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.allow_ingest_behind: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.two_write_queues: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.manual_wal_flush: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.wal_compression: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.atomic_flush: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.log_readahead_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.allow_data_in_errors: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.db_host_id: __hostname__
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.max_background_jobs: 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.max_background_compactions: -1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.max_subcompactions: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.max_total_wal_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                          Options.max_open_files: -1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                          Options.bytes_per_sync: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:       Options.compaction_readahead_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.max_background_flushes: -1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Compression algorithms supported:
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kZSTD supported: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kXpressCompression supported: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kBZip2Compression supported: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kLZ4Compression supported: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kZlibCompression supported: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kLZ4HCCompression supported: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         kSnappyCompression supported: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:           Options.merge_operator: 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:        Options.compaction_filter: None
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:        Options.compaction_filter_factory: None
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:  Options.sst_partitioner_factory: None
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x559a1eac8980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x559a1eac51f0
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:        Options.write_buffer_size: 33554432
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:  Options.max_write_buffer_number: 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.compression: NoCompression
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:       Options.prefix_extractor: nullptr
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.num_levels: 7
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.compression_opts.level: 32767
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:               Options.compression_opts.strategy: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                  Options.compression_opts.enabled: false
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                        Options.arena_block_size: 1048576
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.disable_auto_compactions: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.table_properties_collectors: 
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.inplace_update_support: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                           Options.bloom_locality: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                    Options.max_successive_merges: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.paranoid_file_checks: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.force_consistency_checks: 1
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.report_bg_io_stats: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                               Options.ttl: 2592000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                       Options.enable_blob_files: false
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                           Options.min_blob_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                          Options.blob_file_size: 268435456
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb:                Options.blob_file_starting_level: 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005625204/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 316f2b4e-6103-43ad-8119-3359f94ef991
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580799803444, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580799805434, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580799805554, "job": 1, "event": "recovery_finished"}
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x559a1eaece00
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: DB pointer 0x559a1ebe2000
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 does not exist in monmap, will attempt to join an existing cluster
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.72 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.72 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.15 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x559a1eac51f0#2 capacity: 512.00 MB usage: 0.98 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 2e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1,0.77 KB,0.000146031%) FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: using public_addr v2:172.18.0.105:0/0 -> [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0]
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: starting mon.np0005625204 rank -1 at public addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] at bind addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005625204 fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(???) e0 preinit fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:40 np0005625204.localdomain systemd[1]: tmp-crun.NNjUiY.mount: Deactivated successfully.
Feb 20 09:46:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:41.502 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:41.503 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:41.504 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:41.504 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:41.536 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:41.537 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:43 np0005625204.localdomain sudo[301896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:43 np0005625204.localdomain sudo[301896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:43 np0005625204.localdomain sudo[301896]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:43 np0005625204.localdomain sudo[301914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:43 np0005625204.localdomain sudo[301914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 2026-02-20 09:46:43.895182934 +0000 UTC m=+0.075340872 container create 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.42.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, ceph=True, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:46:43 np0005625204.localdomain systemd[1]: Started libpod-conmon-701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264.scope.
Feb 20 09:46:43 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 2026-02-20 09:46:43.864357702 +0000 UTC m=+0.044515740 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 2026-02-20 09:46:43.965960996 +0000 UTC m=+0.146118934 container init 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.42.2, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_CLEAN=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z)
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 2026-02-20 09:46:43.975789415 +0000 UTC m=+0.155947383 container start 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, org.opencontainers.image.created=2026-02-09T10:25:24Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, RELEASE=main, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 2026-02-20 09:46:43.976056382 +0000 UTC m=+0.156214340 container attach 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_CLEAN=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, RELEASE=main, release=1770267347, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64)
Feb 20 09:46:43 np0005625204.localdomain busy_taussig[301962]: 167 167
Feb 20 09:46:43 np0005625204.localdomain systemd[1]: libpod-701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264.scope: Deactivated successfully.
Feb 20 09:46:43 np0005625204.localdomain podman[301948]: 2026-02-20 09:46:43.980496137 +0000 UTC m=+0.160654125 container died 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.42.2, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, version=7, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Feb 20 09:46:44 np0005625204.localdomain podman[301967]: 2026-02-20 09:46:44.057049424 +0000 UTC m=+0.066724719 container remove 701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_taussig, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, RELEASE=main, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, version=7, build-date=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:46:44 np0005625204.localdomain systemd[1]: libpod-conmon-701cdd13ec26558ac783f929a592a9136d1a836a43d9419366e8cb98ffe2f264.scope: Deactivated successfully.
Feb 20 09:46:44 np0005625204.localdomain sudo[301914]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:44 np0005625204.localdomain sudo[301983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:44 np0005625204.localdomain sudo[301983]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:44 np0005625204.localdomain sudo[301983]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:44 np0005625204.localdomain sudo[302001]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:46:44 np0005625204.localdomain sudo[302001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:44 np0005625204.localdomain systemd[1]: tmp-crun.tv3LaQ.mount: Deactivated successfully.
Feb 20 09:46:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-92c5f6363cedad4fc4c69b0160487342e6839da5c00a7b13ffebb821e3f3ff20-merged.mount: Deactivated successfully.
Feb 20 09:46:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:46:45 np0005625204.localdomain systemd[1]: tmp-crun.a18jMS.mount: Deactivated successfully.
Feb 20 09:46:45 np0005625204.localdomain podman[302093]: 2026-02-20 09:46:45.149884498 +0000 UTC m=+0.110260130 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.openshift.tags=rhceph ceph, release=1770267347, vendor=Red Hat, Inc., build-date=2026-02-09T10:25:24Z, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:46:45 np0005625204.localdomain systemd[1]: tmp-crun.cREDQI.mount: Deactivated successfully.
Feb 20 09:46:45 np0005625204.localdomain podman[302111]: 2026-02-20 09:46:45.261083823 +0000 UTC m=+0.103873999 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:46:45 np0005625204.localdomain podman[302111]: 2026-02-20 09:46:45.276913782 +0000 UTC m=+0.119703938 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:46:45 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:46:45 np0005625204.localdomain podman[302093]: 2026-02-20 09:46:45.335498099 +0000 UTC m=+0.295873701 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_CLEAN=True, ceph=True, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, vcs-type=git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:46:45 np0005625204.localdomain sudo[302001]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing) e16 sync_obtain_latest_monmap
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing) e16 sync_obtain_latest_monmap obtained monmap e16
Feb 20 09:46:46 np0005625204.localdomain sshd[302240]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:46:46 np0005625204.localdomain sudo[302242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:46:46 np0005625204.localdomain sudo[302242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:46 np0005625204.localdomain sudo[302242]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:46.538 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:46.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:46.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:46:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:46.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:46.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:46.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).mds e17 new map
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).mds e17 print_map
                                                           e17
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        16
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2026-02-20T07:58:28.398421+0000
                                                           modified        2026-02-20T09:40:14.722031+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        83
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26854}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26854 members: 26854
                                                           [mds.mds.np0005625203.zsrwgk{0:26854} state up:active seq 13 addr [v2:172.18.0.107:6808/3334119751,v1:172.18.0.107:6809/3334119751] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005625202.akhmop{-1:17124} state up:standby seq 1 addr [v2:172.18.0.106:6808/3865978972,v1:172.18.0.106:6809/3865978972] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005625204.wnsphl{-1:26848} state up:standby seq 1 addr [v2:172.18.0.108:6808/2508223371,v1:172.18.0.108:6809/2508223371] compat {c=[1],r=[1],i=[17ff]}]
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 3314933000852226048, adjusting msgr requires
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625201 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625201 is new leader, mons np0005625201,np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2,3)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: monmap epoch 12
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:45:39.346453+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 27s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.27628 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625201", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Remove daemons mon.np0005625201
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203'])
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing monitor np0005625201 from monmap...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports []
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202 in quorum (ranks 0,1)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: monmap epoch 13
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:45:46.327222+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 34s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Health check failed: 1/3 mons down, quorum np0005625204,np0005625202 (MON_DOWN)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Health detail: HEALTH_WARN 1/3 mons down, quorum np0005625204,np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005625204,np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]:     mon.np0005625203 (rank 2) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625201.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1126955229' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3661915845' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: monmap epoch 13
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:45:46.327222+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 37s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005625204,np0005625202)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Cluster is now healthy
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2284102821' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3624088180' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3402059240' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Deploying daemon mon.np0005625201 on np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44464 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625201.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removed label mon from host np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1060203723' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44473 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625201.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removed label mgr from host np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44479 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005625201.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removed label _admin from host np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625201 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203,np0005625201 in quorum (ranks 0,1,2,3)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: monmap epoch 14
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:45:57.556107+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 46s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3151353263' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3151353263' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing np0005625201.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing np0005625201.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625201"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing daemon mgr.np0005625201.mtnyvu from np0005625201.localdomain -- ports [8765]
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing key for mgr.np0005625201.mtnyvu
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Safe to remove mon.np0005625201: new quorum should be ['np0005625204', 'np0005625202', 'np0005625203'] (from ['np0005625204', 'np0005625202', 'np0005625203'])
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing monitor np0005625201 from monmap...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing daemon mon.np0005625201 from np0005625201.localdomain -- ports []
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 is new leader, mons np0005625204,np0005625202,np0005625203 in quorum (ranks 0,1,2)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: monmap epoch 15
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:46:08.177805+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 51s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44494 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005625201.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Added label _no_schedule to host np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44497 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005625201.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain"}]': finished
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44503 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005625201.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removed host np0005625201.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.54131 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Saving service mon spec with placement label:mon
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44515 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005625204"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Remove daemons mon.np0005625204
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Safe to remove mon.np0005625204: new quorum should be ['np0005625202', 'np0005625203'] (from ['np0005625202', 'np0005625203'])
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing monitor np0005625204 from monmap...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon rm", "name": "np0005625204"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Removing daemon mon.np0005625204 from np0005625204.localdomain -- ports []
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 is new leader, mons np0005625202,np0005625203 in quorum (ranks 0,1)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: monmap epoch 16
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:46:24.360760+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 68s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.0 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.3 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='client.44518 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005625204.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Deploying daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v45: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v46: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: pgmap v47: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(synchronizing).paxosservice(auth 1..43) refresh upgraded, format 0 -> 3
Feb 20 09:46:46 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x562e2fe789a0 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 20 09:46:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:46:47 np0005625204.localdomain podman[302260]: 2026-02-20 09:46:47.149673206 +0000 UTC m=+0.083073097 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, version=9.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, 
build-date=2026-02-05T04:57:10Z, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=)
Feb 20 09:46:47 np0005625204.localdomain podman[302260]: 2026-02-20 09:46:47.164026796 +0000 UTC m=+0.097426627 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:46:47 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:46:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:46:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:46:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:46:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:46:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:47.724 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:46:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18266 "" "Go-http-client/1.1"
Feb 20 09:46:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:47.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:46:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:47.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:46:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:47.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:46:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:47.744 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:46:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:47.744 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:46:47 np0005625204.localdomain sshd[302240]: Received disconnect from 103.191.14.210 port 55448:11: Bye Bye [preauth]
Feb 20 09:46:47 np0005625204.localdomain sshd[302240]: Disconnected from authenticating user root 103.191.14.210 port 55448 [preauth]
Feb 20 09:46:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(probing) e16 handle_auth_request failed to assign global_id
Feb 20 09:46:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(probing) e16 handle_auth_request failed to assign global_id
Feb 20 09:46:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(probing) e16 handle_auth_request failed to assign global_id
Feb 20 09:46:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@-1(probing) e17  my rank is now 2 (was -1)
Feb 20 09:46:48 np0005625204.localdomain ceph-mon[301857]: log_channel(cluster) log [INF] : mon.np0005625204 calling monitor election
Feb 20 09:46:48 np0005625204.localdomain ceph-mon[301857]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 20 09:46:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id
Feb 20 09:46:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:46:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:46:50 np0005625204.localdomain podman[302292]: 2026-02-20 09:46:50.147040509 +0000 UTC m=+0.077975511 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:46:50 np0005625204.localdomain podman[302292]: 2026-02-20 09:46:50.157032335 +0000 UTC m=+0.087967357 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 20 09:46:50 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:46:50 np0005625204.localdomain podman[302291]: 2026-02-20 09:46:50.25672683 +0000 UTC m=+0.190739536 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:46:50 np0005625204.localdomain podman[302291]: 2026-02-20 09:46:50.35426412 +0000 UTC m=+0.288276816 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 09:46:50 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:46:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(electing) e17 handle_auth_request failed to assign global_id
Feb 20 09:46:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:51.576 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:46:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:51.578 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(electing) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mgrc update_daemon_metadata mon.np0005625204 metadata {addrs=[v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable),ceph_version_short=18.2.1-381.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005625204.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005625204.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116612,os=Linux}
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_auth_request failed to assign global_id
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 calling monitor election
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625203 calling monitor election
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: pgmap v48: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204 calling monitor election
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: pgmap v49: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: pgmap v50: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625202 is new leader, mons np0005625202,np0005625203,np0005625204 in quorum (ranks 0,1,2)
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: monmap epoch 17
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: last_changed 2026-02-20T09:46:46.606881+0000
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: created 2026-02-20T07:36:51.191305+0000
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: min_mon_release 18 (reef)
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: election_strategy: 1
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625204
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: fsmap cephfs:1 {0=mds.np0005625203.zsrwgk=up:active} 2 up:standby
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: osdmap e89: 6 total, 6 up, 6 in
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: mgrmap e36: np0005625203.lonygy(active, since 95s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:46:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/542821844' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.680 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.935s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.749 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.751 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: mgrmap e37: np0005625203.lonygy(active, since 95s), standbys: np0005625204.exgrzx, np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/1873611277' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2223140922' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/198507552' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/542821844' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.953 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.955 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11740MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.956 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:46:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:52.956 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.023 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.024 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.024 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.061 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:46:53 np0005625204.localdomain sudo[302365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:46:53 np0005625204.localdomain sudo[302365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:53 np0005625204.localdomain sudo[302365]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1416339821' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.519731) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813519855, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 11856, "num_deletes": 254, "total_data_size": 20741104, "memory_usage": 21988520, "flush_reason": "Manual Compaction"}
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.522 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:46:53 np0005625204.localdomain sudo[302383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:46:53 np0005625204.localdomain sudo[302383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.530 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:46:53 np0005625204.localdomain sudo[302383]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.553 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.557 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:46:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:53.558 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813590524, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 16372900, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 11861, "table_properties": {"data_size": 16308556, "index_size": 34542, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28741, "raw_key_size": 307741, "raw_average_key_size": 26, "raw_value_size": 16114056, "raw_average_value_size": 1404, "num_data_blocks": 1305, "num_entries": 11475, "num_filter_entries": 11475, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580806, "oldest_key_time": 1771580806, "file_creation_time": 1771580813, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 70996 microseconds, and 32368 cpu microseconds.
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.590730) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 16372900 bytes OK
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.590798) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.592765) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.592791) EVENT_LOG_v1 {"time_micros": 1771580813592783, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.592817) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 20658698, prev total WAL file size 20680587, number of live WAL files 2.
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.598005) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130373933' seq:72057594037927935, type:22 .. '7061786F73003131303435' seq:0, type:0; will stop at (end)
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(15MB) 8(1762B)]
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813598144, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 16374662, "oldest_snapshot_seqno": -1}
Feb 20 09:46:53 np0005625204.localdomain sudo[302404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:53 np0005625204.localdomain sudo[302404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:53 np0005625204.localdomain sudo[302404]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 11225 keys, 16369345 bytes, temperature: kUnknown
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813678336, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 16369345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16305678, "index_size": 34510, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28101, "raw_key_size": 302972, "raw_average_key_size": 26, "raw_value_size": 16114461, "raw_average_value_size": 1435, "num_data_blocks": 1304, "num_entries": 11225, "num_filter_entries": 11225, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771580813, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.678750) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 16369345 bytes
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.680716) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 203.9 rd, 203.8 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(15.6, 0.0 +0.0 blob) out(15.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 11480, records dropped: 255 output_compression: NoCompression
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.680749) EVENT_LOG_v1 {"time_micros": 1771580813680733, "job": 4, "event": "compaction_finished", "compaction_time_micros": 80315, "compaction_time_cpu_micros": 46153, "output_level": 6, "num_output_files": 1, "total_output_size": 16369345, "num_input_records": 11480, "num_output_records": 11225, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:46:53 np0005625204.localdomain sudo[302422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813683364, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580813683432, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:46:53.597863) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:46:53 np0005625204.localdomain sudo[302422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:53 np0005625204.localdomain sudo[302422]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:53 np0005625204.localdomain sudo[302440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:53 np0005625204.localdomain sudo[302440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:53 np0005625204.localdomain sudo[302440]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: pgmap v51: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1604508930' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3691043963' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1416339821' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:53 np0005625204.localdomain sudo[302474]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:53 np0005625204.localdomain sudo[302474]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:53 np0005625204.localdomain sudo[302474]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302492]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:46:54 np0005625204.localdomain sudo[302492]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302492]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:46:54 np0005625204.localdomain sudo[302510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302510]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:54 np0005625204.localdomain sudo[302528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302528]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:46:54 np0005625204.localdomain sudo[302546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302546]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:54 np0005625204.localdomain sudo[302564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302564]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:46:54 np0005625204.localdomain sudo[302582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302582]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:54 np0005625204.localdomain sudo[302600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302600]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:54 np0005625204.localdomain sudo[302634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302634]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain sudo[302652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:46:54 np0005625204.localdomain sudo[302652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302652]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:54 np0005625204.localdomain ceph-mon[301857]: from='client.44535 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:46:54 np0005625204.localdomain ceph-mon[301857]: Reconfig service osd.default_drive_group
Feb 20 09:46:54 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:54 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:54 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:46:54 np0005625204.localdomain sudo[302670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:54 np0005625204.localdomain sudo[302670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:54 np0005625204.localdomain sudo[302670]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:55 np0005625204.localdomain sshd[302688]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:46:55 np0005625204.localdomain sudo[302690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:46:55 np0005625204.localdomain sudo[302690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:55 np0005625204.localdomain sudo[302690]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:55.555 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:55.557 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:55.557 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:46:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:55.558 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: pgmap v52: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625202", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:46:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.157 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.157 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.158 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.158 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:46:56 np0005625204.localdomain sshd[302688]: Received disconnect from 203.228.30.198 port 35202:11: Bye Bye [preauth]
Feb 20 09:46:56 np0005625204.localdomain sshd[302688]: Disconnected from authenticating user root 203.228.30.198 port 35202 [preauth]
Feb 20 09:46:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:46:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:46:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:46:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:46:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:46:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625202 (monmap changed)...
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625202 on np0005625202.localdomain
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.836 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.853 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.853 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.854 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.854 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.854 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:46:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:46:56.855 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.2 (monmap changed)...
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.2 on np0005625202.localdomain
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: pgmap v53: 177 pgs: 177 active+clean; 104 MiB data, 587 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 20 09:46:57 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e89 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e89 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 e90: 6 total, 6 up, 6 in
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr handle_mgr_map Activating!
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr handle_mgr_map I am now activating
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005625202"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005625203"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata", "id": "np0005625204"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 0
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 0
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 0
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).mds e17 all = 1
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: balancer
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] Starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] Optimize plan auto_2026-02-20_09:46:58
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Feb 20 09:46:58 np0005625204.localdomain sshd[297095]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:46:58 np0005625204.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Feb 20 09:46:58 np0005625204.localdomain systemd[1]: session-69.scope: Consumed 26.523s CPU time.
Feb 20 09:46:58 np0005625204.localdomain systemd-logind[759]: Session 69 logged out. Waiting for processes to exit.
Feb 20 09:46:58 np0005625204.localdomain systemd-logind[759]: Removed session 69.
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [cephadm WARNING root] removing stray HostCache host record np0005625201.localdomain.devices.0
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [WRN] : removing stray HostCache host record np0005625201.localdomain.devices.0
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: cephadm
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: crash
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: devicehealth
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [devicehealth INFO root] Starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: iostat
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: nfs
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: orchestrator
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: pg_autoscaler
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: progress
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] _maybe_adjust
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Loading...
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f08ca4e71f0>, <progress.module.GhostEvent object at 0x7f08ca4e7220>, <progress.module.GhostEvent object at 0x7f08ca4e7250>, <progress.module.GhostEvent object at 0x7f08ca4e7280>, <progress.module.GhostEvent object at 0x7f08ca4e72b0>, <progress.module.GhostEvent object at 0x7f08ca4e72e0>, <progress.module.GhostEvent object at 0x7f08ca4e7310>, <progress.module.GhostEvent object at 0x7f08ca4e7340>, <progress.module.GhostEvent object at 0x7f08ca4e7370>, <progress.module.GhostEvent object at 0x7f08ca4e73a0>, <progress.module.GhostEvent object at 0x7f08ca4e73d0>, <progress.module.GhostEvent object at 0x7f08ca4e7400>, <progress.module.GhostEvent object at 0x7f08ca4e7430>, <progress.module.GhostEvent object at 0x7f08ca4e7460>, <progress.module.GhostEvent object at 0x7f08ca4e7490>, <progress.module.GhostEvent object at 0x7f08ca4e74c0>, <progress.module.GhostEvent object at 0x7f08ca4e74f0>, <progress.module.GhostEvent object at 0x7f08ca4e7520>, <progress.module.GhostEvent object at 0x7f08ca4e7550>, <progress.module.GhostEvent object at 0x7f08ca4e7580>, <progress.module.GhostEvent object at 0x7f08ca4e75b0>, <progress.module.GhostEvent object at 0x7f08ca4e75e0>, <progress.module.GhostEvent object at 0x7f08ca4e7610>, <progress.module.GhostEvent object at 0x7f08ca4e7640>, <progress.module.GhostEvent object at 0x7f08ca4e7670>, <progress.module.GhostEvent object at 0x7f08ca4e76a0>, <progress.module.GhostEvent object at 0x7f08ca4e76d0>, <progress.module.GhostEvent object at 0x7f08ca4e7700>, <progress.module.GhostEvent object at 0x7f08ca4e7730>, <progress.module.GhostEvent object at 0x7f08ca4e7760>, <progress.module.GhostEvent object at 0x7f08ca4e7790>, <progress.module.GhostEvent object at 0x7f08ca4e77c0>, <progress.module.GhostEvent object at 0x7f08ca4e77f0>, <progress.module.GhostEvent object at 0x7f08ca4e7820>, <progress.module.GhostEvent object at 0x7f08ca4e7850>, <progress.module.GhostEvent object at 0x7f08ca4e7880>, <progress.module.GhostEvent object at 0x7f08ca4e78b0>, <progress.module.GhostEvent object at 0x7f08ca4e78e0>, <progress.module.GhostEvent object at 0x7f08ca4e7910>, <progress.module.GhostEvent object at 0x7f08ca4e7940>, <progress.module.GhostEvent object at 0x7f08ca4e7970>, <progress.module.GhostEvent object at 0x7f08ca4e79a0>, <progress.module.GhostEvent object at 0x7f08ca4e79d0>, <progress.module.GhostEvent object at 0x7f08ca4e7a00>, <progress.module.GhostEvent object at 0x7f08ca4e7a30>, <progress.module.GhostEvent object at 0x7f08ca4e7a60>, <progress.module.GhostEvent object at 0x7f08ca4e7a90>, <progress.module.GhostEvent object at 0x7f08ca4e7ac0>, <progress.module.GhostEvent object at 0x7f08ca4e7af0>, <progress.module.GhostEvent object at 0x7f08ca4e7b20>] historic events
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Loaded OSDMap, ready.
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] recovery thread starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] starting setup
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: rbd_support
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: restful
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [restful INFO root] server_addr: :: server_port: 8003
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: status
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [restful WARNING root] server not running: no certificate configured
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: telemetry
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] PerfHandler: starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_task_task: vms, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: mgr load Constructed class from module: volumes
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_task_task: volumes, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.483+0000 7f08b842b640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:46:58.492+0000 7f08b4c24640 -1 client.0 error registering admin socket command: (17) File exists
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_task_task: images, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_task_task: backups, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] TaskHandler: starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} v 0)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Feb 20 09:46:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] setup complete
Feb 20 09:46:58 np0005625204.localdomain sshd[302847]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:46:58 np0005625204.localdomain sshd[302847]: Accepted publickey for ceph-admin from 192.168.122.108 port 34544 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 09:46:58 np0005625204.localdomain systemd-logind[759]: New session 72 of user ceph-admin.
Feb 20 09:46:58 np0005625204.localdomain systemd[1]: Started Session 72 of User ceph-admin.
Feb 20 09:46:58 np0005625204.localdomain sshd[302847]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:46:58 np0005625204.localdomain sudo[302851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.5 (monmap changed)...
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.5 on np0005625202.localdomain
Feb 20 09:46:58 np0005625204.localdomain sudo[302851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' 
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 ' entity='mgr.np0005625203.lonygy' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.26986 172.18.0.107:0/211199661' entity='mgr.np0005625203.lonygy' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/1025406798' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: Activating manager daemon np0005625204.exgrzx
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: osdmap e90: 6 total, 6 up, 6 in
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/1025406798' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: mgrmap e38: np0005625204.exgrzx(active, starting, since 0.0312476s), standbys: np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625201.mtnyvu", "id": "np0005625201.mtnyvu"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: Manager daemon np0005625204.exgrzx is now available
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: removing stray HostCache host record np0005625201.localdomain.devices.0
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"}]': finished
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005625201.localdomain.devices.0"}]': finished
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/mirror_snapshot_schedule"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625204.exgrzx/trash_purge_schedule"} : dispatch
Feb 20 09:46:58 np0005625204.localdomain sudo[302851]: pam_unix(sudo:session): session closed for user root
Feb 20 09:46:58 np0005625204.localdomain sudo[302869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:46:58 np0005625204.localdomain sudo[302869]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Bus STARTING
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Bus STARTING
Feb 20 09:46:59 np0005625204.localdomain systemd[1]: tmp-crun.TUf2OM.mount: Deactivated successfully.
Feb 20 09:46:59 np0005625204.localdomain podman[302959]: 2026-02-20 09:46:59.809805069 +0000 UTC m=+0.109136035 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, architecture=x86_64, vcs-type=git, RELEASE=main, GIT_BRANCH=main, release=1770267347, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, com.redhat.component=rhceph-container, build-date=2026-02-09T10:25:24Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.42.2, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:46:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1019817561 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Serving on http://172.18.0.108:8765
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Serving on http://172.18.0.108:8765
Feb 20 09:46:59 np0005625204.localdomain podman[302959]: 2026-02-20 09:46:59.960064454 +0000 UTC m=+0.259395420 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.created=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, io.buildah.version=1.42.2, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Serving on https://172.18.0.108:7150
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Serving on https://172.18.0.108:7150
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Bus STARTED
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Bus STARTED
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cherrypy.error] [20/Feb/2026:09:46:59] ENGINE Client ('172.18.0.108', 51320) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:46:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : [20/Feb/2026:09:46:59] ENGINE Client ('172.18.0.108', 51320) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mgrmap e39: np0005625204.exgrzx(active, since 1.04393s), standbys: np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:00 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:47:00 np0005625204.localdomain ceph-mgr[287186]: [devicehealth INFO root] Check health
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:47:00 np0005625204.localdomain podman[303084]: 2026-02-20 09:47:00.597260743 +0000 UTC m=+0.100656526 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:47:00 np0005625204.localdomain podman[303084]: 2026-02-20 09:47:00.609816467 +0000 UTC m=+0.113212320 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:47:00 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:00 np0005625204.localdomain sudo[302869]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:00 np0005625204.localdomain sudo[303130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:00 np0005625204.localdomain sudo[303130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:00 np0005625204.localdomain sudo[303130]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:00 np0005625204.localdomain sudo[303148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:47:00 np0005625204.localdomain sudo[303148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Bus STARTING
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Serving on http://172.18.0.108:8765
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Serving on https://172.18.0.108:7150
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Bus STARTED
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:46:59] ENGINE Client ('172.18.0.108', 51320) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:01 np0005625204.localdomain sudo[303148]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:01.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:01.588 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:01 np0005625204.localdomain sudo[303198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:01 np0005625204.localdomain sudo[303198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:01 np0005625204.localdomain sudo[303198]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:01 np0005625204.localdomain sudo[303216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:47:01 np0005625204.localdomain sudo[303216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO root] Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO root] Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain sudo[303216]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO root] Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain sudo[303253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:47:02 np0005625204.localdomain sudo[303253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303253]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:02 np0005625204.localdomain sudo[303271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:47:02 np0005625204.localdomain sudo[303271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303271]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain sudo[303289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:47:02 np0005625204.localdomain sudo[303289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303289]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain sudo[303307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:02 np0005625204.localdomain sudo[303307]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303307]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain sudo[303325]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:47:02 np0005625204.localdomain sudo[303325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303325]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: mgrmap e40: np0005625204.exgrzx(active, since 3s), standbys: np0005625201.mtnyvu, np0005625202.arwxwo
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3721573066' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3721573066' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mon[301857]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:02 np0005625204.localdomain sudo[303359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:47:02 np0005625204.localdomain sudo[303359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303359]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain sudo[303377]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:47:02 np0005625204.localdomain sudo[303377]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303377]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain sudo[303395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain sudo[303395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303395]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:02 np0005625204.localdomain sudo[303413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:47:02 np0005625204.localdomain sudo[303413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303413]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:02 np0005625204.localdomain sudo[303431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:47:02 np0005625204.localdomain sudo[303431]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:02 np0005625204.localdomain sudo[303431]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:47:03 np0005625204.localdomain sudo[303449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303449]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:03 np0005625204.localdomain sudo[303467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303467]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:47:03 np0005625204.localdomain ceph-mgr[287186]: mgr.server handle_open ignoring open from mgr.np0005625203.lonygy 172.18.0.107:0/1974106064; not ready for session (expect reconnect)
Feb 20 09:47:03 np0005625204.localdomain sudo[303485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303485]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:47:03 np0005625204.localdomain sudo[303519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303519]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:03 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:03 np0005625204.localdomain sudo[303537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:47:03 np0005625204.localdomain sudo[303537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303537]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:03 np0005625204.localdomain sudo[303555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303555]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:03 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:03 np0005625204.localdomain sudo[303573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:47:03 np0005625204.localdomain sudo[303573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303573]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:47:03 np0005625204.localdomain sudo[303591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303591]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:47:03 np0005625204.localdomain sudo[303609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303609]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:03 np0005625204.localdomain sudo[303627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303627]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:03 np0005625204.localdomain sudo[303645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:47:03 np0005625204.localdomain sudo[303645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:03 np0005625204.localdomain sudo[303645]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain sudo[303679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:47:04 np0005625204.localdomain sudo[303679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303679]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain sudo[303697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:47:04 np0005625204.localdomain sudo[303697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303697]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: mgr.server handle_open ignoring open from mgr.np0005625203.lonygy 172.18.0.107:0/1974106064; not ready for session (expect reconnect)
Feb 20 09:47:04 np0005625204.localdomain sudo[303715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain sudo[303715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303715]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain sudo[303733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:47:04 np0005625204.localdomain sudo[303733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303733]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Feb 20 09:47:04 np0005625204.localdomain sudo[303751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:47:04 np0005625204.localdomain sudo[303751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303751]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain sudo[303769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:47:04 np0005625204.localdomain sudo[303769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303769]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain sudo[303787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:04 np0005625204.localdomain sudo[303787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sshd[303804]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:47:04 np0005625204.localdomain sudo[303787]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain sudo[303807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:47:04 np0005625204.localdomain sudo[303807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303807]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Standby manager daemon np0005625203.lonygy started
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 0 B/s wr, 23 op/s
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} v 0)
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:47:04 np0005625204.localdomain sudo[303841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:47:04 np0005625204.localdomain sudo[303841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain sudo[303841]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain sshd[303804]: Received disconnect from 18.221.252.160 port 33586:11: Bye Bye [preauth]
Feb 20 09:47:04 np0005625204.localdomain sshd[303804]: Disconnected from authenticating user root 18.221.252.160 port 33586 [preauth]
Feb 20 09:47:04 np0005625204.localdomain sudo[303859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:47:04 np0005625204.localdomain sudo[303859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303859]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020051072 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:04 np0005625204.localdomain sudo[303877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:04 np0005625204.localdomain sudo[303877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:04 np0005625204.localdomain sudo[303877]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 0 B/s wr, 18 op/s
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] update: starting ev 337b38bf-9a31-4bb9-9a16-f63a4dbf816e (Updating node-proxy deployment (+3 -> 3))
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] complete: finished ev 337b38bf-9a31-4bb9-9a16-f63a4dbf816e (Updating node-proxy deployment (+3 -> 3))
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Completed event 337b38bf-9a31-4bb9-9a16-f63a4dbf816e (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mgrmap e41: np0005625204.exgrzx(active, since 6s), standbys: np0005625201.mtnyvu, np0005625202.arwxwo, np0005625203.lonygy
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:47:05 np0005625204.localdomain sudo[303895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:47:05 np0005625204.localdomain sudo[303895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:05 np0005625204.localdomain sudo[303895]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:05 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:47:05 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:47:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:47:06.010 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:47:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:47:06.010 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:47:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:47:06.011 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:47:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:06.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:06.588 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:47:06 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:47:06 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:47:06 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 0 B/s wr, 18 op/s
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625202.akhmop (monmap changed)...
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625202.akhmop on np0005625202.localdomain
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625202.akhmop", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625202.arwxwo", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:07 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 14 op/s
Feb 20 09:47:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:47:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:47:07 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:47:07 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:47:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 20 09:47:07 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:47:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:07 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:07 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:47:07 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625202.arwxwo (monmap changed)...
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625202.arwxwo on np0005625202.localdomain
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625203", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:47:08 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Writing back 50 completed events
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:08 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Feb 20 09:47:08 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:08 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:47:08 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:47:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:47:09 np0005625204.localdomain podman[303913]: 2026-02-20 09:47:09.136226523 +0000 UTC m=+0.066966134 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:47:09 np0005625204.localdomain podman[303913]: 2026-02-20 09:47:09.177146137 +0000 UTC m=+0.107885768 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:47:09 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 14 op/s
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625203 (monmap changed)...
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625203 on np0005625203.localdomain
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.1 (monmap changed)...
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.1 on np0005625203.localdomain
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:09 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:09 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Feb 20 09:47:09 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:09 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:47:09 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:47:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054674 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:10 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:47:10 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:10 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:10 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:47:10 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:11 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:47:11 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:47:11 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:47:11 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.4 (monmap changed)...
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.4 on np0005625203.localdomain
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625203.zsrwgk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625203.lonygy", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:11.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:12 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:47:12 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:12 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:47:12 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:47:12 np0005625204.localdomain sudo[303937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:12 np0005625204.localdomain sudo[303937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:12 np0005625204.localdomain sudo[303937]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:12 np0005625204.localdomain sudo[303955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:12 np0005625204.localdomain sudo[303955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625203.zsrwgk (monmap changed)...
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625203.zsrwgk on np0005625203.localdomain
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625203.lonygy (monmap changed)...
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625203.lonygy on np0005625203.localdomain
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: Reconfiguring crash.np0005625204 (monmap changed)...
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon crash.np0005625204 on np0005625204.localdomain
Feb 20 09:47:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005625204", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 20 09:47:12 np0005625204.localdomain podman[303990]: 
Feb 20 09:47:12 np0005625204.localdomain podman[303990]: 2026-02-20 09:47:12.909110222 +0000 UTC m=+0.075186225 container create bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, io.openshift.expose-services=, release=1770267347, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, build-date=2026-02-09T10:25:24Z, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 20 09:47:12 np0005625204.localdomain systemd[1]: Started libpod-conmon-bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1.scope.
Feb 20 09:47:12 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:47:12 np0005625204.localdomain podman[303990]: 2026-02-20 09:47:12.881213487 +0000 UTC m=+0.047289500 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:47:12 np0005625204.localdomain podman[303990]: 2026-02-20 09:47:12.994244361 +0000 UTC m=+0.160320364 container init bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, ceph=True, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: tmp-crun.LazvEH.mount: Deactivated successfully.
Feb 20 09:47:13 np0005625204.localdomain podman[303990]: 2026-02-20 09:47:13.013727098 +0000 UTC m=+0.179803101 container start bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, build-date=2026-02-09T10:25:24Z, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, release=1770267347, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main)
Feb 20 09:47:13 np0005625204.localdomain podman[303990]: 2026-02-20 09:47:13.013932715 +0000 UTC m=+0.180008728 container attach bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main)
Feb 20 09:47:13 np0005625204.localdomain vibrant_jepsen[304005]: 167 167
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: libpod-bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1.scope: Deactivated successfully.
Feb 20 09:47:13 np0005625204.localdomain podman[303990]: 2026-02-20 09:47:13.018536966 +0000 UTC m=+0.184613039 container died bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.42.2, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, GIT_CLEAN=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1770267347, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, version=7, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7)
Feb 20 09:47:13 np0005625204.localdomain podman[304010]: 2026-02-20 09:47:13.129076643 +0000 UTC m=+0.096483057 container remove bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_jepsen, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, name=rhceph, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, version=7, ceph=True, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: libpod-conmon-bec19272a726b9ad04fdbb2ca11a442bf5dbeb6d95a2b476748c3f89f66c22c1.scope: Deactivated successfully.
Feb 20 09:47:13 np0005625204.localdomain ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44559 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:47:13 np0005625204.localdomain sudo[303955]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:13 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:13 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:13 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 20 09:47:13 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 20 09:47:13 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 20 09:47:13 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:47:13 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:13 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:13 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:47:13 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:47:13 np0005625204.localdomain sudo[304028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:13 np0005625204.localdomain sudo[304028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:13 np0005625204.localdomain sudo[304028]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:13 np0005625204.localdomain sudo[304046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:13 np0005625204.localdomain sudo[304046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:13 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 2026-02-20 09:47:13.852944229 +0000 UTC m=+0.077169277 container create a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, ceph=True, vcs-type=git, GIT_CLEAN=True, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z)
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: Started libpod-conmon-a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24.scope.
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-b8a74d72450a7e3cc2d43c0da7d92e4e7f1a6b3a4ea509e7ba96765527b8cd3f-merged.mount: Deactivated successfully.
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 2026-02-20 09:47:13.823578238 +0000 UTC m=+0.047803276 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 2026-02-20 09:47:13.924671576 +0000 UTC m=+0.148896614 container init a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, io.k8s.description=Red Hat Ceph Storage 7, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 2026-02-20 09:47:13.934518039 +0000 UTC m=+0.158743077 container start a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, RELEASE=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 2026-02-20 09:47:13.934825698 +0000 UTC m=+0.159050786 container attach a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, architecture=x86_64, build-date=2026-02-09T10:25:24Z, release=1770267347, GIT_CLEAN=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, com.redhat.component=rhceph-container, name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:47:13 np0005625204.localdomain trusting_cerf[304097]: 167 167
Feb 20 09:47:13 np0005625204.localdomain systemd[1]: libpod-a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24.scope: Deactivated successfully.
Feb 20 09:47:13 np0005625204.localdomain podman[304081]: 2026-02-20 09:47:13.939053818 +0000 UTC m=+0.163278906 container died a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2026-02-09T10:25:24Z, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, version=7)
Feb 20 09:47:14 np0005625204.localdomain podman[304102]: 2026-02-20 09:47:14.045918613 +0000 UTC m=+0.093453046 container remove a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_cerf, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, description=Red Hat Ceph Storage 7, release=1770267347, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, name=rhceph, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:47:14 np0005625204.localdomain systemd[1]: libpod-conmon-a53ffa717afda366fe72ca5423765e74fae184e13e708697360df6fe36081f24.scope: Deactivated successfully.
Feb 20 09:47:14 np0005625204.localdomain sudo[304046]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: from='client.44559 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.0 (monmap changed)...
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.0 on np0005625204.localdomain
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:14 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Feb 20 09:47:14 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:14 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:47:14 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:47:14 np0005625204.localdomain sudo[304125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:14 np0005625204.localdomain sudo[304125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:14 np0005625204.localdomain sudo[304125]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:14 np0005625204.localdomain sudo[304143]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:14 np0005625204.localdomain sudo[304143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-814a2e032646545d5e27fe113f9120c8ce9e759df56810fd845e7402bcd46083-merged.mount: Deactivated successfully.
Feb 20 09:47:14 np0005625204.localdomain podman[304178]: 
Feb 20 09:47:14 np0005625204.localdomain podman[304178]: 2026-02-20 09:47:14.957381577 +0000 UTC m=+0.084334836 container create 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, version=7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, name=rhceph, io.openshift.expose-services=, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: Started libpod-conmon-2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48.scope.
Feb 20 09:47:15 np0005625204.localdomain podman[304178]: 2026-02-20 09:47:14.921854268 +0000 UTC m=+0.048807557 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:47:15 np0005625204.localdomain podman[304178]: 2026-02-20 09:47:15.038785772 +0000 UTC m=+0.165739041 container init 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-09T10:25:24Z, name=rhceph, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:47:15 np0005625204.localdomain podman[304178]: 2026-02-20 09:47:15.049744678 +0000 UTC m=+0.176697947 container start 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, release=1770267347, name=rhceph, version=7, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 20 09:47:15 np0005625204.localdomain podman[304178]: 2026-02-20 09:47:15.050079798 +0000 UTC m=+0.177033077 container attach 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, name=rhceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, build-date=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7)
Feb 20 09:47:15 np0005625204.localdomain vigilant_liskov[304194]: 167 167
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: libpod-2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48.scope: Deactivated successfully.
Feb 20 09:47:15 np0005625204.localdomain podman[304178]: 2026-02-20 09:47:15.053057489 +0000 UTC m=+0.180010818 container died 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, build-date=2026-02-09T10:25:24Z, architecture=x86_64, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.42.2, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:47:15 np0005625204.localdomain podman[304199]: 2026-02-20 09:47:15.159868952 +0000 UTC m=+0.092926438 container remove 2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_liskov, io.openshift.expose-services=, architecture=x86_64, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, vendor=Red Hat, Inc., RELEASE=main, vcs-type=git, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, com.redhat.component=rhceph-container, release=1770267347, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2026-02-09T10:25:24Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: libpod-conmon-2f21430b146b69dc93666a8c5126f32fbb32c570004cfe98cd168b95974ddc48.scope: Deactivated successfully.
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: Reconfiguring osd.3 (monmap changed)...
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon osd.3 on np0005625204.localdomain
Feb 20 09:47:15 np0005625204.localdomain sudo[304143]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:47:15 np0005625204.localdomain sudo[304223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:15 np0005625204.localdomain sudo[304223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:47:15 np0005625204.localdomain sudo[304223]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:15 np0005625204.localdomain sudo[304242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:15 np0005625204.localdomain sudo[304242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:15 np0005625204.localdomain podman[304240]: 2026-02-20 09:47:15.588679814 +0000 UTC m=+0.092889777 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:47:15 np0005625204.localdomain podman[304240]: 2026-02-20 09:47:15.630030792 +0000 UTC m=+0.134240715 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44562 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO root] Saving service mon spec with placement label:mon
Feb 20 09:47:15 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Feb 20 09:47:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:47:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e6fc8ca174237947f2374abdf5b7141a8f1dec94af8e47ec8436ae8eb86cbfc8-merged.mount: Deactivated successfully.
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 2026-02-20 09:47:16.090776113 +0000 UTC m=+0.086343797 container create 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, io.buildah.version=1.42.2, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1770267347, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, build-date=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container)
Feb 20 09:47:16 np0005625204.localdomain systemd[1]: Started libpod-conmon-1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b.scope.
Feb 20 09:47:16 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 2026-02-20 09:47:16.053280013 +0000 UTC m=+0.048847757 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 2026-02-20 09:47:16.166516814 +0000 UTC m=+0.162084498 container init 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, ceph=True, version=7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-09T10:25:24Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 2026-02-20 09:47:16.177354307 +0000 UTC m=+0.172922001 container start 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, name=rhceph, ceph=True, com.redhat.component=rhceph-container, architecture=x86_64, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.buildah.version=1.42.2, build-date=2026-02-09T10:25:24Z, version=7, RELEASE=main, release=1770267347)
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 2026-02-20 09:47:16.178362157 +0000 UTC m=+0.173929841 container attach 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, CEPH_POINT_RELEASE=, io.buildah.version=1.42.2, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1770267347, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, ceph=True, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, version=7, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git)
Feb 20 09:47:16 np0005625204.localdomain thirsty_easley[304315]: 167 167
Feb 20 09:47:16 np0005625204.localdomain systemd[1]: libpod-1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b.scope: Deactivated successfully.
Feb 20 09:47:16 np0005625204.localdomain podman[304300]: 2026-02-20 09:47:16.182524174 +0000 UTC m=+0.178091888 container died 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2026-02-09T10:25:24Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, version=7, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:47:16 np0005625204.localdomain podman[304320]: 2026-02-20 09:47:16.306537046 +0000 UTC m=+0.110445407 container remove 1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=thirsty_easley, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, architecture=x86_64, vcs-type=git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, version=7, RELEASE=main, build-date=2026-02-09T10:25:24Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2)
Feb 20 09:47:16 np0005625204.localdomain systemd[1]: libpod-conmon-1283b38e890b311c965ca481b118f1b24b20c1966c3f3e610941aa620e41381b.scope: Deactivated successfully.
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mds.mds.np0005625204.wnsphl (monmap changed)...
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mds.mds.np0005625204.wnsphl on np0005625204.localdomain
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005625204.wnsphl", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:16 np0005625204.localdomain sudo[304242]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:16 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:47:16 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:16 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:16 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:47:16 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:47:16 np0005625204.localdomain sudo[304336]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:16 np0005625204.localdomain sudo[304336]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:16 np0005625204.localdomain sudo[304336]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:16.591 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:16 np0005625204.localdomain sudo[304354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:16 np0005625204.localdomain sudo[304354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4897af6e17dff831bcaf8282f1a1e9a9fe9122d30f753292ae307e1bf5b1a744-merged.mount: Deactivated successfully.
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 2026-02-20 09:47:17.098248709 +0000 UTC m=+0.077332590 container create b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, vcs-type=git, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2026-02-09T10:25:24Z, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.42.2, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14)
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: Started libpod-conmon-b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858.scope.
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 2026-02-20 09:47:17.166235983 +0000 UTC m=+0.145319854 container init b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, architecture=x86_64, build-date=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, release=1770267347, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, name=rhceph, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, vcs-type=git, GIT_BRANCH=main, version=7, distribution-scope=public)
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 2026-02-20 09:47:17.067171847 +0000 UTC m=+0.046255788 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 2026-02-20 09:47:17.17691522 +0000 UTC m=+0.155999101 container start b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, GIT_BRANCH=main, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.42.2, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 2026-02-20 09:47:17.177342113 +0000 UTC m=+0.156426014 container attach b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, release=1770267347, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.42.2, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, name=rhceph, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, ceph=True, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:47:17 np0005625204.localdomain recursing_fermi[304402]: 167 167
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: libpod-b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858.scope: Deactivated successfully.
Feb 20 09:47:17 np0005625204.localdomain podman[304387]: 2026-02-20 09:47:17.181230413 +0000 UTC m=+0.160314344 container died b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.42.2, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, com.redhat.component=rhceph-container, release=1770267347, org.opencontainers.image.created=2026-02-09T10:25:24Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, ceph=True, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:47:17 np0005625204.localdomain ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44571 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625204", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:47:17 np0005625204.localdomain podman[304407]: 2026-02-20 09:47:17.281844907 +0000 UTC m=+0.092242379 container remove b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=recursing_fermi, GIT_BRANCH=main, architecture=x86_64, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-09T10:25:24Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.42.2, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: libpod-conmon-b0fb8254068c382c4c5385f348174540ba3473806fd7ea04d4638a4a75f4f858.scope: Deactivated successfully.
Feb 20 09:47:17 np0005625204.localdomain podman[304413]: 2026-02-20 09:47:17.355075871 +0000 UTC m=+0.141844209 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, release=1770267347, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9)
Feb 20 09:47:17 np0005625204.localdomain sudo[304354]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:17 np0005625204.localdomain podman[304413]: 2026-02-20 09:47:17.380143229 +0000 UTC m=+0.166911537 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal)
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:17 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005625204 (monmap changed)...
Feb 20 09:47:17 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005625204 (monmap changed)...
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='client.44562 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: Saving service mon spec with placement label:mon
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mgr.np0005625204.exgrzx (monmap changed)...
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "mgr services"} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005625204.exgrzx", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mgr.np0005625204.exgrzx on np0005625204.localdomain
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:17 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:17 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:47:17 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:47:17 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:17 np0005625204.localdomain sudo[304444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:47:17 np0005625204.localdomain sudo[304444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:17 np0005625204.localdomain sudo[304444]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:17 np0005625204.localdomain sudo[304462]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:47:17 np0005625204.localdomain sudo[304462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:17 np0005625204.localdomain sshd[304479]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:47:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:47:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:47:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:47:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:47:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:47:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Feb 20 09:47:17 np0005625204.localdomain sshd[304479]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: tmp-crun.XiG12H.mount: Deactivated successfully.
Feb 20 09:47:17 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-98f0487996dcd35df07979c1af8c46dbe20aa15d7ab05f3530d709d4e5be6681-merged.mount: Deactivated successfully.
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 2026-02-20 09:47:18.111408 +0000 UTC m=+0.087088029 container create 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, vendor=Red Hat, Inc., name=rhceph, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.42.2, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, build-date=2026-02-09T10:25:24Z, version=7, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: Started libpod-conmon-576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853.scope.
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: tmp-crun.OP2FU0.mount: Deactivated successfully.
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 2026-02-20 09:47:18.076288894 +0000 UTC m=+0.051968943 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.208 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 2026-02-20 09:47:18.208087473 +0000 UTC m=+0.183767512 container init 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1770267347, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, architecture=x86_64, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, name=rhceph, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, io.buildah.version=1.42.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.225 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.227 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb19a8e6-41fa-42df-a81d-688c9de2f66a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.209799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2509ea4a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': 'f12d86b6019598b37a075595676284b6dad07ec0ac125b6a343c1ff54b2d9b22'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.209799', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '250a0566-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': '594f90f7da7b9e3d4074ddabb061a0dcf64ca427cbb4d178bc184549eb084099'}]}, 'timestamp': '2026-02-20 09:47:18.227541', '_unique_id': '6018991eb6e94cfcac6cb53640f8a739'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain sshd[304518]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.229 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.231 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 2026-02-20 09:47:18.232144211 +0000 UTC m=+0.207824250 container start 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1770267347, com.redhat.component=rhceph-container, vcs-type=git)
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 2026-02-20 09:47:18.233715879 +0000 UTC m=+0.209395988 container attach 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, io.buildah.version=1.42.2, description=Red Hat Ceph Storage 7, build-date=2026-02-09T10:25:24Z, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, release=1770267347, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:47:18 np0005625204.localdomain focused_maxwell[304514]: 167 167
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: libpod-576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853.scope: Deactivated successfully.
Feb 20 09:47:18 np0005625204.localdomain podman[304499]: 2026-02-20 09:47:18.240104175 +0000 UTC m=+0.215784224 container died 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, release=1770267347, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.buildah.version=1.42.2, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2026-02-09T10:25:24Z, build-date=2026-02-09T10:25:24Z, version=7, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.265 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.266 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e01c4fa4-ee64-4dec-a454-5f2dbf97afcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.232136', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '250fe922-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': 'a11bcde4b7038b95c42ae5ba7af2c527ab228c289ac8d784a8756f75df1a010a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.232136', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251002f4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '2a619bb0b430dd18636127eaf094560a289d4d216bb732f8df1d43a6e6f7e3b9'}]}, 'timestamp': '2026-02-20 09:47:18.266888', '_unique_id': 'ab4cad6b0d86413d9257afa0ee405233'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.268 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.271 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2df98f6-592c-43b5-826b-6da02db31eab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.270954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2510bde8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '37256375301243feb15a0c3c9a57b0f65dbc3e3945b0dfc44fceab65c00a833c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.270954', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2510d8aa-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '5e6ba177d3c227669dfd67a6434540bc256a34c0ca6c7a80125aeb03f5196878'}]}, 'timestamp': '2026-02-20 09:47:18.272347', '_unique_id': '53de4422663244af85f71edbb8b21331'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.275 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9924a88-0846-4aaf-8606-a43f88ea8f19', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.276151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '2511885e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': '3efce7060e465043917abfa41f9fbaf4450f7f3b1379a1f8c62e84aaf480c97f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.276151', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '2511a258-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': 'd546d0f33524975e1230d99323942ee9956ea2f5d64f8e49e9527dfffff0d8f6'}]}, 'timestamp': '2026-02-20 09:47:18.277531', '_unique_id': '4e5f64e5a1664ec6a8afdaa521caf75d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.278 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.303 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 15210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78bc9af0-1641-4913-ad54-9cc58902d00b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15210000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:47:18.280849', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '2515a286-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.542152542, 'message_signature': 'e3d962fc6af8f3a8e98efaee06301634f48f337572490904b73e1e38ff20bc32'}]}, 'timestamp': '2026-02-20 09:47:18.303785', '_unique_id': 'b5e2564ba7a64479ae3d80caa22badb2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.304 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.306 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '54bdbd29-2c89-4d38-b9c7-7357152ffc7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.306861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '25163660-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '35c6e7bb730ce1bbbf486c1ca913182ddea7ea9b9329739bd9e4a315a16f26da'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.306861', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251663ba-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '141d77617591c736795afb06987ec46a7c6c5faf1278472b0cdf7837f4638437'}]}, 'timestamp': '2026-02-20 09:47:18.308689', '_unique_id': '7e2558dadffb4a0ba7c92be57916dda4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.310 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '154641d9-740b-4b37-a2b5-178532418b0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.311982', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '2517babc-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '6dd2581d3aabbd905799ce12bb0293d89a50fdd422c4762382f32b0ff256c372'}]}, 'timestamp': '2026-02-20 09:47:18.317526', '_unique_id': 'c632028817764670afc13e007ab98a7f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.320 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cdb3ba40-c2bc-4849-b67f-de2a8f26d794', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.320524', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '25184fea-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'b72ed21b9aa5a1697a35bc0aafdb9cf97b14cb17cf2fa78c3af1516ec42e54d8'}]}, 'timestamp': '2026-02-20 09:47:18.321329', '_unique_id': '67d850235aa04cc9aa81d3b1a4efc174'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '118f1633-b444-4750-b1ab-95e7a6fbc910', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.325214', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251905f2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '9c0b4197ddf5f239c10dca0dbf3bf651291ed558bd53aac758b50d10a72442b0'}]}, 'timestamp': '2026-02-20 09:47:18.325994', '_unique_id': '1fb9ca672f264b5fa2c78ea804746565'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ce00f25-61af-4009-84fa-cb900ded8e53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.329141', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '25199f76-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'b6848144d0c7a1f88d513973b289c6a8c8c576084ff0c95ae83ea3f62a75ce9a'}]}, 'timestamp': '2026-02-20 09:47:18.329929', '_unique_id': '04a96b61e93a47cd951706fd8a55e2d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.333 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3742ff5f-278b-42ec-a4cf-3a8263c3e4d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.333066', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251a388c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '283a7bf1cbe253ccaf858d62f31ef76e9d7153228a2750d7b6be1a2ac5824923'}]}, 'timestamp': '2026-02-20 09:47:18.333847', '_unique_id': '73ed27d201114bbc887c4f339c394db0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.335 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.336 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.337 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2f990f3-a23c-41d8-8e63-8464ac4beec5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.337000', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251ad21a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'ca8d1c4baf7fed3560b5d8984e23406756e5a11cba655f3242b718cb5702c0e4'}]}, 'timestamp': '2026-02-20 09:47:18.337791', '_unique_id': 'adffd59b3afc4d319cc2c02fcd3c9b63'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.340 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9529bba3-b1ed-4be6-8132-4bf144d83382', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.341163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251b75a8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': '830696075ce24cdc19feea4d089d25ba095da4894ac58c83e5bc95814096beb1'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.341163', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251b9006-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.449028538, 'message_signature': 'd933fbbeaab92dfe4b60b23e27682446114fe90f7ef79bafb20e6a766904c6b8'}]}, 'timestamp': '2026-02-20 09:47:18.342483', '_unique_id': '50a6752ca3414ea5a841e26b795a7c0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.344 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.344 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.344 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd48ee688-c1c2-43c2-890e-b1a4eb914646', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.344615', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251bf67c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'ea75c244dd79e7b82a1378ec21a609449c14eb2d7a725ca1eb07c10e1b528fb5'}]}, 'timestamp': '2026-02-20 09:47:18.345071', '_unique_id': '4b0144e18d8d46f4a1101d60b1f98fdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.346 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.348 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.348 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64c2a4d2-0116-4117-b2ae-4df8833e90ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.348241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251c845c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '761747582922777899c5214d22268d85ae34496d2a2faad8d9fd824f1dbdbce8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.348241', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251c98ac-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '6135ffe1cd9ffbf5fec225da4a86b90d1cedef5030f5a33851fd7e240e99e973'}]}, 'timestamp': '2026-02-20 09:47:18.349253', '_unique_id': 'c25b74f4f4974d85957e70e3204fe954'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.350 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.351 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fac47ff-53ca-43fb-a9b5-3203d849b81f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.351133', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251cf126-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'a4f10e7e3deea243c3575d25b0dbe038bdc3e8969d725c1ac899dbe3bda07af7'}]}, 'timestamp': '2026-02-20 09:47:18.351589', '_unique_id': '6f3c03a8099046568282010dbfa89ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.352 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.353 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.353 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bfadfb57-222b-4438-9c8a-2e70058386b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.353208', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251d450e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': 'eb888c0db1b8554c4c529d1931769545ec1fb261f982026f52bebd76363b47e9'}]}, 'timestamp': '2026-02-20 09:47:18.353625', '_unique_id': 'a6804f177f6147ef934ca868c2e13e61'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.355 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.355 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.356 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e664bbb-ace9-4b62-8bb5-b109322d1d58', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.355722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251da95e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '4d7b94ae08f5d4afcac1005d256a4e23577ebb580b7a665ef2a0bd66ab14130a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.355722', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251db9da-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': 'b3692d8b0e3a8ccb44bf9f1d6510f2af0170913c0932059322cd2342b236f304'}]}, 'timestamp': '2026-02-20 09:47:18.356653', '_unique_id': 'dc5b899f1e8f4c35a954f21d3bcd3b96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.357 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.358 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6452b0e5-a183-4197-bd7b-85c6daac0199', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:47:18.358732', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '251e1b96-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.551229901, 'message_signature': '0988c2bedf8245a84e795d14f593f3e7a9ea49d63a4abe807b02620b6cf62457'}]}, 'timestamp': '2026-02-20 09:47:18.359087', '_unique_id': 'cb01b56aefbe48159a4c8139975ee9f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.359 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.360 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4973c59-993d-4d15-ad63-fb05d2350231', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:47:18.361127', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '251e7a96-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '0af73b3b416ac3a230ab1301028490f89f1dc8d38b37bebaf93fcf5a94959a34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:47:18.361127', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '251e8c8e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.471408524, 'message_signature': '957b92132113e09d2a9e679ccab2a9e95bc87bdbb8dbf020103efcb77d19a8ad'}]}, 'timestamp': '2026-02-20 09:47:18.361966', '_unique_id': 'd9f91f9576de4ec4869bf3d77a352ed1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.362 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.363 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.364 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.364 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a208359-c6ec-4db1-a702-eef7c95d238c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:47:18.364251', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '251ef2d2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11397.542152542, 'message_signature': 'c56d710976f336ce35f189ae75f30c88abc635f935e87b6c7293c5ac3e4353d4'}]}, 'timestamp': '2026-02-20 09:47:18.364583', '_unique_id': 'd02723f99e1c4aad82704c5842f1ad65'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:47:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:47:18.365 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:47:18 np0005625204.localdomain podman[304521]: 2026-02-20 09:47:18.378683181 +0000 UTC m=+0.127154067 container remove 576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=focused_maxwell, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_BRANCH=main, build-date=2026-02-09T10:25:24Z, release=1770267347, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.created=2026-02-09T10:25:24Z)
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: libpod-conmon-576586ade1b87f320ab15a8afe693addc34fe0d1cd41e7f8b6d928a0a3e2f853.scope: Deactivated successfully.
Feb 20 09:47:18 np0005625204.localdomain sudo[304462]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: from='client.44571 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005625204", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mon.np0005625204 (monmap changed)...
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mon.np0005625204 on np0005625204.localdomain
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain.devices.0}] v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625204.localdomain}] v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] update: starting ev 3648edd9-f15d-4e63-81de-22936437324c (Updating node-proxy deployment (+3 -> 3))
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] complete: finished ev 3648edd9-f15d-4e63-81de-22936437324c (Updating node-proxy deployment (+3 -> 3))
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Completed event 3648edd9-f15d-4e63-81de-22936437324c (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain sudo[304537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:47:18 np0005625204.localdomain sudo[304537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:47:18 np0005625204.localdomain sudo[304537]: pam_unix(sudo:session): session closed for user root
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:18 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:47:18 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:47:18 np0005625204.localdomain sshd[304518]: Invalid user oracle from 154.91.170.41 port 34606
Feb 20 09:47:18 np0005625204.localdomain sshd[304518]: Received disconnect from 154.91.170.41 port 34606:11: Bye Bye [preauth]
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: tmp-crun.mMBt74.mount: Deactivated successfully.
Feb 20 09:47:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8f7489e31821167baff6543526491bffd4e0075d2db1fdc8c7f8dc11ad648788-merged.mount: Deactivated successfully.
Feb 20 09:47:18 np0005625204.localdomain sshd[304518]: Disconnected from invalid user oracle 154.91.170.41 port 34606 [preauth]
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:19 np0005625204.localdomain sshd[299722]: Received disconnect from 192.168.122.11 port 59108:11: disconnected by user
Feb 20 09:47:19 np0005625204.localdomain sshd[299722]: Disconnected from user tripleo-admin 192.168.122.11 port 59108
Feb 20 09:47:19 np0005625204.localdomain sshd[299680]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 20 09:47:19 np0005625204.localdomain systemd[1]: session-70.scope: Deactivated successfully.
Feb 20 09:47:19 np0005625204.localdomain systemd[1]: session-70.scope: Consumed 1.764s CPU time.
Feb 20 09:47:19 np0005625204.localdomain systemd-logind[759]: Session 70 logged out. Waiting for processes to exit.
Feb 20 09:47:19 np0005625204.localdomain systemd-logind[759]: Removed session 70.
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain.devices.0}] v 0)
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625202.localdomain}] v 0)
Feb 20 09:47:19 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005625203 (monmap changed)...
Feb 20 09:47:19 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005625203 (monmap changed)...
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:19 np0005625204.localdomain ceph-mgr[287186]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain
Feb 20 09:47:19 np0005625204.localdomain ceph-mgr[287186]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain
Feb 20 09:47:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mon.np0005625202 (monmap changed)...
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mon.np0005625202 on np0005625202.localdomain
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain.devices.0}] v 0)
Feb 20 09:47:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005625203.localdomain}] v 0)
Feb 20 09:47:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:47:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:47:21 np0005625204.localdomain podman[304556]: 2026-02-20 09:47:21.153815024 +0000 UTC m=+0.085020646 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:47:21 np0005625204.localdomain podman[304555]: 2026-02-20 09:47:21.229520824 +0000 UTC m=+0.160533971 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:47:21 np0005625204.localdomain podman[304556]: 2026-02-20 09:47:21.23947904 +0000 UTC m=+0.170684642 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:47:21 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:47:21 np0005625204.localdomain podman[304555]: 2026-02-20 09:47:21.306183383 +0000 UTC m=+0.237196550 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:47:21 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:47:21 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:21 np0005625204.localdomain ceph-mon[301857]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:21 np0005625204.localdomain ceph-mon[301857]: Reconfiguring mon.np0005625203 (monmap changed)...
Feb 20 09:47:21 np0005625204.localdomain ceph-mon[301857]: Reconfiguring daemon mon.np0005625203 on np0005625203.localdomain
Feb 20 09:47:21 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:21 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:21.593 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:47:23 np0005625204.localdomain ceph-mon[301857]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:23 np0005625204.localdomain ceph-mon[301857]: mgrmap e42: np0005625204.exgrzx(active, since 24s), standbys: np0005625202.arwxwo, np0005625203.lonygy
Feb 20 09:47:23 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Writing back 50 completed events
Feb 20 09:47:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 20 09:47:23 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:24 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:47:24 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:25 np0005625204.localdomain ceph-mon[301857]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:25 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:47:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:47:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:47:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:47:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:47:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:47:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:26.596 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:27 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:27 np0005625204.localdomain ceph-mon[301857]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:28 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections..
Feb 20 09:47:28 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: []
Feb 20 09:47:28 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections..
Feb 20 09:47:28 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: []
Feb 20 09:47:28 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections..
Feb 20 09:47:28 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: []
Feb 20 09:47:28 np0005625204.localdomain ceph-mon[301857]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:29 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Activating special unit Exit the Session...
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped target Main User Target.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped target Basic System.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped target Paths.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped target Sockets.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped target Timers.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Closed D-Bus User Message Bus Socket.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Stopped Create User's Volatile Files and Directories.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Removed slice User Application Slice.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Reached target Shutdown.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Finished Exit the Session.
Feb 20 09:47:29 np0005625204.localdomain systemd[299695]: Reached target Exit the Session.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 20 09:47:29 np0005625204.localdomain systemd[1]: user-1003.slice: Consumed 2.386s CPU time.
Feb 20 09:47:29 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:30 np0005625204.localdomain ceph-mon[301857]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:47:30 np0005625204.localdomain podman[304599]: 2026-02-20 09:47:30.887937052 +0000 UTC m=+0.096866740 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:47:30 np0005625204.localdomain podman[304599]: 2026-02-20 09:47:30.902443027 +0000 UTC m=+0.111372745 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:47:30 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:47:31 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:31.600 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:33 np0005625204.localdomain ceph-mon[301857]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:33 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:34 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:35 np0005625204.localdomain ceph-mon[301857]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:35 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:36.602 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:47:37 np0005625204.localdomain ceph-mon[301857]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:37 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:39 np0005625204.localdomain ceph-mon[301857]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:39 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:47:40 np0005625204.localdomain podman[304618]: 2026-02-20 09:47:40.149399955 +0000 UTC m=+0.086249895 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:47:40 np0005625204.localdomain podman[304618]: 2026-02-20 09:47:40.163082104 +0000 UTC m=+0.099932034 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:47:40 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:47:41 np0005625204.localdomain ceph-mon[301857]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:41 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:41.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:43 np0005625204.localdomain ceph-mon[301857]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:43 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:44 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:45 np0005625204.localdomain ceph-mon[301857]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:45 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:46 np0005625204.localdomain sshd[304642]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:47:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:47:46 np0005625204.localdomain podman[304643]: 2026-02-20 09:47:46.140691773 +0000 UTC m=+0.073269366 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:47:46 np0005625204.localdomain podman[304643]: 2026-02-20 09:47:46.17419064 +0000 UTC m=+0.106768193 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:47:46 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:47:46 np0005625204.localdomain ceph-mon[301857]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:46.607 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:47:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:46.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:47:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:46.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:47:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:46.610 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:47:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:46.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:46.611 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:47:46 np0005625204.localdomain sshd[304642]: Invalid user sol from 45.148.10.240 port 51228
Feb 20 09:47:47 np0005625204.localdomain sshd[304642]: Connection closed by invalid user sol 45.148.10.240 port 51228 [preauth]
Feb 20 09:47:47 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:47:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:47:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:47:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:47:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:47:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18288 "" "Go-http-client/1.1"
Feb 20 09:47:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:47:48 np0005625204.localdomain systemd[1]: tmp-crun.2J8KES.mount: Deactivated successfully.
Feb 20 09:47:48 np0005625204.localdomain podman[304667]: 2026-02-20 09:47:48.155250025 +0000 UTC m=+0.091650340 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:47:48 np0005625204.localdomain podman[304667]: 2026-02-20 09:47:48.172117472 +0000 UTC m=+0.108517787 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Feb 20 09:47:48 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:47:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:48.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:48.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:47:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:48.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:47:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:48.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:47:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:48.742 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:47:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:48.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:47:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:47:49 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3781215475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.201 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.276 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.277 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:47:49 np0005625204.localdomain ceph-mon[301857]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4031544902' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3781215475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.511 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:47:49 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.514 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11726MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.515 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.516 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.602 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:47:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:49.702 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:47:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:47:50 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/190951353' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:50.161 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:47:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:50.168 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:47:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:50.191 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:47:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:50.194 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:47:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:50.194 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:47:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1476854404' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/190951353' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:51.195 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:51 np0005625204.localdomain ceph-mon[301857]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:51 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:51.612 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:51.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:51.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:47:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:47:52 np0005625204.localdomain podman[304732]: 2026-02-20 09:47:52.138862524 +0000 UTC m=+0.077128525 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:47:52 np0005625204.localdomain systemd[1]: tmp-crun.XJaV3b.mount: Deactivated successfully.
Feb 20 09:47:52 np0005625204.localdomain podman[304732]: 2026-02-20 09:47:52.212989016 +0000 UTC m=+0.151254997 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:47:52 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:47:52 np0005625204.localdomain podman[304733]: 2026-02-20 09:47:52.218194205 +0000 UTC m=+0.150643428 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Feb 20 09:47:52 np0005625204.localdomain podman[304733]: 2026-02-20 09:47:52.303233631 +0000 UTC m=+0.235682884 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 20 09:47:52 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:47:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1447312241' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:52.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:52.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:52.747 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:52.747 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:53 np0005625204.localdomain ceph-mon[301857]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3443497250' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:47:53 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:53.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:53.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:47:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:53.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.204 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.205 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.205 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.205 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.602 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.620 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.620 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.621 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:47:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:54.622 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:47:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:47:55 np0005625204.localdomain ceph-mon[301857]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/3650643353' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 20 09:47:55 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:47:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:47:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:47:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:47:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:47:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:47:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:56.614 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:47:56.618 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:47:57 np0005625204.localdomain ceph-mon[301857]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:57 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] Optimize plan auto_2026-02-20_09:47:58
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] do_upmap
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] pools ['.mgr', 'manila_metadata', 'volumes', 'manila_data', 'images', 'backups', 'vms']
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [balancer INFO root] prepared 0/10 changes
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] _maybe_adjust
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.453674623115578e-06 of space, bias 4.0, pg target 0.0019596681323283084 quantized to 16 (current 16)
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections..
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: []
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections..
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: []
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] scanning for idle connections..
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [volumes INFO mgr_util] cleaning up connections: []
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mgr[287186]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 20 09:47:58 np0005625204.localdomain ceph-mon[301857]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:59 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:47:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:00 np0005625204.localdomain ceph-mon[301857]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:48:01 np0005625204.localdomain sshd[304788]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:48:01 np0005625204.localdomain podman[304775]: 2026-02-20 09:48:01.147472266 +0000 UTC m=+0.086699038 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:48:01 np0005625204.localdomain podman[304775]: 2026-02-20 09:48:01.184805761 +0000 UTC m=+0.124032493 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:48:01 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:48:01 np0005625204.localdomain sshd[304788]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:48:01 np0005625204.localdomain ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44589 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:48:01 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:01 np0005625204.localdomain ceph-mon[301857]: from='client.44589 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:48:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:01.616 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:01.621 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:02 np0005625204.localdomain ceph-mon[301857]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2727431459' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:48:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2727431459' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:48:03 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:04 np0005625204.localdomain ceph-mon[301857]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:05 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:48:06.011 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:48:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:48:06.012 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:48:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:48:06.012 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:48:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:06.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:06 np0005625204.localdomain ceph-mon[301857]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:07 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/2546430745' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Feb 20 09:48:09 np0005625204.localdomain ceph-mon[301857]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:09 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:48:11 np0005625204.localdomain podman[304796]: 2026-02-20 09:48:11.139683636 +0000 UTC m=+0.082599862 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:48:11 np0005625204.localdomain podman[304796]: 2026-02-20 09:48:11.175666549 +0000 UTC m=+0.118582735 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:48:11 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:48:11 np0005625204.localdomain ceph-mon[301857]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:11 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:11.622 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:13 np0005625204.localdomain ceph-mon[301857]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:13 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:14 np0005625204.localdomain ceph-mgr[287186]: log_channel(audit) log [DBG] : from='client.44610 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:48:14 np0005625204.localdomain ceph-mon[301857]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:14 np0005625204.localdomain ceph-mon[301857]: from='client.44610 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 20 09:48:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:15 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:16 np0005625204.localdomain ceph-mon[301857]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:16.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:48:17 np0005625204.localdomain podman[304817]: 2026-02-20 09:48:17.149706239 +0000 UTC m=+0.089245585 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:48:17 np0005625204.localdomain podman[304817]: 2026-02-20 09:48:17.183851457 +0000 UTC m=+0.123390813 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:48:17 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:48:17 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:48:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:48:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:48:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:48:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:48:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18294 "" "Go-http-client/1.1"
Feb 20 09:48:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:48:19 np0005625204.localdomain podman[304841]: 2026-02-20 09:48:19.140506193 +0000 UTC m=+0.082790748 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-type=git)
Feb 20 09:48:19 np0005625204.localdomain podman[304841]: 2026-02-20 09:48:19.153239474 +0000 UTC m=+0.095524039 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7)
Feb 20 09:48:19 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:48:19 np0005625204.localdomain ceph-mon[301857]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:19 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:20 np0005625204.localdomain sudo[304861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:48:20 np0005625204.localdomain sudo[304861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:20 np0005625204.localdomain sudo[304861]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:20 np0005625204.localdomain sudo[304879]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:48:20 np0005625204.localdomain sudo[304879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/712476891' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:48:21 np0005625204.localdomain sudo[304879]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:21 np0005625204.localdomain ceph-mgr[287186]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [INF] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 20 09:48:21 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] update: starting ev f0532645-8d13-4eb4-a222-42dd4abbb158 (Updating node-proxy deployment (+3 -> 3))
Feb 20 09:48:21 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] complete: finished ev f0532645-8d13-4eb4-a222-42dd4abbb158 (Updating node-proxy deployment (+3 -> 3))
Feb 20 09:48:21 np0005625204.localdomain ceph-mgr[287186]: [progress INFO root] Completed event f0532645-8d13-4eb4-a222-42dd4abbb158 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 20 09:48:21 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:48:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:21.627 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:21 np0005625204.localdomain sudo[304929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:48:21 np0005625204.localdomain sudo[304929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:21 np0005625204.localdomain sudo[304929]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:48:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:48:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 ' entity='mgr.np0005625204.exgrzx' 
Feb 20 09:48:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.27007 172.18.0.108:0/150842184' entity='mgr.np0005625204.exgrzx' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:48:23 np0005625204.localdomain podman[304947]: 2026-02-20 09:48:23.150843051 +0000 UTC m=+0.084046077 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:48:23 np0005625204.localdomain podman[304947]: 2026-02-20 09:48:23.225354084 +0000 UTC m=+0.158557090 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:48:23 np0005625204.localdomain podman[304948]: 2026-02-20 09:48:23.228843592 +0000 UTC m=+0.156912160 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 e91: 6 total, 6 up, 6 in
Feb 20 09:48:23 np0005625204.localdomain podman[304948]: 2026-02-20 09:48:23.309360329 +0000 UTC m=+0.237428867 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: mgr handle_mgr_map I was active but no longer am
Feb 20 09:48:23 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:23.314+0000 7f0941015640 -1 mgr handle_mgr_map I was active but no longer am
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:48:23 np0005625204.localdomain sshd[302847]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: session-72.scope: Deactivated successfully.
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: session-72.scope: Consumed 11.972s CPU time.
Feb 20 09:48:23 np0005625204.localdomain systemd-logind[759]: Session 72 logged out. Waiting for processes to exit.
Feb 20 09:48:23 np0005625204.localdomain systemd-logind[759]: Removed session 72.
Feb 20 09:48:23 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: ignoring --setuser ceph since I am not root
Feb 20 09:48:23 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: ignoring --setgroup ceph since I am not root
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: ceph version 18.2.1-381.el9cp (984f410e2a30899deb131725765b62212b1621db) reef (stable), process ceph-mgr, pid 2
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: pidfile_write: ignore empty --pid-file
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/2835510203' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: Activating manager daemon np0005625202.arwxwo
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: osdmap e91: 6 total, 6 up, 6 in
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.200:0/2835510203' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: mgrmap e43: np0005625202.arwxwo(active, starting, since 0.0325637s), standbys: np0005625203.lonygy
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625202"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625203"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata", "id": "np0005625204"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata", "who": "mds.np0005625202.akhmop"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata", "who": "mds.np0005625204.wnsphl"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata", "who": "mds.np0005625203.zsrwgk"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625202.arwxwo", "id": "np0005625202.arwxwo"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625203.lonygy", "id": "np0005625203.lonygy"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mds metadata"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mon metadata"} : dispatch
Feb 20 09:48:23 np0005625204.localdomain ceph-mon[301857]: Manager daemon np0005625202.arwxwo is now available
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'alerts'
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'balancer'
Feb 20 09:48:23 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:23.550+0000 7f64ba62c140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 20 09:48:23 np0005625204.localdomain sshd[305012]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 20 09:48:23 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'cephadm'
Feb 20 09:48:23 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:23.626+0000 7f64ba62c140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 20 09:48:23 np0005625204.localdomain sshd[305012]: Accepted publickey for ceph-admin from 192.168.122.106 port 58378 ssh2: RSA SHA256:ReOVVWzTOjcz49zJan4sHrxgFDyL1hzGQqhA1t8e47Y
Feb 20 09:48:23 np0005625204.localdomain systemd-logind[759]: New session 73 of user ceph-admin.
Feb 20 09:48:23 np0005625204.localdomain systemd[1]: Started Session 73 of User ceph-admin.
Feb 20 09:48:23 np0005625204.localdomain sshd[305012]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 20 09:48:23 np0005625204.localdomain sudo[305016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:48:23 np0005625204.localdomain sudo[305016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:23 np0005625204.localdomain sudo[305016]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:23 np0005625204.localdomain sudo[305034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:48:23 np0005625204.localdomain sudo[305034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:24 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'crash'
Feb 20 09:48:24 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 20 09:48:24 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'dashboard'
Feb 20 09:48:24 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:24.366+0000 7f64ba62c140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 20 09:48:24 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/mirror_snapshot_schedule"} : dispatch
Feb 20 09:48:24 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005625202.arwxwo/trash_purge_schedule"} : dispatch
Feb 20 09:48:24 np0005625204.localdomain ceph-mon[301857]: mgrmap e44: np0005625202.arwxwo(active, since 1.04816s), standbys: np0005625203.lonygy
Feb 20 09:48:24 np0005625204.localdomain sshd[305121]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:48:24 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:24 np0005625204.localdomain systemd[1]: tmp-crun.WZZuTt.mount: Deactivated successfully.
Feb 20 09:48:24 np0005625204.localdomain podman[305130]: 2026-02-20 09:48:24.87777572 +0000 UTC m=+0.114838010 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, io.buildah.version=1.42.2, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1770267347, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, build-date=2026-02-09T10:25:24Z, name=rhceph, GIT_BRANCH=main)
Feb 20 09:48:24 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'devicehealth'
Feb 20 09:48:24 np0005625204.localdomain podman[305130]: 2026-02-20 09:48:24.975809657 +0000 UTC m=+0.212871897 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, distribution-scope=public, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1770267347, vendor=Red Hat, Inc., name=rhceph, build-date=2026-02-09T10:25:24Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2026-02-09T10:25:24Z, io.buildah.version=1.42.2, architecture=x86_64, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 20 09:48:24 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 20 09:48:24 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'diskprediction_local'
Feb 20 09:48:24 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:24.989+0000 7f64ba62c140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 20 09:48:25 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 20 09:48:25 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]:   from numpy import show_config as show_numpy_config
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'influx'
Feb 20 09:48:25 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:25.148+0000 7f64ba62c140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'insights'
Feb 20 09:48:25 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:25.215+0000 7f64ba62c140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'iostat'
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'k8sevents'
Feb 20 09:48:25 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:25.342+0000 7f64ba62c140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 20 09:48:25 np0005625204.localdomain sshd[305121]: Received disconnect from 196.189.116.182 port 34606:11: Bye Bye [preauth]
Feb 20 09:48:25 np0005625204.localdomain sshd[305121]: Disconnected from authenticating user root 196.189.116.182 port 34606 [preauth]
Feb 20 09:48:25 np0005625204.localdomain sudo[305034]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'localpool'
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'mds_autoscaler'
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'mirroring'
Feb 20 09:48:25 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'nfs'
Feb 20 09:48:25 np0005625204.localdomain ceph-mon[301857]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:26 np0005625204.localdomain sudo[305248]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:48:26 np0005625204.localdomain sudo[305248]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:26 np0005625204.localdomain sudo[305248]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:26 np0005625204.localdomain sudo[305266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:48:26 np0005625204.localdomain sudo[305266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'orchestrator'
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.167+0000 7f64ba62c140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.321+0000 7f64ba62c140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'osd_perf_query'
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.387+0000 7f64ba62c140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'osd_support'
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.444+0000 7f64ba62c140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'pg_autoscaler'
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'progress'
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.512+0000 7f64ba62c140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:48:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:48:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:48:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:48:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'prometheus'
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.572+0000 7f64ba62c140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:26.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:26 np0005625204.localdomain sudo[305266]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'rbd_support'
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.871+0000 7f64ba62c140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'restful'
Feb 20 09:48:26 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:26.956+0000 7f64ba62c140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:48:24] ENGINE Bus STARTING
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Serving on http://172.18.0.106:8765
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Serving on https://172.18.0.106:7150
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Client ('172.18.0.106', 54518) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: [20/Feb/2026:09:48:25] ENGINE Bus STARTED
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: Cluster is now healthy
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:26 np0005625204.localdomain sudo[305317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:48:26 np0005625204.localdomain sudo[305317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:26 np0005625204.localdomain sudo[305317]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:27 np0005625204.localdomain sudo[305335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'rgw'
Feb 20 09:48:27 np0005625204.localdomain sudo[305335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:27 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.288+0000 7f64ba62c140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'rook'
Feb 20 09:48:27 np0005625204.localdomain sudo[305335]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:27 np0005625204.localdomain sudo[305373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.699+0000 7f64ba62c140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'selftest'
Feb 20 09:48:27 np0005625204.localdomain sudo[305373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:27 np0005625204.localdomain sudo[305373]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.769+0000 7f64ba62c140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'snap_schedule'
Feb 20 09:48:27 np0005625204.localdomain sudo[305391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:48:27 np0005625204.localdomain sudo[305391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:27 np0005625204.localdomain sudo[305391]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'stats'
Feb 20 09:48:27 np0005625204.localdomain sudo[305409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:48:27 np0005625204.localdomain sudo[305409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:27 np0005625204.localdomain sudo[305409]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'status'
Feb 20 09:48:27 np0005625204.localdomain sudo[305427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:48:27 np0005625204.localdomain sudo[305427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:27 np0005625204.localdomain sudo[305427]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 20 09:48:27 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'telegraf'
Feb 20 09:48:27 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:27.969+0000 7f64ba62c140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain sudo[305445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:48:28 np0005625204.localdomain sudo[305445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305445]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'telemetry'
Feb 20 09:48:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.027+0000 7f64ba62c140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'test_orchestrator'
Feb 20 09:48:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.162+0000 7f64ba62c140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain sudo[305479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:48:28 np0005625204.localdomain sudo[305479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305479]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain sudo[305497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new
Feb 20 09:48:28 np0005625204.localdomain sudo[305497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305497]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'volumes'
Feb 20 09:48:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.310+0000 7f64ba62c140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain sudo[305515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 20 09:48:28 np0005625204.localdomain sudo[305515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305515]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain sudo[305533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:48:28 np0005625204.localdomain sudo[305533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305533]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Loading python module 'zabbix'
Feb 20 09:48:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.505+0000 7f64ba62c140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain sudo[305551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:48:28 np0005625204.localdomain sudo[305551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305551]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.conf
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.conf
Feb 20 09:48:28 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.conf
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-mgr-np0005625204-exgrzx[287182]: 2026-02-20T09:48:28.565+0000 7f64ba62c140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: ms_deliver_dispatch: unhandled message 0x559ff0eb7600 mon_map magic: 0 from mon.2 v2:172.18.0.105:3300/0
Feb 20 09:48:28 np0005625204.localdomain ceph-mgr[287186]: client.0 ms_handle_reset on v2:172.18.0.106:6810/1895764224
Feb 20 09:48:28 np0005625204.localdomain sudo[305569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:48:28 np0005625204.localdomain sudo[305569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305569]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain sudo[305587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:48:28 np0005625204.localdomain sudo[305587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305587]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain sudo[305605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:48:28 np0005625204.localdomain sudo[305605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305605]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:28 np0005625204.localdomain sudo[305639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:48:28 np0005625204.localdomain sudo[305639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:28 np0005625204.localdomain sudo[305639]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new
Feb 20 09:48:29 np0005625204.localdomain sudo[305657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305657]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:48:29 np0005625204.localdomain sudo[305675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305675]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 20 09:48:29 np0005625204.localdomain sudo[305693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305693]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph
Feb 20 09:48:29 np0005625204.localdomain sudo[305711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305711]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:48:29 np0005625204.localdomain sudo[305729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305729]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:48:29 np0005625204.localdomain sudo[305747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305747]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:48:29 np0005625204.localdomain sudo[305765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305765]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.conf
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: Standby manager daemon np0005625204.exgrzx started
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: mgrmap e45: np0005625202.arwxwo(active, since 5s), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "mgr metadata", "who": "np0005625204.exgrzx", "id": "np0005625204.exgrzx"} : dispatch
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:48:29 np0005625204.localdomain sudo[305799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:48:29 np0005625204.localdomain sudo[305799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305799]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new
Feb 20 09:48:29 np0005625204.localdomain sudo[305817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305817]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:29 np0005625204.localdomain sudo[305835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 20 09:48:29 np0005625204.localdomain sudo[305835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305835]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:29 np0005625204.localdomain sudo[305853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:48:29 np0005625204.localdomain sudo[305853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:29 np0005625204.localdomain sudo[305853]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config
Feb 20 09:48:30 np0005625204.localdomain sudo[305871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305871]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:48:30 np0005625204.localdomain sudo[305889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305889]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8
Feb 20 09:48:30 np0005625204.localdomain sudo[305907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305907]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:48:30 np0005625204.localdomain sudo[305925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305925]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:48:30 np0005625204.localdomain sudo[305959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305959]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new
Feb 20 09:48:30 np0005625204.localdomain sudo[305977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305977]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain sudo[305995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-a8557ee9-b55d-5519-942c-cf8f6172f1d8/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring.new /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:48:30 np0005625204.localdomain sudo[305995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[305995]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: Updating np0005625202.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: Updating np0005625203.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: Updating np0005625204.localdomain:/var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/config/ceph.client.admin.keyring
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:48:30 np0005625204.localdomain sudo[306013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:48:30 np0005625204.localdomain sudo[306013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:30 np0005625204.localdomain sudo[306013]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:31 np0005625204.localdomain sudo[306031]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:48:31 np0005625204.localdomain sudo[306031]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:48:31 np0005625204.localdomain sudo[306031]: pam_unix(sudo:session): session closed for user root
Feb 20 09:48:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:31.633 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:48:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:31.635 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:31.635 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:48:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:31.635 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:48:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:31.636 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:48:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:31.638 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.927869) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580911928831, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2719, "num_deletes": 256, "total_data_size": 8670897, "memory_usage": 8956368, "flush_reason": "Manual Compaction"}
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580911953979, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 5220534, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11866, "largest_seqno": 14580, "table_properties": {"data_size": 5209498, "index_size": 6901, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27059, "raw_average_key_size": 22, "raw_value_size": 5185912, "raw_average_value_size": 4264, "num_data_blocks": 299, "num_entries": 1216, "num_filter_entries": 1216, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580813, "oldest_key_time": 1771580813, "file_creation_time": 1771580911, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 26200 microseconds, and 11901 cpu microseconds.
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.954056) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 5220534 bytes OK
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.954100) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.956556) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.956584) EVENT_LOG_v1 {"time_micros": 1771580911956577, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.956618) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 8657884, prev total WAL file size 8657884, number of live WAL files 2.
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.961916) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(5098KB)], [15(15MB)]
Feb 20 09:48:31 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580911961979, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 21589879, "oldest_snapshot_seqno": -1}
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11901 keys, 18468504 bytes, temperature: kUnknown
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580912041558, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 18468504, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18399782, "index_size": 37901, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29765, "raw_key_size": 319044, "raw_average_key_size": 26, "raw_value_size": 18196238, "raw_average_value_size": 1528, "num_data_blocks": 1448, "num_entries": 11901, "num_filter_entries": 11901, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771580911, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.042087) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 18468504 bytes
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.043932) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 270.6 rd, 231.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.0, 15.6 +0.0 blob) out(17.6 +0.0 blob), read-write-amplify(7.7) write-amplify(3.5) OK, records in: 12441, records dropped: 540 output_compression: NoCompression
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.043966) EVENT_LOG_v1 {"time_micros": 1771580912043950, "job": 6, "event": "compaction_finished", "compaction_time_micros": 79788, "compaction_time_cpu_micros": 50187, "output_level": 6, "num_output_files": 1, "total_output_size": 18468504, "num_input_records": 12441, "num_output_records": 11901, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580912044960, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771580912048211, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:31.961772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048355) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048358) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048361) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:48:32.048365) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:48:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:48:32 np0005625204.localdomain podman[306049]: 2026-02-20 09:48:32.176829469 +0000 UTC m=+0.097832942 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 20 09:48:32 np0005625204.localdomain podman[306049]: 2026-02-20 09:48:32.216188404 +0000 UTC m=+0.137191907 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:48:32 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:48:32 np0005625204.localdomain ceph-mon[301857]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 20 09:48:34 np0005625204.localdomain ceph-mon[301857]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 20 09:48:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:48:34 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:36 np0005625204.localdomain ceph-mon[301857]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 20 09:48:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:36.637 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:38 np0005625204.localdomain ceph-mon[301857]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:48:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:40 np0005625204.localdomain ceph-mon[301857]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:48:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:41.639 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:48:42 np0005625204.localdomain systemd[292696]: Created slice User Background Tasks Slice.
Feb 20 09:48:42 np0005625204.localdomain ceph-mon[301857]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 20 09:48:42 np0005625204.localdomain systemd[292696]: Starting Cleanup of User's Temporary Files and Directories...
Feb 20 09:48:42 np0005625204.localdomain podman[306067]: 2026-02-20 09:48:42.768726659 +0000 UTC m=+0.705504707 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:48:42 np0005625204.localdomain systemd[292696]: Finished Cleanup of User's Temporary Files and Directories.
Feb 20 09:48:42 np0005625204.localdomain podman[306067]: 2026-02-20 09:48:42.801150916 +0000 UTC m=+0.737928954 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:48:42 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:48:43 np0005625204.localdomain sshd[306090]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:48:44 np0005625204.localdomain sshd[306090]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:48:44 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:45 np0005625204.localdomain ceph-mon[301857]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:46 np0005625204.localdomain ceph-mon[301857]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:46.642 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:48:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:48:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:48:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:48:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:48:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18293 "" "Go-http-client/1.1"
Feb 20 09:48:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:48:48 np0005625204.localdomain podman[306092]: 2026-02-20 09:48:48.136971593 +0000 UTC m=+0.076393862 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:48:48 np0005625204.localdomain podman[306092]: 2026-02-20 09:48:48.148112076 +0000 UTC m=+0.087534365 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:48:48 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:48:48 np0005625204.localdomain ceph-mon[301857]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.744 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:48:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:49.744 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:48:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:48:50 np0005625204.localdomain podman[306138]: 2026-02-20 09:48:50.144933489 +0000 UTC m=+0.079249668 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc.)
Feb 20 09:48:50 np0005625204.localdomain podman[306138]: 2026-02-20 09:48:50.158476422 +0000 UTC m=+0.092792641 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, version=9.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc.)
Feb 20 09:48:50 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:48:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:48:50 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/918867819' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.216 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.296 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.297 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:48:50 np0005625204.localdomain ceph-mon[301857]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/786824728' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/918867819' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.518 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.520 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11778MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.521 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.521 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.592 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.592 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.593 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:48:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:50.631 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:48:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:48:51 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/825169528' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.124 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.132 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.226 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.229 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.229 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.708s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:48:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/49463867' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/825169528' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.646 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.646 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.647 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:48:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:51.651 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:52 np0005625204.localdomain ceph-mon[301857]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3903298265' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:48:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:48:54 np0005625204.localdomain systemd[1]: tmp-crun.ihY7mx.mount: Deactivated successfully.
Feb 20 09:48:54 np0005625204.localdomain podman[306182]: 2026-02-20 09:48:54.140826403 +0000 UTC m=+0.076886986 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true)
Feb 20 09:48:54 np0005625204.localdomain systemd[1]: tmp-crun.FaTQhF.mount: Deactivated successfully.
Feb 20 09:48:54 np0005625204.localdomain podman[306183]: 2026-02-20 09:48:54.163924983 +0000 UTC m=+0.097340507 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.225 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.226 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.227 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:54 np0005625204.localdomain podman[306183]: 2026-02-20 09:48:54.249289372 +0000 UTC m=+0.182704936 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 20 09:48:54 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:48:54 np0005625204.localdomain podman[306182]: 2026-02-20 09:48:54.265014291 +0000 UTC m=+0.201074814 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:48:54 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:48:54 np0005625204.localdomain ceph-mon[301857]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1696424530' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:48:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:54.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:48:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.219 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.220 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.220 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.221 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.732 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.751 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.752 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.752 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.753 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:48:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:55.753 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:48:56 np0005625204.localdomain ceph-mon[301857]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:48:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:48:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:48:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:48:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:48:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:48:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:56.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:48:56.652 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:48:58 np0005625204.localdomain ceph-mon[301857]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:48:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:00 np0005625204.localdomain ceph-mon[301857]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:00 np0005625204.localdomain sshd[306225]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:49:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:01.651 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:01 np0005625204.localdomain sshd[306225]: Invalid user user from 86.99.116.54 port 36182
Feb 20 09:49:01 np0005625204.localdomain sshd[306225]: Received disconnect from 86.99.116.54 port 36182:11: Bye Bye [preauth]
Feb 20 09:49:01 np0005625204.localdomain sshd[306225]: Disconnected from invalid user user 86.99.116.54 port 36182 [preauth]
Feb 20 09:49:02 np0005625204.localdomain ceph-mon[301857]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3831557669' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:49:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3831557669' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:49:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:49:03 np0005625204.localdomain podman[306227]: 2026-02-20 09:49:03.156418368 +0000 UTC m=+0.089857794 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 20 09:49:03 np0005625204.localdomain podman[306227]: 2026-02-20 09:49:03.172582081 +0000 UTC m=+0.106021437 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:49:03 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:49:04 np0005625204.localdomain sshd[306246]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:49:04 np0005625204.localdomain ceph-mon[301857]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:05 np0005625204.localdomain sshd[306246]: Invalid user admin from 57.128.218.144 port 42394
Feb 20 09:49:05 np0005625204.localdomain sshd[306246]: Received disconnect from 57.128.218.144 port 42394:11: Bye Bye [preauth]
Feb 20 09:49:05 np0005625204.localdomain sshd[306246]: Disconnected from invalid user admin 57.128.218.144 port 42394 [preauth]
Feb 20 09:49:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:49:06.012 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:49:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:49:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:49:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:49:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:49:06 np0005625204.localdomain ceph-mon[301857]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:06.654 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:08 np0005625204.localdomain ceph-mon[301857]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:10 np0005625204.localdomain ceph-mon[301857]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:11.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:12 np0005625204.localdomain ceph-mon[301857]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:49:13 np0005625204.localdomain podman[306248]: 2026-02-20 09:49:13.129299389 +0000 UTC m=+0.068851906 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:49:13 np0005625204.localdomain podman[306248]: 2026-02-20 09:49:13.137814844 +0000 UTC m=+0.077367391 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:49:13 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:49:14 np0005625204.localdomain ceph-mon[301857]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:16.660 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:16 np0005625204.localdomain ceph-mon[301857]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:49:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:49:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:49:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:49:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:49:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18295 "" "Go-http-client/1.1"
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.207 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.234 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.235 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9153d565-cea2-443b-a551-f9356b5deb53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.208748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c91c4be-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '0b8d8f781f93177c68d2aac88f31e1c471d8fe5d6afb33997ea911f3d443ddce'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.208748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c91d4ae-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '7bf7da62ffac372a41aa8b06cfb31abe0a7b04c7226c28da3adf6a04afe6652d'}]}, 'timestamp': '2026-02-20 09:49:18.235794', '_unique_id': 'bbbec8174e974a1e9b9ba70bbac7dd70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.236 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.237 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.237 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.238 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b09932f-8881-420d-9c0e-b29eaf21b0b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.237829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c923598-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '2b125be1ee260173351675c9761406b0eac063bda3d51385e93fc64206333fd4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.237829', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c924556-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '6932083c680ddcd2697066784f85dbba65a2d1250a57cbd1ce956e722cfd0db3'}]}, 'timestamp': '2026-02-20 09:49:18.238673', '_unique_id': 'e058ee5d94cf462dba3cdce202f8491a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.239 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.240 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.244 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '580bb93f-c9b7-423a-b0fd-948c9ec8abb5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.240716', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9343ca-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '92dc00050e3d2a8447f7e873561a0c6974efd7b6daa8d9db26df3079f6349d6b'}]}, 'timestamp': '2026-02-20 09:49:18.245116', '_unique_id': 'ee1723eccb6242f0a7acd48bc621b2e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.245 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.246 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.263 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 15800000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ae4a6ce-1391-44eb-9918-f2fdfc578cc0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 15800000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:49:18.246479', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '6c96360c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.502865791, 'message_signature': '2e5dd20065c21d61d209b1f7f0f236121d9491f088a4d9cc193139382cac5534'}]}, 'timestamp': '2026-02-20 09:49:18.264569', '_unique_id': '726d96870fa541ec9328706e261f4631'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.265 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '51dbc5ae-bfc8-406a-a7f2-61f7e1c2c6ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.267326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c96b6a4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '083da58d8722f6fd30dbf6e3fbab1ff04fabb7fffcf833a9d1cfdbc825fe2f37'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.267326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c96c892-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '791a7d506aae2e382f00b74b8114aa5cf33b0ebcf44f44c4228d7aee047e7a84'}]}, 'timestamp': '2026-02-20 09:49:18.268230', '_unique_id': '27f7d8782acc4194b3d26cba9338f034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.270 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c2cd70e-93d0-41b8-8496-c8aa34ada277', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.270409', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c972fee-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'ce2148c854ef5a44483eea4496078eefddbd39e391788febb8a031f8a2d6b997'}]}, 'timestamp': '2026-02-20 09:49:18.270906', '_unique_id': '3e9cc1f7a2974759bad3e43af9d96cfa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.271 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.272 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.273 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '549cd643-09a9-4331-9fef-9421ee99a5cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.273020', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9794a2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'f8c3a77bd0d892b39caf7ff2aafc32f10a6bce68eaeb74a0be0a6ef73e1aac77'}]}, 'timestamp': '2026-02-20 09:49:18.273478', '_unique_id': 'e55452a4d75142308bdf97a708106822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.275 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f13408c-ec8a-475c-a416-4c00db345aff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.275598', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c97fa8c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'fb0be74a5abd238be6c6187ccfad50b65e29ad8727da8847bc2256a1995f106e'}]}, 'timestamp': '2026-02-20 09:49:18.276087', '_unique_id': '0b5ba298f1b74d9b998da84cef5cc418'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.277 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.278 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '61ec3dab-db04-42c2-8f15-36725250ef03', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.278515', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c986d14-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': 'fac0663a8752a1ac6b06e62c99e9729be9a401e712dd91abe5252b3dac8bd6b8'}]}, 'timestamp': '2026-02-20 09:49:18.279036', '_unique_id': 'c15b992621ff4b8cace54963a6643fc1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.291 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.292 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b5e42ee-3c0b-4466-b558-18e837b888e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.281146', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9a72da-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': 'b97b762e1e5727b81c1d5e4af9e68090d858458c4df3d07c903b8d2f215e08e7'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.281146', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9a8572-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': 'b0806a3e25d936ead7c5093668c95d10af29459ad3a95ff096ae0e9e8ad21927'}]}, 'timestamp': '2026-02-20 09:49:18.292765', '_unique_id': 'c03dc62dd0b944d4b9a4a9cc0e40cfac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.294 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.295 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.295 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd02d59a1-c068-4aa3-9db2-df0dacd98d18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:49:18.295365', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '6c9afd90-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.502865791, 'message_signature': 'd2621df43352528d294302bfedc298f575ab5acd5e739ac0c08864ccaa8fb90b'}]}, 'timestamp': '2026-02-20 09:49:18.295836', '_unique_id': '70a4e4820ed149d88e97d5eda5eb21d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.298 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.299 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '115920db-b60d-4fd6-8b0c-74ad20f29a75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.298543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9b7afe-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '66f94a6bb81db6d77a21ac7804efdcbf189714a6872105aaadad6982052ad7ad'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.298543', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9b8b16-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': 'a32485268c150bd933ad09fc3c87161c7c931ab6a96b2ad145c56446f8daa464'}]}, 'timestamp': '2026-02-20 09:49:18.299451', '_unique_id': '301914c38b6d4e85a0476aecc0593d75'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '487f7958-56d0-4ab7-a0fe-09877f476ccf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.301568', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9bf132-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '8d93a5b83e53040e64c9e520766fa8b56da787ea9fcb201fb363b9cfacae0444'}]}, 'timestamp': '2026-02-20 09:49:18.302061', '_unique_id': '26b857e38d73424d8d06981911415f79'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '057082e9-764b-4065-a9f8-7515c4dd2255', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.304143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9c53f2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': '4195a9a941c32752f471fdf71b3442e36e0be8198f35cede9dfe27087df44317'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.304143', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9c6518-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': 'c7acbad9aaea2b2e386b090ac531a39f00541f7d6ebaef8960d28ac4170100cb'}]}, 'timestamp': '2026-02-20 09:49:18.305001', '_unique_id': '9a0bc337a5294a6480719f518f1a2bde'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.307 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e7820c9-a46a-4d0a-a169-e64f3c607d4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.307289', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9ccf58-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '0e1af3938816ad6aa2ccc83f07ad23dddd79adff69b3516cb7d0afaf8c2c6006'}]}, 'timestamp': '2026-02-20 09:49:18.307779', '_unique_id': '37cfedd0a3fb44428b949f195430ddaf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.310 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5d998d51-87b0-487f-b4fe-587638f8345e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.309902', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9d34f2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': '3d3087dd44bef3a9299bda9bcf4182613e2bec736b53fc2b98daa8e4771b5b3c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.309902', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9d44ec-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.520356243, 'message_signature': '40cf9c4ea41b02da3d538481d74198539f131b0ca7d8f5835eee8d67375d2cc2'}]}, 'timestamp': '2026-02-20 09:49:18.310758', '_unique_id': '76e19bf3857a493db4f0a1872c6ee227'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.311 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.312 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '221cfeee-1735-42c4-9df2-3eff736a1da3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.313062', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9db094-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '631832dbdb8aac4c786003e2d4edcd99ca318d044436164efada5be900da45c2'}]}, 'timestamp': '2026-02-20 09:49:18.313516', '_unique_id': '0000feb62b52433ea4bf9897f0024477'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.314 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.315 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.315 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32f342fc-241e-4c1c-ba83-6609b4496d89', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.315581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9e13ea-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': 'a6e42984c09e650aabdb39a28360faf88b09553a0b46e9e4680cfab5cef5e62a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.315581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9e1e8a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '1e57c6bb30f617259e72e10424b63a4320d8718214ca158c252080b437490973'}]}, 'timestamp': '2026-02-20 09:49:18.316231', '_unique_id': '032e68c5f5b5400eb6e023ac50896179'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.316 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.317 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd9b92159-9db7-47a6-a700-f50d56803003', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:49:18.317743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6c9e644e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': '2c5e3a6282aee253099de6b7cfc53dff3a1e238faca11905926ed73ce1bccb20'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:49:18.317743', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6c9e6e94-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.447949971, 'message_signature': 'c705231169b01982323a8a99824b1145f179a61f3591f40042deb3e405a91547'}]}, 'timestamp': '2026-02-20 09:49:18.318279', '_unique_id': '03a78aaa904b44a183e6d4048b3db869'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae135d58-f63e-4512-96a5-f96018c88480', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.319706', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9eb16a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '5be8a17acbd60a41b0b9f339f7ad2074176fb5f74896bf8094efb154d68c0236'}]}, 'timestamp': '2026-02-20 09:49:18.320011', '_unique_id': '55949289ee9b43f7aecefee9bade305f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '792f3301-bf55-487e-82b2-3f55478d166d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:49:18.321534', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6c9ef95e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11517.479913336, 'message_signature': '6eef2927f9afe77db1a657ebebdfda84cebdedc1b3603d8c3cb262347a6ebd1a'}]}, 'timestamp': '2026-02-20 09:49:18.321853', '_unique_id': '6836aa72ee1a4627a0f2bd88b5187376'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:49:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:49:18.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:49:18 np0005625204.localdomain ceph-mon[301857]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:49:19 np0005625204.localdomain podman[306271]: 2026-02-20 09:49:19.156195641 +0000 UTC m=+0.092056300 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:49:19 np0005625204.localdomain podman[306271]: 2026-02-20 09:49:19.168306482 +0000 UTC m=+0.104167131 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:49:19 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:49:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:20 np0005625204.localdomain ceph-mon[301857]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:49:21 np0005625204.localdomain podman[306296]: 2026-02-20 09:49:21.137160004 +0000 UTC m=+0.077217587 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-type=git, config_id=openstack_network_exporter, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:49:21 np0005625204.localdomain podman[306296]: 2026-02-20 09:49:21.145187324 +0000 UTC m=+0.085244727 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.7, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:49:21 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:49:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:21.663 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:22 np0005625204.localdomain ceph-mon[301857]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:49:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:49:25 np0005625204.localdomain podman[306316]: 2026-02-20 09:49:25.955568859 +0000 UTC m=+0.040199482 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller)
Feb 20 09:49:25 np0005625204.localdomain podman[306316]: 2026-02-20 09:49:25.979992879 +0000 UTC m=+0.064623522 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:49:25 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:49:26 np0005625204.localdomain podman[306317]: 2026-02-20 09:49:26.014756747 +0000 UTC m=+0.097077880 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:49:26 np0005625204.localdomain podman[306317]: 2026-02-20 09:49:26.048041931 +0000 UTC m=+0.130363104 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:49:26 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:49:26 np0005625204.localdomain ceph-mon[301857]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:49:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:49:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:49:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:49:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:49:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:49:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:26.665 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:26.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:27 np0005625204.localdomain ceph-mon[301857]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:28 np0005625204.localdomain ceph-mon[301857]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:29 np0005625204.localdomain sshd[306360]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:49:29 np0005625204.localdomain sshd[306360]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:49:30 np0005625204.localdomain ceph-mon[301857]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:31 np0005625204.localdomain sudo[306362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:49:31 np0005625204.localdomain sudo[306362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:49:31 np0005625204.localdomain sudo[306362]: pam_unix(sudo:session): session closed for user root
Feb 20 09:49:31 np0005625204.localdomain sudo[306380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:49:31 np0005625204.localdomain sudo[306380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:49:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:31.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:31 np0005625204.localdomain sudo[306380]: pam_unix(sudo:session): session closed for user root
Feb 20 09:49:32 np0005625204.localdomain sudo[306430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:49:32 np0005625204.localdomain sudo[306430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:49:32 np0005625204.localdomain sudo[306430]: pam_unix(sudo:session): session closed for user root
Feb 20 09:49:32 np0005625204.localdomain ceph-mon[301857]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:49:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:49:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:49:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:49:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:49:34 np0005625204.localdomain podman[306448]: 2026-02-20 09:49:34.149799018 +0000 UTC m=+0.083378901 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Feb 20 09:49:34 np0005625204.localdomain podman[306448]: 2026-02-20 09:49:34.159290861 +0000 UTC m=+0.092870744 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:49:34 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:49:34 np0005625204.localdomain ceph-mon[301857]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:49:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:36 np0005625204.localdomain ceph-mon[301857]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:36.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:38 np0005625204.localdomain ceph-mon[301857]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:40 np0005625204.localdomain ceph-mon[301857]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:41.673 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:49:42 np0005625204.localdomain ceph-mon[301857]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:49:44 np0005625204.localdomain podman[306466]: 2026-02-20 09:49:44.128207181 +0000 UTC m=+0.068968881 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:49:44 np0005625204.localdomain podman[306466]: 2026-02-20 09:49:44.140074365 +0000 UTC m=+0.080836085 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:49:44 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:49:44 np0005625204.localdomain ceph-mon[301857]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:46 np0005625204.localdomain ceph-mon[301857]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:46.677 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:49:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5069 writes, 22K keys, 5069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5069 writes, 696 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 90 writes, 307 keys, 90 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 90 writes, 39 syncs, 2.31 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:49:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:49:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:49:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:49:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:49:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:49:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18294 "" "Go-http-client/1.1"
Feb 20 09:49:48 np0005625204.localdomain ceph-mon[301857]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:49:50 np0005625204.localdomain sshd[306500]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:49:50 np0005625204.localdomain systemd[1]: tmp-crun.nXONmp.mount: Deactivated successfully.
Feb 20 09:49:50 np0005625204.localdomain podman[306488]: 2026-02-20 09:49:50.157723851 +0000 UTC m=+0.095918135 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:49:50 np0005625204.localdomain podman[306488]: 2026-02-20 09:49:50.166216014 +0000 UTC m=+0.104410298 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:49:50 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:49:50 np0005625204.localdomain ceph-mon[301857]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1780740115' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2135499472' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:50.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:49:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5997 writes, 25K keys, 5997 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5997 writes, 930 syncs, 6.45 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 177 writes, 364 keys, 177 commit groups, 1.0 writes per commit group, ingest: 0.34 MB, 0.00 MB/s
                                                          Interval WAL: 177 writes, 85 syncs, 2.08 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:49:51 np0005625204.localdomain sshd[306500]: Received disconnect from 188.166.218.64 port 37954:11: Bye Bye [preauth]
Feb 20 09:49:51 np0005625204.localdomain sshd[306500]: Disconnected from authenticating user root 188.166.218.64 port 37954 [preauth]
Feb 20 09:49:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:49:51 np0005625204.localdomain podman[306514]: 2026-02-20 09:49:51.661624797 +0000 UTC m=+0.079676710 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 09:49:51 np0005625204.localdomain podman[306514]: 2026-02-20 09:49:51.678190841 +0000 UTC m=+0.096242774 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9)
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.678 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.681 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.684 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:51 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.738 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.738 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.738 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.739 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:49:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:51.739 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:49:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:49:52 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1972084818' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.229 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.315 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.315 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.515 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.517 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11775MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.518 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.518 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.585 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.586 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.586 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.627 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:49:52 np0005625204.localdomain ceph-mon[301857]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1972084818' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:49:52.965 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:49:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:49:52.965 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:49:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:52.991 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:49:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1147410803' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:53.087 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:49:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:53.094 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:49:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:53.110 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:49:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:53.112 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:49:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:53.113 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:49:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1147410803' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:54.109 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:54.110 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:54.127 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:54.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:54.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:54 np0005625204.localdomain ceph-mon[301857]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:49:54 np0005625204.localdomain ceph-mon[301857]: mgrmap e46: np0005625202.arwxwo(active, since 90s), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:49:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:49:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2042522384' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.793 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.794 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.794 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:49:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:55.795 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:49:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:49:56 np0005625204.localdomain podman[306578]: 2026-02-20 09:49:56.142683971 +0000 UTC m=+0.078572497 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:49:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:49:56 np0005625204.localdomain podman[306578]: 2026-02-20 09:49:56.185028626 +0000 UTC m=+0.120917182 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Feb 20 09:49:56 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:49:56 np0005625204.localdomain podman[306597]: 2026-02-20 09:49:56.243888913 +0000 UTC m=+0.082880616 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:49:56 np0005625204.localdomain podman[306597]: 2026-02-20 09:49:56.274950881 +0000 UTC m=+0.113942554 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent)
Feb 20 09:49:56 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:49:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:56.337 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:49:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:56.358 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:49:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:56.358 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:49:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:49:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:49:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:49:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:49:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:49:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:49:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:56.683 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:49:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:56.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:49:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:49:56.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:49:56 np0005625204.localdomain ceph-mon[301857]: pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:49:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4152313636' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:49:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:49:57.968 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:49:58 np0005625204.localdomain ceph-mon[301857]: pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 588 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 20 09:49:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e92 e92: 6 total, 6 up, 6 in
Feb 20 09:50:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:00 np0005625204.localdomain ceph-mon[301857]: pgmap v51: 177 pgs: 177 active+clean; 121 MiB data, 624 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s rd, 1.3 MiB/s wr, 12 op/s
Feb 20 09:50:00 np0005625204.localdomain ceph-mon[301857]: osdmap e92: 6 total, 6 up, 6 in
Feb 20 09:50:00 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 09:50:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:01.687 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:01.689 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:01.690 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:01.690 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:01.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:01.723 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 e93: 6 total, 6 up, 6 in
Feb 20 09:50:02 np0005625204.localdomain ceph-mon[301857]: pgmap v53: 177 pgs: 177 active+clean; 125 MiB data, 632 MiB used, 41 GiB / 42 GiB avail; 9.9 KiB/s rd, 2.0 MiB/s wr, 14 op/s
Feb 20 09:50:02 np0005625204.localdomain ceph-mon[301857]: osdmap e93: 6 total, 6 up, 6 in
Feb 20 09:50:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/188009214' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:50:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/188009214' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:50:04 np0005625204.localdomain ceph-mon[301857]: pgmap v55: 177 pgs: 177 active+clean; 125 MiB data, 632 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 2.6 MiB/s wr, 18 op/s
Feb 20 09:50:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:50:05 np0005625204.localdomain systemd[1]: tmp-crun.N4ssJV.mount: Deactivated successfully.
Feb 20 09:50:05 np0005625204.localdomain podman[306619]: 2026-02-20 09:50:05.148520799 +0000 UTC m=+0.086633518 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:50:05 np0005625204.localdomain podman[306619]: 2026-02-20 09:50:05.186209145 +0000 UTC m=+0.124321824 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 20 09:50:05 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:50:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:06.013 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:50:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:50:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:50:06 np0005625204.localdomain sshd[306638]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:50:06 np0005625204.localdomain sshd[306638]: Invalid user dixi from 18.221.252.160 port 59194
Feb 20 09:50:06 np0005625204.localdomain sshd[306638]: Received disconnect from 18.221.252.160 port 59194:11: Bye Bye [preauth]
Feb 20 09:50:06 np0005625204.localdomain sshd[306638]: Disconnected from invalid user dixi 18.221.252.160 port 59194 [preauth]
Feb 20 09:50:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:06.724 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:06 np0005625204.localdomain ceph-mon[301857]: pgmap v56: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 5.1 MiB/s wr, 47 op/s
Feb 20 09:50:08 np0005625204.localdomain ceph-mon[301857]: pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 3.1 MiB/s wr, 29 op/s
Feb 20 09:50:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:10 np0005625204.localdomain ceph-mon[301857]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.1 MiB/s wr, 24 op/s
Feb 20 09:50:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:11.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:11.730 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:11.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:11.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:11.765 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:11.766 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:12 np0005625204.localdomain ceph-mon[301857]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 2.0 MiB/s wr, 23 op/s
Feb 20 09:50:14 np0005625204.localdomain sshd[306640]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:50:14 np0005625204.localdomain sshd[306640]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:50:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:50:14 np0005625204.localdomain podman[306642]: 2026-02-20 09:50:14.620660085 +0000 UTC m=+0.084317499 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:50:14 np0005625204.localdomain podman[306642]: 2026-02-20 09:50:14.633056765 +0000 UTC m=+0.096714209 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:50:14 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:50:15 np0005625204.localdomain ceph-mon[301857]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 1.8 MiB/s wr, 20 op/s
Feb 20 09:50:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:16.766 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:16.768 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:16.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:16.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:16.821 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:16.822 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:17 np0005625204.localdomain ceph-mon[301857]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 1.7 MiB/s wr, 19 op/s
Feb 20 09:50:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:50:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:50:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:50:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:50:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:50:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18293 "" "Go-http-client/1.1"
Feb 20 09:50:19 np0005625204.localdomain ceph-mon[301857]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:50:21 np0005625204.localdomain ceph-mon[301857]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:21 np0005625204.localdomain podman[306665]: 2026-02-20 09:50:21.144881857 +0000 UTC m=+0.082306459 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:50:21 np0005625204.localdomain podman[306665]: 2026-02-20 09:50:21.178948915 +0000 UTC m=+0.116373517 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:50:21 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:50:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:21.823 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:21.860 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:21.860 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5038 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:21.861 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:21.862 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:21.863 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:21 np0005625204.localdomain sshd[306689]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:50:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:50:22 np0005625204.localdomain podman[306691]: 2026-02-20 09:50:22.150409712 +0000 UTC m=+0.082452003 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter)
Feb 20 09:50:22 np0005625204.localdomain podman[306691]: 2026-02-20 09:50:22.193097067 +0000 UTC m=+0.125139358 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, release=1770267347, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:50:22 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:50:22 np0005625204.localdomain sshd[306689]: Invalid user manel from 154.91.170.41 port 59836
Feb 20 09:50:22 np0005625204.localdomain sshd[306689]: Received disconnect from 154.91.170.41 port 59836:11: Bye Bye [preauth]
Feb 20 09:50:22 np0005625204.localdomain sshd[306689]: Disconnected from invalid user manel 154.91.170.41 port 59836 [preauth]
Feb 20 09:50:23 np0005625204.localdomain ceph-mon[301857]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:25 np0005625204.localdomain ceph-mon[301857]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:26 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:26Z|00069|memory_trim|INFO|Detected inactivity (last active 30024 ms ago): trimming memory
Feb 20 09:50:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:50:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:50:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:50:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:50:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:50:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:50:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:26.897 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:26.899 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:26.899 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5036 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:26.900 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:26.901 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:26.902 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:50:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:50:27 np0005625204.localdomain ceph-mon[301857]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:27 np0005625204.localdomain podman[306711]: 2026-02-20 09:50:27.14798154 +0000 UTC m=+0.079908906 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:50:27 np0005625204.localdomain systemd[1]: tmp-crun.G8NbrI.mount: Deactivated successfully.
Feb 20 09:50:27 np0005625204.localdomain podman[306712]: 2026-02-20 09:50:27.214505427 +0000 UTC m=+0.150381402 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:50:27 np0005625204.localdomain podman[306712]: 2026-02-20 09:50:27.225045042 +0000 UTC m=+0.160921057 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:50:27 np0005625204.localdomain sshd[306752]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:50:27 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:50:27 np0005625204.localdomain podman[306711]: 2026-02-20 09:50:27.266396327 +0000 UTC m=+0.198323673 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:50:27 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:50:28 np0005625204.localdomain sshd[306752]: Invalid user sol from 45.148.10.240 port 60744
Feb 20 09:50:28 np0005625204.localdomain sshd[306752]: Connection closed by invalid user sol 45.148.10.240 port 60744 [preauth]
Feb 20 09:50:29 np0005625204.localdomain ceph-mon[301857]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:31 np0005625204.localdomain ceph-mon[301857]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:31.902 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:31.903 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:31.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:31.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:31.904 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:31.908 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:32 np0005625204.localdomain sudo[306754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:50:32 np0005625204.localdomain sudo[306754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:50:32 np0005625204.localdomain sudo[306754]: pam_unix(sudo:session): session closed for user root
Feb 20 09:50:32 np0005625204.localdomain sudo[306772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:50:32 np0005625204.localdomain sudo[306772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:50:33 np0005625204.localdomain sudo[306772]: pam_unix(sudo:session): session closed for user root
Feb 20 09:50:33 np0005625204.localdomain ceph-mon[301857]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:33 np0005625204.localdomain sudo[306823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:50:33 np0005625204.localdomain sudo[306823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:50:33 np0005625204.localdomain sudo[306823]: pam_unix(sudo:session): session closed for user root
Feb 20 09:50:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:50:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:50:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:50:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:50:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:50:35 np0005625204.localdomain ceph-mon[301857]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:50:36 np0005625204.localdomain systemd[1]: tmp-crun.mTJawC.mount: Deactivated successfully.
Feb 20 09:50:36 np0005625204.localdomain podman[306841]: 2026-02-20 09:50:36.17781094 +0000 UTC m=+0.102427269 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:50:36 np0005625204.localdomain podman[306841]: 2026-02-20 09:50:36.192680724 +0000 UTC m=+0.117297043 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:50:36 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:50:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:36.908 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:37 np0005625204.localdomain ceph-mon[301857]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:39 np0005625204.localdomain ceph-mon[301857]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:40 np0005625204.localdomain ceph-mon[301857]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:41.909 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:41.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:41.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:41.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:41.950 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:41.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:42 np0005625204.localdomain ceph-mon[301857]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:42.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:42.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 09:50:44 np0005625204.localdomain ceph-mon[301857]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:44.905 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpd2d1_9_k/privsep.sock']
Feb 20 09:50:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:50:45 np0005625204.localdomain podman[306863]: 2026-02-20 09:50:45.18252836 +0000 UTC m=+0.122674365 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:50:45 np0005625204.localdomain podman[306863]: 2026-02-20 09:50:45.188405205 +0000 UTC m=+0.128551170 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:50:45 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:50:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.528 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:50:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.418 306887 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:50:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.422 306887 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:50:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.425 306887 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 20 09:50:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:45.426 306887 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306887
Feb 20 09:50:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.069 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpy7d1rb82/privsep.sock']
Feb 20 09:50:46 np0005625204.localdomain ceph-mon[301857]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.672 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:50:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.563 306896 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:50:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.568 306896 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:50:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.571 306896 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 20 09:50:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:46.571 306896 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306896
Feb 20 09:50:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:46.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:46.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:50:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:46.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:50:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:46.955 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:46.993 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:46.994 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:50:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:47.275 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:47 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:47.561 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpzojxzdwa/privsep.sock']
Feb 20 09:50:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:50:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:50:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:50:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:50:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:50:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18298 "" "Go-http-client/1.1"
Feb 20 09:50:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.173 264355 INFO oslo.privsep.daemon [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Spawned new privsep daemon via rootwrap
Feb 20 09:50:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.066 306908 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 20 09:50:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.071 306908 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 20 09:50:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.074 306908 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 20 09:50:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:48.074 306908 INFO oslo.privsep.daemon [-] privsep daemon running as pid 306908
Feb 20 09:50:48 np0005625204.localdomain ceph-mon[301857]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:49.551 264355 INFO neutron.agent.linux.ip_lib [None req-bc0e12f9-30aa-4092-b2f0-013756f32f9d - - - - - -] Device tap31575aad-f0 cannot be used as it has no MAC address
Feb 20 09:50:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:49.627 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:49 np0005625204.localdomain kernel: device tap31575aad-f0 entered promiscuous mode
Feb 20 09:50:49 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581049.6405] manager: (tap31575aad-f0): new Generic device (/org/freedesktop/NetworkManager/Devices/17)
Feb 20 09:50:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:49Z|00070|binding|INFO|Claiming lport 31575aad-f0ba-4bf3-a41b-1370b560a95e for this chassis.
Feb 20 09:50:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:49Z|00071|binding|INFO|31575aad-f0ba-4bf3-a41b-1370b560a95e: Claiming unknown
Feb 20 09:50:49 np0005625204.localdomain systemd-udevd[306923]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:50:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:49.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:49.655 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a71c85068454599bd460f60dda32410', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=019bf044-9ed4-4b4f-84fd-b8df4eb6e270, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=31575aad-f0ba-4bf3-a41b-1370b560a95e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:50:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:49.657 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 31575aad-f0ba-4bf3-a41b-1370b560a95e in datapath f5fde837-7459-455b-ad78-388d579e00e0 bound to our chassis
Feb 20 09:50:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:49.660 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 971d15a3-6eaf-4bc0-b431-214a710c6d28 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:50:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:49.660 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5fde837-7459-455b-ad78-388d579e00e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:50:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:49.664 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b49cd48f-5e27-49c7-9483-d62b8f6582ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: libvirt version: 11.10.0, package: 4.el9 (builder@centos.org, 2026-01-29-15:25:17, )
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: hostname: np0005625204.localdomain
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:49Z|00072|binding|INFO|Setting lport 31575aad-f0ba-4bf3-a41b-1370b560a95e ovn-installed in OVS
Feb 20 09:50:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:49Z|00073|binding|INFO|Setting lport 31575aad-f0ba-4bf3-a41b-1370b560a95e up in Southbound
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:49.675 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap31575aad-f0: No such device
Feb 20 09:50:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:49.719 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:49.749 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:49.766 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:50 np0005625204.localdomain ceph-mon[301857]: pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1559223443' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1430418607' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:50 np0005625204.localdomain podman[306996]: 
Feb 20 09:50:50 np0005625204.localdomain podman[306996]: 2026-02-20 09:50:50.666400188 +0000 UTC m=+0.063523688 container create cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:50:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:50 np0005625204.localdomain systemd[1]: Started libpod-conmon-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099.scope.
Feb 20 09:50:50 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:50:50 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/64f6be432043f77aaa1f4774174218e0ed969c67419b29f51e1d1f5ed1490642/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:50:50 np0005625204.localdomain podman[306996]: 2026-02-20 09:50:50.638701181 +0000 UTC m=+0.035824771 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:50:50 np0005625204.localdomain podman[306996]: 2026-02-20 09:50:50.744112367 +0000 UTC m=+0.141235867 container init cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:50:50 np0005625204.localdomain podman[306996]: 2026-02-20 09:50:50.758735705 +0000 UTC m=+0.155859245 container start cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:50:50 np0005625204.localdomain dnsmasq[307014]: started, version 2.85 cachesize 150
Feb 20 09:50:50 np0005625204.localdomain dnsmasq[307014]: DNS service limited to local subnets
Feb 20 09:50:50 np0005625204.localdomain dnsmasq[307014]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:50:50 np0005625204.localdomain dnsmasq[307014]: warning: no upstream servers configured
Feb 20 09:50:50 np0005625204.localdomain dnsmasq-dhcp[307014]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:50:50 np0005625204.localdomain dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 0 addresses
Feb 20 09:50:50 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host
Feb 20 09:50:50 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts
Feb 20 09:50:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:50.817 264355 INFO neutron.agent.dhcp.agent [None req-6a378441-ebbf-4433-970c-af8645e50a93 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:48Z, description=, device_id=b5d0538d-0ab3-4d2a-a4dc-0d49c2ca7aa5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5b7ebe0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5b7eb80>], id=97f20640-625d-4622-93e9-d2725d104851, ip_allocation=immediate, mac_address=fa:16:3e:78:b2:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:43Z, description=, dns_domain=, id=f5fde837-7459-455b-ad78-388d579e00e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1091425961-network, port_security_enabled=True, project_id=5a71c85068454599bd460f60dda32410, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46902, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=251, status=ACTIVE, subnets=['ca5de6a4-7fc3-43ac-b6e5-6653e73f1e70'], tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:44Z, vlan_transparent=None, network_id=f5fde837-7459-455b-ad78-388d579e00e0, port_security_enabled=False, project_id=5a71c85068454599bd460f60dda32410, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=320, status=DOWN, tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:48Z on network f5fde837-7459-455b-ad78-388d579e00e0
Feb 20 09:50:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:50.899 264355 INFO neutron.agent.dhcp.agent [None req-8ed23602-a787-4725-b8c8-510121196255 - - - - - -] DHCP configuration for ports {'8d76b0e0-7a09-42a5-96b9-2313d0cff9d6'} is completed
Feb 20 09:50:51 np0005625204.localdomain dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 1 addresses
Feb 20 09:50:51 np0005625204.localdomain podman[307032]: 2026-02-20 09:50:51.034376745 +0000 UTC m=+0.066617940 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:50:51 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host
Feb 20 09:50:51 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts
Feb 20 09:50:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.158 264355 INFO neutron.agent.dhcp.agent [None req-ab33f320-70c7-48d3-9cdd-d05cd384fb6f - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:48Z, description=, device_id=b5d0538d-0ab3-4d2a-a4dc-0d49c2ca7aa5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a62a90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a62790>], id=97f20640-625d-4622-93e9-d2725d104851, ip_allocation=immediate, mac_address=fa:16:3e:78:b2:17, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:43Z, description=, dns_domain=, id=f5fde837-7459-455b-ad78-388d579e00e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPDetailsNegativeTestJSON-1091425961-network, port_security_enabled=True, project_id=5a71c85068454599bd460f60dda32410, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46902, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=251, status=ACTIVE, subnets=['ca5de6a4-7fc3-43ac-b6e5-6653e73f1e70'], tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:44Z, vlan_transparent=None, network_id=f5fde837-7459-455b-ad78-388d579e00e0, port_security_enabled=False, project_id=5a71c85068454599bd460f60dda32410, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=320, status=DOWN, tags=[], tenant_id=5a71c85068454599bd460f60dda32410, updated_at=2026-02-20T09:50:48Z on network f5fde837-7459-455b-ad78-388d579e00e0
Feb 20 09:50:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.309 264355 INFO neutron.agent.dhcp.agent [None req-52186cb0-788d-4113-87a9-aa379704d383 - - - - - -] DHCP configuration for ports {'97f20640-625d-4622-93e9-d2725d104851'} is completed
Feb 20 09:50:51 np0005625204.localdomain dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 1 addresses
Feb 20 09:50:51 np0005625204.localdomain podman[307069]: 2026-02-20 09:50:51.459081007 +0000 UTC m=+0.066258360 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:50:51 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host
Feb 20 09:50:51 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts
Feb 20 09:50:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:50:51 np0005625204.localdomain podman[307087]: 2026-02-20 09:50:51.629533626 +0000 UTC m=+0.072606928 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:50:51 np0005625204.localdomain podman[307087]: 2026-02-20 09:50:51.643164093 +0000 UTC m=+0.086237435 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:50:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.653 264355 INFO neutron.agent.dhcp.agent [None req-718f11f1-e233-41cd-a860-e1e2cf44efbe - - - - - -] DHCP configuration for ports {'97f20640-625d-4622-93e9-d2725d104851'} is completed
Feb 20 09:50:51 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.743 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:50:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:51.763 264355 INFO neutron.agent.linux.ip_lib [None req-bbba7d19-4bf4-4398-86f3-888da9f43e3e - - - - - -] Device tap9906e141-c3 cannot be used as it has no MAC address
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:51 np0005625204.localdomain kernel: device tap9906e141-c3 entered promiscuous mode
Feb 20 09:50:51 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581051.8364] manager: (tap9906e141-c3): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Feb 20 09:50:51 np0005625204.localdomain systemd-udevd[306925]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:50:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:51Z|00074|binding|INFO|Claiming lport 9906e141-c388-453f-9169-7c98a351db5e for this chassis.
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.838 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:51Z|00075|binding|INFO|9906e141-c388-453f-9169-7c98a351db5e: Claiming unknown
Feb 20 09:50:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:51.848 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=9906e141-c388-453f-9169-7c98a351db5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:50:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:51.850 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 9906e141-c388-453f-9169-7c98a351db5e in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 bound to our chassis
Feb 20 09:50:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:51Z|00076|binding|INFO|Setting lport 9906e141-c388-453f-9169-7c98a351db5e ovn-installed in OVS
Feb 20 09:50:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:51Z|00077|binding|INFO|Setting lport 9906e141-c388-453f-9169-7c98a351db5e up in Southbound
Feb 20 09:50:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:51.852 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8d59c69c-3a69-449e-9d36-233c1f4c5c30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:50:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:51.852 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.853 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:51.853 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5169d689-d1d1-4c0b-be6f-01d9e1c84395]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.870 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.903 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.938 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:51.995 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.228 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.291 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.312 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.312 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.560 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.562 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11475MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.563 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.563 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:50:52 np0005625204.localdomain ceph-mon[301857]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2038880473' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.654 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.655 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.655 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:50:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:52.710 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:50:52 np0005625204.localdomain podman[307201]: 
Feb 20 09:50:52 np0005625204.localdomain podman[307201]: 2026-02-20 09:50:52.833101685 +0000 UTC m=+0.081561617 container create a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:50:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:50:52 np0005625204.localdomain systemd[1]: Started libpod-conmon-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7.scope.
Feb 20 09:50:52 np0005625204.localdomain podman[307201]: 2026-02-20 09:50:52.784629827 +0000 UTC m=+0.033089849 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:50:52 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:50:52 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b7f60ded7c720c0eef64273aac4a3860430b8d407f2c1fb2fc8c0fb6a9fb560/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:50:52 np0005625204.localdomain podman[307201]: 2026-02-20 09:50:52.907492036 +0000 UTC m=+0.155951998 container init a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:50:52 np0005625204.localdomain podman[307201]: 2026-02-20 09:50:52.91801965 +0000 UTC m=+0.166479622 container start a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:50:52 np0005625204.localdomain dnsmasq[307248]: started, version 2.85 cachesize 150
Feb 20 09:50:52 np0005625204.localdomain dnsmasq[307248]: DNS service limited to local subnets
Feb 20 09:50:52 np0005625204.localdomain dnsmasq[307248]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:50:52 np0005625204.localdomain dnsmasq[307248]: warning: no upstream servers configured
Feb 20 09:50:52 np0005625204.localdomain dnsmasq-dhcp[307248]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:50:52 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 0 addresses
Feb 20 09:50:52 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:50:52 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:50:52 np0005625204.localdomain podman[307233]: 2026-02-20 09:50:52.976563088 +0000 UTC m=+0.101051038 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, vcs-type=git, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, version=9.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Feb 20 09:50:52 np0005625204.localdomain podman[307233]: 2026-02-20 09:50:52.990429132 +0000 UTC m=+0.114917102 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1770267347, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, architecture=x86_64, build-date=2026-02-05T04:57:10Z, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:50:53 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:50:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:53.147 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:50:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:53.153 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:50:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:53.162 264355 INFO neutron.agent.dhcp.agent [None req-c3ae247d-f4e6-4f75-a064-f1a1e271b441 - - - - - -] DHCP configuration for ports {'2b93bbc2-5aeb-49cc-b610-6f4f7708d346'} is completed
Feb 20 09:50:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:50:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1986532880' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.186 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.199 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.206 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.220 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.223 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.223 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:50:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:53.224 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1986532880' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:54.201 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:53Z, description=, device_id=0c77eb17-66e6-4aa0-8b78-169b259339e9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d2370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d2fa0>], id=034bd422-60dc-4b3b-ba25-f380892caff4, ip_allocation=immediate, mac_address=fa:16:3e:14:5b:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:49Z, description=, dns_domain=, id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-982155183-network, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=331, status=ACTIVE, subnets=['c9423f67-342b-44f2-ac81-92ef706f7aa6'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:50Z, vlan_transparent=None, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=False, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=372, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:53Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.229 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.230 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.230 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:54 np0005625204.localdomain podman[307278]: 2026-02-20 09:50:54.443320076 +0000 UTC m=+0.071507887 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:50:54 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 1 addresses
Feb 20 09:50:54 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:50:54 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:50:54 np0005625204.localdomain ceph-mon[301857]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 09:50:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:54.726 264355 INFO neutron.agent.dhcp.agent [None req-693fa230-8a3b-4cc0-bf1a-d1ec6c5f9eaf - - - - - -] DHCP configuration for ports {'034bd422-60dc-4b3b-ba25-f380892caff4'} is completed
Feb 20 09:50:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:54.741 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 09:50:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.740 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.741 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.844 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.845 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.845 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:50:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:55.846 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:50:56 np0005625204.localdomain dnsmasq[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/addn_hosts - 0 addresses
Feb 20 09:50:56 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/host
Feb 20 09:50:56 np0005625204.localdomain dnsmasq-dhcp[307014]: read /var/lib/neutron/dhcp/f5fde837-7459-455b-ad78-388d579e00e0/opts
Feb 20 09:50:56 np0005625204.localdomain podman[307317]: 2026-02-20 09:50:56.095333726 +0000 UTC m=+0.059251181 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 09:50:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:56Z|00078|binding|INFO|Releasing lport 31575aad-f0ba-4bf3-a41b-1370b560a95e from this chassis (sb_readonly=0)
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:56 np0005625204.localdomain kernel: device tap31575aad-f0 left promiscuous mode
Feb 20 09:50:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:56Z|00079|binding|INFO|Setting lport 31575aad-f0ba-4bf3-a41b-1370b560a95e down in Southbound
Feb 20 09:50:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:56.251 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f5fde837-7459-455b-ad78-388d579e00e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5a71c85068454599bd460f60dda32410', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=019bf044-9ed4-4b4f-84fd-b8df4eb6e270, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=31575aad-f0ba-4bf3-a41b-1370b560a95e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:50:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:56.254 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 31575aad-f0ba-4bf3-a41b-1370b560a95e in datapath f5fde837-7459-455b-ad78-388d579e00e0 unbound from our chassis
Feb 20 09:50:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:56.258 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f5fde837-7459-455b-ad78-388d579e00e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.258 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:50:56.259 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd4409b-308a-4baf-bb7f-b1aa7c08f6cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.261 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:56.522 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:50:53Z, description=, device_id=0c77eb17-66e6-4aa0-8b78-169b259339e9, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a629d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a62370>], id=034bd422-60dc-4b3b-ba25-f380892caff4, ip_allocation=immediate, mac_address=fa:16:3e:14:5b:51, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:49Z, description=, dns_domain=, id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-982155183-network, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=331, status=ACTIVE, subnets=['c9423f67-342b-44f2-ac81-92ef706f7aa6'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:50Z, vlan_transparent=None, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=False, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=372, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:53Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:50:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:50:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:50:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:50:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:50:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:50:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:50:56 np0005625204.localdomain ceph-mon[301857]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:56 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 1 addresses
Feb 20 09:50:56 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:50:56 np0005625204.localdomain podman[307355]: 2026-02-20 09:50:56.813942002 +0000 UTC m=+0.062269170 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:50:56 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.838 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.921 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.941 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.941 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.942 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:56.943 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:57.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:57.118 264355 INFO neutron.agent.dhcp.agent [None req-4e874753-113f-4a89-9768-950317eae4fb - - - - - -] DHCP configuration for ports {'034bd422-60dc-4b3b-ba25-f380892caff4'} is completed
Feb 20 09:50:57 np0005625204.localdomain sshd[307375]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:50:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2966131561' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3661911017' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:57 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:50:57Z|00080|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:50:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:58.005 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:50:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:50:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:50:58 np0005625204.localdomain podman[307378]: 2026-02-20 09:50:58.151378089 +0000 UTC m=+0.086078482 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:50:58 np0005625204.localdomain podman[307378]: 2026-02-20 09:50:58.197098543 +0000 UTC m=+0.131798986 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 09:50:58 np0005625204.localdomain podman[307377]: 2026-02-20 09:50:58.126843026 +0000 UTC m=+0.068167227 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:50:58 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:50:58 np0005625204.localdomain podman[307377]: 2026-02-20 09:50:58.278311799 +0000 UTC m=+0.219635990 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 09:50:58 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:50:58 np0005625204.localdomain dnsmasq[307014]: exiting on receipt of SIGTERM
Feb 20 09:50:58 np0005625204.localdomain podman[307434]: 2026-02-20 09:50:58.474759404 +0000 UTC m=+0.055206049 container kill cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:50:58 np0005625204.localdomain systemd[1]: libpod-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099.scope: Deactivated successfully.
Feb 20 09:50:58 np0005625204.localdomain podman[307449]: 2026-02-20 09:50:58.542564149 +0000 UTC m=+0.057121276 container died cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:50:58 np0005625204.localdomain podman[307449]: 2026-02-20 09:50:58.576168252 +0000 UTC m=+0.090725329 container cleanup cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:50:58 np0005625204.localdomain systemd[1]: libpod-conmon-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099.scope: Deactivated successfully.
Feb 20 09:50:58 np0005625204.localdomain sshd[307375]: Invalid user mailuser from 103.191.14.210 port 47012
Feb 20 09:50:58 np0005625204.localdomain podman[307455]: 2026-02-20 09:50:58.620339141 +0000 UTC m=+0.124721315 container remove cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-f5fde837-7459-455b-ad78-388d579e00e0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:50:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:58.651 264355 INFO neutron.agent.dhcp.agent [None req-f92b337a-9f33-4906-876a-909ecea07fd9 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:50:58 np0005625204.localdomain ceph-mon[301857]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:50:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/306112953' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:50:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:50:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:50:58.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:50:58 np0005625204.localdomain sshd[307375]: Received disconnect from 103.191.14.210 port 47012:11: Bye Bye [preauth]
Feb 20 09:50:58 np0005625204.localdomain sshd[307375]: Disconnected from invalid user mailuser 103.191.14.210 port 47012 [preauth]
Feb 20 09:50:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:50:58.925 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:50:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-64f6be432043f77aaa1f4774174218e0ed969c67419b29f51e1d1f5ed1490642-merged.mount: Deactivated successfully.
Feb 20 09:50:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb735638ae55338806ecb03746e881673d608edbd27bd26a94e926b580d4d099-userdata-shm.mount: Deactivated successfully.
Feb 20 09:50:59 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2df5fde837\x2d7459\x2d455b\x2dad78\x2d388d579e00e0.mount: Deactivated successfully.
Feb 20 09:50:59 np0005625204.localdomain sshd[307479]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:51:00 np0005625204.localdomain sshd[307479]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:51:00 np0005625204.localdomain ceph-mon[301857]: pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 85 B/s wr, 6 op/s
Feb 20 09:51:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:01.496 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4020676950' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:02.080 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:02 np0005625204.localdomain ceph-mon[301857]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 85 B/s wr, 6 op/s
Feb 20 09:51:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1455703923' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2106398093' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:51:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2106398093' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:51:03 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:03.135 264355 INFO neutron.agent.linux.ip_lib [None req-a4c37026-8e84-4a7a-b634-a4c52fe77266 - - - - - -] Device tape1376599-c9 cannot be used as it has no MAC address
Feb 20 09:51:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:03.155 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:03.201 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:03 np0005625204.localdomain kernel: device tape1376599-c9 entered promiscuous mode
Feb 20 09:51:03 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581063.2121] manager: (tape1376599-c9): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 20 09:51:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:03Z|00081|binding|INFO|Claiming lport e1376599-c9f0-4546-a6b8-9a26e1215192 for this chassis.
Feb 20 09:51:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:03Z|00082|binding|INFO|e1376599-c9f0-4546-a6b8-9a26e1215192: Claiming unknown
Feb 20 09:51:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:03.214 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:03 np0005625204.localdomain systemd-udevd[307491]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:51:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:03.225 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ce7589beebc4b9187ac7a68f3264776', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b998e545-1dcc-4262-8de8-c6bf3daefa6e, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e1376599-c9f0-4546-a6b8-9a26e1215192) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:03.227 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e1376599-c9f0-4546-a6b8-9a26e1215192 in datapath c461e2c0-bc21-4786-8276-a80f7d59d18a bound to our chassis
Feb 20 09:51:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:03.229 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c461e2c0-bc21-4786-8276-a80f7d59d18a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:51:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:03.230 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[39f30aeb-f288-4c4b-b553-bc5304f5da49]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:03Z|00083|binding|INFO|Setting lport e1376599-c9f0-4546-a6b8-9a26e1215192 ovn-installed in OVS
Feb 20 09:51:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:03Z|00084|binding|INFO|Setting lport e1376599-c9f0-4546-a6b8-9a26e1215192 up in Southbound
Feb 20 09:51:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:03.255 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape1376599-c9: No such device
Feb 20 09:51:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:03.288 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:03.317 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:04 np0005625204.localdomain podman[307563]: 
Feb 20 09:51:04 np0005625204.localdomain podman[307563]: 2026-02-20 09:51:04.107676781 +0000 UTC m=+0.085201236 container create 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:04 np0005625204.localdomain systemd[1]: Started libpod-conmon-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3.scope.
Feb 20 09:51:04 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:51:04 np0005625204.localdomain podman[307563]: 2026-02-20 09:51:04.06749515 +0000 UTC m=+0.045019625 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:51:04 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/678080229dd1159fab2cef2bc14cfbf02bb404410bb244a2f4e58b96ed1ce8f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:51:04 np0005625204.localdomain podman[307563]: 2026-02-20 09:51:04.177214977 +0000 UTC m=+0.154739432 container init 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true)
Feb 20 09:51:04 np0005625204.localdomain podman[307563]: 2026-02-20 09:51:04.185798894 +0000 UTC m=+0.163323349 container start 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:51:04 np0005625204.localdomain dnsmasq[307581]: started, version 2.85 cachesize 150
Feb 20 09:51:04 np0005625204.localdomain dnsmasq[307581]: DNS service limited to local subnets
Feb 20 09:51:04 np0005625204.localdomain dnsmasq[307581]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:51:04 np0005625204.localdomain dnsmasq[307581]: warning: no upstream servers configured
Feb 20 09:51:04 np0005625204.localdomain dnsmasq-dhcp[307581]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:51:04 np0005625204.localdomain dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 0 addresses
Feb 20 09:51:04 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host
Feb 20 09:51:04 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts
Feb 20 09:51:04 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:04.309 264355 INFO neutron.agent.dhcp.agent [None req-6fbaa961-c7b3-4807-9c99-6827d93e37ef - - - - - -] DHCP configuration for ports {'5dfd33d6-db01-479a-af8d-bbb800d50548'} is completed
Feb 20 09:51:04 np0005625204.localdomain ceph-mon[301857]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 711 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 85 B/s wr, 6 op/s
Feb 20 09:51:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:05.439 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:05.978 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:06.014 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:06.015 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:06 np0005625204.localdomain ceph-mon[301857]: pgmap v86: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.1 MiB/s rd, 1.8 MiB/s wr, 89 op/s
Feb 20 09:51:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:51:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:07.130 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:07 np0005625204.localdomain podman[307582]: 2026-02-20 09:51:07.190181873 +0000 UTC m=+0.128410516 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 20 09:51:07 np0005625204.localdomain podman[307582]: 2026-02-20 09:51:07.225987542 +0000 UTC m=+0.164216155 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:51:07 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:51:07 np0005625204.localdomain sshd[307601]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:51:08 np0005625204.localdomain ceph-mon[301857]: pgmap v87: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.1 MiB/s rd, 1.8 MiB/s wr, 89 op/s
Feb 20 09:51:08 np0005625204.localdomain sshd[307601]: Invalid user gpadmin from 203.228.30.198 port 44752
Feb 20 09:51:08 np0005625204.localdomain sshd[307601]: Received disconnect from 203.228.30.198 port 44752:11: Bye Bye [preauth]
Feb 20 09:51:08 np0005625204.localdomain sshd[307601]: Disconnected from invalid user gpadmin 203.228.30.198 port 44752 [preauth]
Feb 20 09:51:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:09.930 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:10.350 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:10.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:10 np0005625204.localdomain ceph-mon[301857]: pgmap v88: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 107 op/s
Feb 20 09:51:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:11.694 2 INFO neutron.agent.securitygroups_rpc [None req-12bd9327-2dd3-43c4-b987-ac4cbf3c449a 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']
Feb 20 09:51:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:12.161 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:12.198 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:11Z, description=, device_id=11869463-2b1a-4016-a65b-70d38a714c73, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59deaf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59ded60>], id=8799e7fd-5c02-4d3b-a4f3-6cd59f823eec, ip_allocation=immediate, mac_address=fa:16:3e:06:57:ef, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:00Z, description=, dns_domain=, id=c461e2c0-bc21-4786-8276-a80f7d59d18a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-776197744-network, port_security_enabled=True, project_id=5ce7589beebc4b9187ac7a68f3264776, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31945, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=445, status=ACTIVE, subnets=['d96daeed-2eb7-4135-afcb-67a1e510422b'], tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:02Z, vlan_transparent=None, network_id=c461e2c0-bc21-4786-8276-a80f7d59d18a, port_security_enabled=False, project_id=5ce7589beebc4b9187ac7a68f3264776, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=505, status=DOWN, tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:11Z on network c461e2c0-bc21-4786-8276-a80f7d59d18a
Feb 20 09:51:12 np0005625204.localdomain dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 1 addresses
Feb 20 09:51:12 np0005625204.localdomain podman[307620]: 2026-02-20 09:51:12.433441185 +0000 UTC m=+0.063390384 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:51:12 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host
Feb 20 09:51:12 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts
Feb 20 09:51:12 np0005625204.localdomain ceph-mon[301857]: pgmap v89: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Feb 20 09:51:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:12.683 264355 INFO neutron.agent.dhcp.agent [None req-3b0739f5-8c16-4d34-9227-1e43e182fb2e - - - - - -] DHCP configuration for ports {'8799e7fd-5c02-4d3b-a4f3-6cd59f823eec'} is completed
Feb 20 09:51:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:13.679 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:11Z, description=, device_id=11869463-2b1a-4016-a65b-70d38a714c73, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a11550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a115b0>], id=8799e7fd-5c02-4d3b-a4f3-6cd59f823eec, ip_allocation=immediate, mac_address=fa:16:3e:06:57:ef, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:00Z, description=, dns_domain=, id=c461e2c0-bc21-4786-8276-a80f7d59d18a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesNegativeTestJSON-776197744-network, port_security_enabled=True, project_id=5ce7589beebc4b9187ac7a68f3264776, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31945, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=445, status=ACTIVE, subnets=['d96daeed-2eb7-4135-afcb-67a1e510422b'], tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:02Z, vlan_transparent=None, network_id=c461e2c0-bc21-4786-8276-a80f7d59d18a, port_security_enabled=False, project_id=5ce7589beebc4b9187ac7a68f3264776, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=505, status=DOWN, tags=[], tenant_id=5ce7589beebc4b9187ac7a68f3264776, updated_at=2026-02-20T09:51:11Z on network c461e2c0-bc21-4786-8276-a80f7d59d18a
Feb 20 09:51:13 np0005625204.localdomain systemd[1]: tmp-crun.j09zt6.mount: Deactivated successfully.
Feb 20 09:51:13 np0005625204.localdomain podman[307658]: 2026-02-20 09:51:13.912250301 +0000 UTC m=+0.080443153 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:51:13 np0005625204.localdomain dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 1 addresses
Feb 20 09:51:13 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host
Feb 20 09:51:13 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts
Feb 20 09:51:14 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:14.136 264355 INFO neutron.agent.dhcp.agent [None req-3356d537-3256-4a3d-b5eb-608060d82911 - - - - - -] DHCP configuration for ports {'8799e7fd-5c02-4d3b-a4f3-6cd59f823eec'} is completed
Feb 20 09:51:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:14.219 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:14 np0005625204.localdomain ceph-mon[301857]: pgmap v90: 177 pgs: 177 active+clean; 192 MiB data, 775 MiB used, 41 GiB / 42 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 100 op/s
Feb 20 09:51:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e94 e94: 6 total, 6 up, 6 in
Feb 20 09:51:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:15.642 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:15 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:15Z|00085|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:51:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:15.918 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:51:16 np0005625204.localdomain systemd[1]: tmp-crun.7OWOPN.mount: Deactivated successfully.
Feb 20 09:51:16 np0005625204.localdomain podman[307680]: 2026-02-20 09:51:16.152603769 +0000 UTC m=+0.089238756 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:51:16 np0005625204.localdomain podman[307680]: 2026-02-20 09:51:16.166076441 +0000 UTC m=+0.102711428 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:51:16 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:51:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:16.259 2 INFO neutron.agent.securitygroups_rpc [None req-dd3e0c14-3c22-4790-87f5-ba03a5ef1aea ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']
Feb 20 09:51:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:16.379 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:15Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d9850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d9970>], id=609a0699-8716-4bf8-9f50-bfeec5f65721, ip_allocation=immediate, mac_address=fa:16:3e:c0:a3:f9, name=tempest-parent-420346976, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:50:49Z, description=, dns_domain=, id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-LiveMigrationTest-982155183-network, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=59450, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=331, status=ACTIVE, subnets=['c9423f67-342b-44f2-ac81-92ef706f7aa6'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:50:50Z, vlan_transparent=None, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=521, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:51:15Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:51:16 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 2 addresses
Feb 20 09:51:16 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:51:16 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:51:16 np0005625204.localdomain podman[307722]: 2026-02-20 09:51:16.576584009 +0000 UTC m=+0.048712216 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:51:16 np0005625204.localdomain ceph-mon[301857]: pgmap v91: 177 pgs: 177 active+clean; 217 MiB data, 865 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 3.8 MiB/s wr, 149 op/s
Feb 20 09:51:16 np0005625204.localdomain ceph-mon[301857]: osdmap e94: 6 total, 6 up, 6 in
Feb 20 09:51:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:16.798 264355 INFO neutron.agent.dhcp.agent [None req-545fe686-f9ee-4026-ba90-55b31207d3e1 - - - - - -] DHCP configuration for ports {'609a0699-8716-4bf8-9f50-bfeec5f65721'} is completed
Feb 20 09:51:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:17.199 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:17 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:17.441 2 INFO neutron.agent.securitygroups_rpc [None req-c36d1673-2dec-447b-a8b3-50030e0a0823 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']
Feb 20 09:51:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:51:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:51:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:51:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 20 09:51:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:51:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19264 "" "Go-http-client/1.1"
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.208 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.238 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.239 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '256dada4-5f98-4b1c-9400-21beaabff7e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.209598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b418f33e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '87946d9f0f26cb5d65c0c4b737223a80bd5c528df60490d05314641979eb4219'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.209598', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b41908ce-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '8341a97d95e34528afc5a026d6388e13c5364d3034fa81efd639aced69e866d1'}]}, 'timestamp': '2026-02-20 09:51:18.240090', '_unique_id': '709bf9dd951947b1b897c24e6bd4f9fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.241 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.242 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.243 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.246 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f7df49b-c707-4074-a173-1df2e06ede56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.243183', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b41a13e0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '411506a4f53bf167bda6d25bcf2598a14864a452893bdad871bc264e4d040150'}]}, 'timestamp': '2026-02-20 09:51:18.246950', '_unique_id': '735add1382994b9eb670cb71ed4bbe32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.247 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.249 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e08b957-6eca-478d-af56-64d777e8acc8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.249741', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b41a9914-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '5cb09b3c4aae2f15e9c0fc07b4837ad6d81b30da9b6fee494a191ece0e11a855'}]}, 'timestamp': '2026-02-20 09:51:18.250356', '_unique_id': 'ea3b9d2d97094c8d8837df79453bd1b3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.252 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98f0fd24-8f7a-4367-ab7f-dc355dfc3849', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.252996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b41b154c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '629e655059f19c753c2f4495ab8c07767f87b7f7be3c75e8643b77155d6e05e9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.252996', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b41b299c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '588558e4cc07b0649db69bf9a815c7471a05b0215f47e6577271526bb8c07de8'}]}, 'timestamp': '2026-02-20 09:51:18.254052', '_unique_id': '342eb19806df4fae9e0b834665495f87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.254 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.256 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '45ef484a-765f-45e9-b7bd-0ea7319e0702', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.256757', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b41ba8b8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '9cda7638a540ca035193aec621f25f5005c7c4af0ad2e8da6a0b93ff483fb00e'}]}, 'timestamp': '2026-02-20 09:51:18.257304', '_unique_id': 'fb079755193c4a3592d3b1b043bfa425'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.258 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.259 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d0275d9-b710-40de-b8c7-b80623e4e1d6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:51:18.260146', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'b41eaf5e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.515627067, 'message_signature': '979f2facea36fa7c892bdbdd621b230d0462f72a2f1cc6dcf24f69a4f6c46717'}]}, 'timestamp': '2026-02-20 09:51:18.277129', '_unique_id': '4b98327a71164f61b31753912dbcb57a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.341 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5df4177-c90f-472d-9e16-b59a373c0441', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.332167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b4288524-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '149a6cce4857b69294c88c71d6b7a069ff78e47e19de73f9ce12246c27c5b055'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.332167', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b4289780-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '93e271380e82c4e0e0cd5e476e3d94ebffcfb5766ba0cdb822ab1a5f25713dec'}]}, 'timestamp': '2026-02-20 09:51:18.342059', '_unique_id': 'eb3d3cc30efc40e3aa530297c67f73c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.343 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.344 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.344 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 16410000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c30287a-212a-4a0b-b194-df603d9396a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 16410000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:51:18.344552', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'b4290c60-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.515627067, 'message_signature': '27fd54a1e9bcbd59d640640ef88c3f251097d0acbcfff78c4b62ba6890ec6b78'}]}, 'timestamp': '2026-02-20 09:51:18.345038', '_unique_id': '26d83cdf9ffd4cee9be8d993070062a0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.345 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.347 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.347 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '200419c1-83b9-46ce-93c3-7894d4eb8b37', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.347145', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b429701a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '18146b90d526b6cc6a5c16891abd7bfe2d658b0724896dfae9f1dea2cb840851'}]}, 'timestamp': '2026-02-20 09:51:18.347604', '_unique_id': '17c933aa86ff4527a7a5baee69a72a20'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.348 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.349 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.349 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5a34f86-a4f4-4d4d-9d36-94b3eb236daa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.349691', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b429d3c0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '52f942a50a692f7f2336547fb226c1a144e860e69101062c88d59db6461756f7'}]}, 'timestamp': '2026-02-20 09:51:18.350180', '_unique_id': '266c9dbb83c149a081130aafa9affa2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.351 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.352 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dec1583d-7c13-473a-8eef-2ac59e575bdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.352264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42a37c0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '701219493d81052b50baf13165e857beec324ff49f62e28964ac7eeea2d44582'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.352264', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42a490e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '73ba829f44fce98045a0543ac11c103666096e928f15ebe5eb9c17a23344164d'}]}, 'timestamp': '2026-02-20 09:51:18.353127', '_unique_id': 'c8020be45a7944dca0407ff8867de070'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.354 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.355 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b88c5e6c-1f89-4546-8a09-bfea5aae7dc9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.355313', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42aaef8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '07b157154b622adae74ea027a3ac25fea2110251864ebe325356fdde498f2e89'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.355313', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42ac032-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '0cc5d6619dcb379128b3369089d4479c44461a4812e9d5624571d71c9f041e0f'}]}, 'timestamp': '2026-02-20 09:51:18.356174', '_unique_id': 'e4e63f354a68473690cc47ed0de8ca3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.357 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.358 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.358 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.358 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5aa818c-2f4e-4e79-99ff-d4d91d7c3ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.358293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42b2414-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'dc6d14a5f518c2236f321fc53138388c3afd581857db2c5869d0e446191a7e72'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.358293', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42b36a2-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '6a246f6b79746c8977d21cad23cd8bdd13b9d13b43ee77ca2a52eb9a3a480a5b'}]}, 'timestamp': '2026-02-20 09:51:18.359209', '_unique_id': '18ac07b2f31b4efabd2f93429b182269'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.360 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5156e247-11b7-4704-93f7-ba16cb7584ab', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.361339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42b9a48-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': '95a35c9a3a9e73f35a765f1d147499a0ee7ea505c89af3993be03382bd66c1a3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.361339', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42bab78-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'd335b71cd81d9f584263c4ebff0e3ba8827c511c4f443ed160a6c151ecf98b4f'}]}, 'timestamp': '2026-02-20 09:51:18.362232', '_unique_id': 'ca0f32f813a747b7adec95f56e406e7c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.363 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f958db2c-bded-426a-92be-e6dde503f416', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.364407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42c1266-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': '36057ef342ab2647ed3f61d8517ef47b0dbc90daee12ad75819303c64f28288c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.364407', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42c23f0-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.571369821, 'message_signature': 'e38e3ae862e273e4a47620ac9552068999574ee7bbd113212d366ce0074b55d0'}]}, 'timestamp': '2026-02-20 09:51:18.365284', '_unique_id': '070e0a75ffb74c68a2c1c138bc7b80fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.366 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.367 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.367 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a74aa16a-ca3d-40a8-8e3f-6cfe83fa1aed', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.367162', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42c7af8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': 'f8cc19db6b907fd5592c1a257e76a93eafe4d9902408d810d41347f7912140ba'}]}, 'timestamp': '2026-02-20 09:51:18.367455', '_unique_id': '18da39b7229a4d4da949c11962118826'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.368 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd667267-2e09-4a23-8801-6ee78c81a17a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.368824', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42cbbbc-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '969c741fad99bfa85917933a41da62a85c31ff7fd92a89f8136dbf527f374d72'}]}, 'timestamp': '2026-02-20 09:51:18.369112', '_unique_id': '63e5d7964b2346fb84beab66f97afeca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.370 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b0c1b8f-6dca-42ff-b9f3-e2a331c5854c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.370408', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42cf974-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': '91423644838db6c51b5df22716d8a3618b5e0351f410aa9aabc25da0b7b1f995'}]}, 'timestamp': '2026-02-20 09:51:18.370711', '_unique_id': 'c8b519a4f3204b08b8fbb0a635f618b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.371 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ff8c79e-095b-4d1c-abf5-fc227cbc37e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.372061', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42d39e8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': 'dd778da10f5fd1fe1c1954adf425aea1a36f66b86c7f6f1bb1285315e0c6f6f2'}]}, 'timestamp': '2026-02-20 09:51:18.372341', '_unique_id': 'f56e96d1d6004ca8b9fca226d7a1d1e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.372 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.373 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cfd08a11-8fbe-43f1-9824-4287a61a5628', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:51:18.373594', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'b42d76c4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.482391755, 'message_signature': 'ef4fa13c92ca2f6552e3ac8291cdcc3233c5d022fe531389ce0f8369853d3cc3'}]}, 'timestamp': '2026-02-20 09:51:18.373899', '_unique_id': '80c32f01b81946aea392765742c01c23'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.374 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.375 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.375 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.375 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b1cb260-fa20-4fe4-b01c-fc24c3a5014e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:51:18.375161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'b42db2ec-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'a8a6942dba88642a7fc121932b879ef8b3295b14d1114ed4286408256de38eee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:51:18.375161', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'b42dbcba-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11637.448848594, 'message_signature': 'db3547d5673f97c7f1b7d1d03b54be64b80d99a331865769fd827469a17d4b09'}]}, 'timestamp': '2026-02-20 09:51:18.375695', '_unique_id': 'fa8ca3e1ca774a888a9b2c2f3de747bb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:51:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:51:18.376 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:51:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e95 e95: 6 total, 6 up, 6 in
Feb 20 09:51:18 np0005625204.localdomain ceph-mon[301857]: pgmap v93: 177 pgs: 177 active+clean; 217 MiB data, 865 MiB used, 41 GiB / 42 GiB avail; 1.0 MiB/s rd, 2.4 MiB/s wr, 80 op/s
Feb 20 09:51:18 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:18.766 2 INFO neutron.agent.securitygroups_rpc [req-3c77ea9c-030b-4c3f-a6b2-e9f761f0d591 req-ae9f50d3-4bb2-48d4-a279-bccc17ebbc38 19c6a0af0d664b5d92fdce6a6ecdbcc4 5ce7589beebc4b9187ac7a68f3264776 - - default default] Security group rule updated ['ddf49fd2-9d36-4d8c-9b90-f70fbafa6560']
Feb 20 09:51:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:19.300 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:19.360 2 INFO neutron.agent.securitygroups_rpc [req-cf23cb9e-603b-4426-8e72-b88eccda31be req-4c57a8df-f22b-4186-8db6-2e7fa9aa1e7d 19c6a0af0d664b5d92fdce6a6ecdbcc4 5ce7589beebc4b9187ac7a68f3264776 - - default default] Security group rule updated ['ddf49fd2-9d36-4d8c-9b90-f70fbafa6560']
Feb 20 09:51:19 np0005625204.localdomain ceph-mon[301857]: osdmap e95: 6 total, 6 up, 6 in
Feb 20 09:51:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:19.903 2 INFO neutron.agent.securitygroups_rpc [None req-b04d749d-19a2-4f89-bafc-552dc6778fc9 ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']
Feb 20 09:51:20 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:20.421 264355 INFO neutron.agent.linux.ip_lib [None req-d50820b1-4b96-4482-8fee-3816155b48ac - - - - - -] Device tap22bf7523-8a cannot be used as it has no MAC address
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain kernel: device tap22bf7523-8a entered promiscuous mode
Feb 20 09:51:20 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581080.4495] manager: (tap22bf7523-8a): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Feb 20 09:51:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:20Z|00086|binding|INFO|Claiming lport 22bf7523-8a19-46b0-a0b7-53070ea1823e for this chassis.
Feb 20 09:51:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:20Z|00087|binding|INFO|22bf7523-8a19-46b0-a0b7-53070ea1823e: Claiming unknown
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.452 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain systemd-udevd[307753]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.459 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:20.468 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=22bf7523-8a19-46b0-a0b7-53070ea1823e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:20.470 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 22bf7523-8a19-46b0-a0b7-53070ea1823e in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc bound to our chassis
Feb 20 09:51:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:20.474 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc126e7a-67b5-4025-9da6-7c8301672033 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:51:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:20.474 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9021dc49-7e01-42e7-8f32-572dec89afcc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:20.475 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[96bec316-613d-416d-8fc8-cdcd0a3f32b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:20Z|00088|binding|INFO|Setting lport 22bf7523-8a19-46b0-a0b7-53070ea1823e ovn-installed in OVS
Feb 20 09:51:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:20Z|00089|binding|INFO|Setting lport 22bf7523-8a19-46b0-a0b7-53070ea1823e up in Southbound
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.491 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.514 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap22bf7523-8a: No such device
Feb 20 09:51:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:20.539 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:20 np0005625204.localdomain ceph-mon[301857]: pgmap v95: 177 pgs: 177 active+clean; 236 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 5.3 MiB/s rd, 4.1 MiB/s wr, 168 op/s
Feb 20 09:51:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e96 e96: 6 total, 6 up, 6 in
Feb 20 09:51:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e96 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:21 np0005625204.localdomain podman[307825]: 
Feb 20 09:51:21 np0005625204.localdomain podman[307825]: 2026-02-20 09:51:21.450463321 +0000 UTC m=+0.082415821 container create d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:51:21 np0005625204.localdomain systemd[1]: Started libpod-conmon-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635.scope.
Feb 20 09:51:21 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:51:21 np0005625204.localdomain podman[307825]: 2026-02-20 09:51:21.411924351 +0000 UTC m=+0.043876861 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:51:21 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/faa9e2373e9edef47d940be28280fa18e93a1d836a7561ac2b42ed8a739e240e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:51:21 np0005625204.localdomain podman[307825]: 2026-02-20 09:51:21.523756991 +0000 UTC m=+0.155709481 container init d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 20 09:51:21 np0005625204.localdomain podman[307825]: 2026-02-20 09:51:21.531691017 +0000 UTC m=+0.163643517 container start d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:51:21 np0005625204.localdomain dnsmasq[307844]: started, version 2.85 cachesize 150
Feb 20 09:51:21 np0005625204.localdomain dnsmasq[307844]: DNS service limited to local subnets
Feb 20 09:51:21 np0005625204.localdomain dnsmasq[307844]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:51:21 np0005625204.localdomain dnsmasq[307844]: warning: no upstream servers configured
Feb 20 09:51:21 np0005625204.localdomain dnsmasq-dhcp[307844]: DHCP, static leases only on 19.80.0.0, lease time 1d
Feb 20 09:51:21 np0005625204.localdomain dnsmasq[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/addn_hosts - 0 addresses
Feb 20 09:51:21 np0005625204.localdomain dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/host
Feb 20 09:51:21 np0005625204.localdomain dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/opts
Feb 20 09:51:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:21.595 264355 INFO neutron.agent.dhcp.agent [None req-6470ccaf-2d7c-4ace-b3df-cc75451ce300 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:19Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a62ac0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a62700>], id=ce4822a0-5e7a-4c40-9856-6c8879a12ac7, ip_allocation=immediate, mac_address=fa:16:3e:ef:22:88, name=tempest-subport-288633192, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:17Z, description=, dns_domain=, id=9021dc49-7e01-42e7-8f32-572dec89afcc, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-1209378868, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36362, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=526, status=ACTIVE, subnets=['dd9ea435-5cb2-4df9-b036-81064a982eb1'], tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:51:18Z, vlan_transparent=None, network_id=9021dc49-7e01-42e7-8f32-572dec89afcc, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=536, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, updated_at=2026-02-20T09:51:19Z on network 9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 09:51:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:21.689 264355 INFO neutron.agent.dhcp.agent [None req-014ad46e-8117-4079-a33a-14af686b614c - - - - - -] DHCP configuration for ports {'8069ffae-e153-4a3e-ac83-1cd290da58a3'} is completed
Feb 20 09:51:21 np0005625204.localdomain ceph-mon[301857]: osdmap e96: 6 total, 6 up, 6 in
Feb 20 09:51:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e97 e97: 6 total, 6 up, 6 in
Feb 20 09:51:21 np0005625204.localdomain dnsmasq[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/addn_hosts - 1 addresses
Feb 20 09:51:21 np0005625204.localdomain dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/host
Feb 20 09:51:21 np0005625204.localdomain dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/opts
Feb 20 09:51:21 np0005625204.localdomain podman[307860]: 2026-02-20 09:51:21.834247991 +0000 UTC m=+0.060672502 container kill d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:51:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:51:22 np0005625204.localdomain podman[307882]: 2026-02-20 09:51:22.143441644 +0000 UTC m=+0.079967349 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:51:22 np0005625204.localdomain podman[307882]: 2026-02-20 09:51:22.17712617 +0000 UTC m=+0.113651865 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:51:22 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:51:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:22.234 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:22 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:22.550 264355 INFO neutron.agent.dhcp.agent [None req-0d61c553-c9e7-4395-903b-ef9b589e58d8 - - - - - -] DHCP configuration for ports {'ce4822a0-5e7a-4c40-9856-6c8879a12ac7'} is completed
Feb 20 09:51:22 np0005625204.localdomain ceph-mon[301857]: pgmap v97: 177 pgs: 177 active+clean; 236 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 6.5 MiB/s rd, 1.4 MiB/s wr, 126 op/s
Feb 20 09:51:22 np0005625204.localdomain ceph-mon[301857]: osdmap e97: 6 total, 6 up, 6 in
Feb 20 09:51:22 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3950457057' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:22.905 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:22.906 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:22.934 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 20 09:51:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.137 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.138 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.143 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.144 281292 INFO nova.compute.claims [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Claim successful on node np0005625204.localdomain
Feb 20 09:51:23 np0005625204.localdomain podman[307905]: 2026-02-20 09:51:23.161816862 +0000 UTC m=+0.093304066 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, version=9.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:51:23 np0005625204.localdomain podman[307905]: 2026-02-20 09:51:23.208438805 +0000 UTC m=+0.139925989 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, release=1770267347, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-02-05T04:57:10Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9/ubi-minimal)
Feb 20 09:51:23 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.465 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.486 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.487 281292 DEBUG nova.compute.provider_tree [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.549 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.586 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:51:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:23.647 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:24 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:51:24 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2724867462' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.122 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.129 281292 DEBUG nova.compute.provider_tree [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.149 281292 DEBUG nova.scheduler.client.report [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.184 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.046s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.185 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.237 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.238 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.257 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.282 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.399 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.400 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.401 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Creating image(s)
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.438 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.475 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.516 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.521 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "3692da63af034f7d594aac7c4b8eda10742f09b0" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.522 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "3692da63af034f7d594aac7c4b8eda10742f09b0" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.544 281292 WARNING oslo_policy.policy [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.544 281292 WARNING oslo_policy.policy [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.546 281292 DEBUG nova.policy [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ba15d0e9919d4594a2e6e9d6b3414a5e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 20 09:51:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:24.604 281292 DEBUG nova.virt.libvirt.imagebackend [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Image locations are: [{'url': 'rbd://a8557ee9-b55d-5519-942c-cf8f6172f1d8/images/06bd71fd-c415-45d9-b669-46209b7ca2f4/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://a8557ee9-b55d-5519-942c-cf8f6172f1d8/images/06bd71fd-c415-45d9-b669-46209b7ca2f4/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Feb 20 09:51:24 np0005625204.localdomain ceph-mon[301857]: pgmap v99: 177 pgs: 177 active+clean; 236 MiB data, 872 MiB used, 41 GiB / 42 GiB avail; 6.5 MiB/s rd, 1.4 MiB/s wr, 126 op/s
Feb 20 09:51:24 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3777396642' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:24 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2724867462' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:24Z|00090|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.090 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:25 np0005625204.localdomain systemd[1]: tmp-crun.LHJAEf.mount: Deactivated successfully.
Feb 20 09:51:25 np0005625204.localdomain dnsmasq[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/addn_hosts - 0 addresses
Feb 20 09:51:25 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/host
Feb 20 09:51:25 np0005625204.localdomain dnsmasq-dhcp[307581]: read /var/lib/neutron/dhcp/c461e2c0-bc21-4786-8276-a80f7d59d18a/opts
Feb 20 09:51:25 np0005625204.localdomain podman[308018]: 2026-02-20 09:51:25.357563327 +0000 UTC m=+0.078545736 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:25Z|00091|binding|INFO|Releasing lport e1376599-c9f0-4546-a6b8-9a26e1215192 from this chassis (sb_readonly=0)
Feb 20 09:51:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:25Z|00092|binding|INFO|Setting lport e1376599-c9f0-4546-a6b8-9a26e1215192 down in Southbound
Feb 20 09:51:25 np0005625204.localdomain kernel: device tape1376599-c9 left promiscuous mode
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.549 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:25.552 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c461e2c0-bc21-4786-8276-a80f7d59d18a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5ce7589beebc4b9187ac7a68f3264776', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b998e545-1dcc-4262-8de8-c6bf3daefa6e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e1376599-c9f0-4546-a6b8-9a26e1215192) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.552 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:25.556 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e1376599-c9f0-4546-a6b8-9a26e1215192 in datapath c461e2c0-bc21-4786-8276-a80f7d59d18a unbound from our chassis
Feb 20 09:51:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:25.560 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c461e2c0-bc21-4786-8276-a80f7d59d18a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:25.561 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[04b51bf8-61bb-4b6e-afef-236c35dece96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.638 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.728 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.730 281292 DEBUG nova.virt.images [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] 06bd71fd-c415-45d9-b669-46209b7ca2f4 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.731 281292 DEBUG nova.privsep.utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.732 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e98 e98: 6 total, 6 up, 6 in
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.975 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.part /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted" returned: 0 in 0.243s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:25.980 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:26.053 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0.converted --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:26.055 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "3692da63af034f7d594aac7c4b8eda10742f09b0" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:26.093 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:26.099 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:26 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:26.476 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005625204.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:15Z, description=, device_id=90eb8d1f-8d13-4395-9d15-67fdaa60632d, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d2c10>], dns_domain=, dns_name=tempest-livemigrationtest-server-721665546, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d2ac0>], id=609a0699-8716-4bf8-9f50-bfeec5f65721, ip_allocation=immediate, mac_address=fa:16:3e:c0:a3:f9, name=tempest-parent-420346976, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=521, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d2ca0>], trunk_id=bb723cd7-ac34-46b0-bf66-79c7ed1fe96f, updated_at=2026-02-20T09:51:25Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:51:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:51:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:51:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:51:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:51:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:51:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:51:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:26.704 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/3692da63af034f7d594aac7c4b8eda10742f09b0 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.605s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:26 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 2 addresses
Feb 20 09:51:26 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:51:26 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:51:26 np0005625204.localdomain podman[308111]: 2026-02-20 09:51:26.714447414 +0000 UTC m=+0.066008582 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:51:26 np0005625204.localdomain ceph-mon[301857]: pgmap v100: 177 pgs: 177 active+clean; 224 MiB data, 868 MiB used, 41 GiB / 42 GiB avail; 10 MiB/s rd, 7.1 MiB/s wr, 301 op/s
Feb 20 09:51:26 np0005625204.localdomain ceph-mon[301857]: osdmap e98: 6 total, 6 up, 6 in
Feb 20 09:51:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:26.836 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] resizing rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Feb 20 09:51:26 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:26.923 264355 INFO neutron.agent.dhcp.agent [None req-2127b375-9167-4379-8284-d0a570c4be85 - - - - - -] DHCP configuration for ports {'609a0699-8716-4bf8-9f50-bfeec5f65721'} is completed
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.024 281292 DEBUG nova.objects.instance [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lazy-loading 'migration_context' on Instance uuid 90eb8d1f-8d13-4395-9d15-67fdaa60632d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.038 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.039 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Ensure instance console log exists: /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.039 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.040 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.041 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:27Z|00093|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.236 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.239 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:27 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e99 e99: 6 total, 6 up, 6 in
Feb 20 09:51:27 np0005625204.localdomain dnsmasq[307581]: exiting on receipt of SIGTERM
Feb 20 09:51:27 np0005625204.localdomain podman[308222]: 2026-02-20 09:51:27.583953147 +0000 UTC m=+0.061955261 container kill 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:51:27 np0005625204.localdomain systemd[1]: libpod-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3.scope: Deactivated successfully.
Feb 20 09:51:27 np0005625204.localdomain podman[308236]: 2026-02-20 09:51:27.659586145 +0000 UTC m=+0.059891869 container died 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:51:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-678080229dd1159fab2cef2bc14cfbf02bb404410bb244a2f4e58b96ed1ce8f7-merged.mount: Deactivated successfully.
Feb 20 09:51:27 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3-userdata-shm.mount: Deactivated successfully.
Feb 20 09:51:27 np0005625204.localdomain podman[308236]: 2026-02-20 09:51:27.753645683 +0000 UTC m=+0.153951367 container cleanup 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:51:27 np0005625204.localdomain systemd[1]: libpod-conmon-44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3.scope: Deactivated successfully.
Feb 20 09:51:27 np0005625204.localdomain podman[308238]: 2026-02-20 09:51:27.780793844 +0000 UTC m=+0.172926865 container remove 44361fff54e24a14fb69ef92c0da145ff105d77b1fc23f99715e3848e2b7f3e3 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c461e2c0-bc21-4786-8276-a80f7d59d18a, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.799 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Successfully updated port: 609a0699-8716-4bf8-9f50-bfeec5f65721 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 20 09:51:27 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:27.810 264355 INFO neutron.agent.dhcp.agent [None req-79c30ef8-3ce6-4ac2-99b9-583d7fae01d1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:51:27 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dc461e2c0\x2dbc21\x2d4786\x2d8276\x2da80f7d59d18a.mount: Deactivated successfully.
Feb 20 09:51:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3066824957' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:27 np0005625204.localdomain ceph-mon[301857]: osdmap e99: 6 total, 6 up, 6 in
Feb 20 09:51:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4036231401' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.823 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.824 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquired lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.824 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 20 09:51:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:27.906 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 20 09:51:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:28.039 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.362 281292 DEBUG nova.network.neutron [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating instance_info_cache with network_info: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.398 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Releasing lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.399 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance network_info: |[{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.404 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Start _get_guest_xml network_info=[{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-20T09:49:57Z,direct_url=<?>,disk_format='qcow2',id=06bd71fd-c415-45d9-b669-46209b7ca2f4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='91bce661d685472eb3e7cacab17bf52a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-20T09:49:59Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'size': 0, 'device_name': '/dev/vda', 'device_type': 'disk', 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'guest_format': None, 'encryption_options': None, 'image_id': '06bd71fd-c415-45d9-b669-46209b7ca2f4'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.411 281292 WARNING nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.414 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.415 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.417 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Searching host: 'np0005625204.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.418 281292 DEBUG nova.virt.libvirt.host [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.418 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.419 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-20T09:49:56Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='40a6f41a-8891-4900-942e-688a656af142',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-20T09:49:57Z,direct_url=<?>,disk_format='qcow2',id=06bd71fd-c415-45d9-b669-46209b7ca2f4,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='91bce661d685472eb3e7cacab17bf52a',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-20T09:49:59Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.420 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.420 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.420 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.421 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.421 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.422 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.422 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.423 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.423 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.424 281292 DEBUG nova.virt.hardware [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.429 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.594 281292 DEBUG nova.compute.manager [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.595 281292 DEBUG nova.compute.manager [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing instance network info cache due to event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.596 281292 DEBUG oslo_concurrency.lockutils [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.597 281292 DEBUG oslo_concurrency.lockutils [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquired lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.597 281292 DEBUG nova.network.neutron [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 20 09:51:28 np0005625204.localdomain ceph-mon[301857]: pgmap v102: 177 pgs: 177 active+clean; 224 MiB data, 868 MiB used, 41 GiB / 42 GiB avail; 4.4 MiB/s rd, 5.9 MiB/s wr, 192 op/s
Feb 20 09:51:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3035586964' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:28 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e100 e100: 6 total, 6 up, 6 in
Feb 20 09:51:28 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 20 09:51:28 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3117282067' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:28.937 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.047 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:51:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.064 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:29 np0005625204.localdomain podman[308305]: 2026-02-20 09:51:29.162474941 +0000 UTC m=+0.087445342 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:51:29 np0005625204.localdomain podman[308304]: 2026-02-20 09:51:29.188371045 +0000 UTC m=+0.111530322 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.221 281292 DEBUG nova.network.neutron [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updated VIF entry in instance network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.222 281292 DEBUG nova.network.neutron [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating instance_info_cache with network_info: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.240 281292 DEBUG oslo_concurrency.lockutils [req-f42d659d-970b-4ec5-a559-98f178f1917a req-1f1337d8-131f-43ba-8dc2-26bd4b92433b d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Releasing lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:51:29 np0005625204.localdomain podman[308304]: 2026-02-20 09:51:29.244742068 +0000 UTC m=+0.167901325 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 20 09:51:29 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:51:29 np0005625204.localdomain podman[308305]: 2026-02-20 09:51:29.29804834 +0000 UTC m=+0.223018701 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:51:29 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:51:29 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 20 09:51:29 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3784175978' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.609 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.611 281292 DEBUG nova.virt.libvirt.vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'}
,tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-20T09:51:24Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.612 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.613 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.616 281292 DEBUG nova.objects.instance [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 90eb8d1f-8d13-4395-9d15-67fdaa60632d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.635 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] End _get_guest_xml xml=<domain type="kvm">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <uuid>90eb8d1f-8d13-4395-9d15-67fdaa60632d</uuid>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <name>instance-00000008</name>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <memory>131072</memory>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <vcpu>1</vcpu>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <metadata>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:name>tempest-LiveMigrationTest-server-721665546</nova:name>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:creationTime>2026-02-20 09:51:28</nova:creationTime>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:flavor name="m1.nano">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:memory>128</nova:memory>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:disk>1</nova:disk>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:swap>0</nova:swap>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:ephemeral>0</nova:ephemeral>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:vcpus>1</nova:vcpus>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </nova:flavor>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:owner>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:user uuid="ba15d0e9919d4594a2e6e9d6b3414a5e">tempest-LiveMigrationTest-2108133970-project-member</nova:user>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:project uuid="e704aae5b1ba49d59262f9aa0c366fb4">tempest-LiveMigrationTest-2108133970</nova:project>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </nova:owner>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:root type="image" uuid="06bd71fd-c415-45d9-b669-46209b7ca2f4"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <nova:ports>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <nova:port uuid="609a0699-8716-4bf8-9f50-bfeec5f65721">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         </nova:port>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </nova:ports>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </nova:instance>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </metadata>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <sysinfo type="smbios">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <system>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <entry name="manufacturer">RDO</entry>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <entry name="product">OpenStack Compute</entry>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <entry name="serial">90eb8d1f-8d13-4395-9d15-67fdaa60632d</entry>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <entry name="uuid">90eb8d1f-8d13-4395-9d15-67fdaa60632d</entry>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <entry name="family">Virtual Machine</entry>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </system>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </sysinfo>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <os>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <boot dev="hd"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <smbios mode="sysinfo"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </os>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <features>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <acpi/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <apic/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <vmcoreinfo/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </features>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <clock offset="utc">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <timer name="pit" tickpolicy="delay"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <timer name="hpet" present="no"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </clock>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <cpu mode="host-model" match="exact">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <topology sockets="1" cores="1" threads="1"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </cpu>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   <devices>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <disk type="network" device="disk">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <driver type="raw" cache="none"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <source protocol="rbd" name="vms/90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.103" port="6789"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.104" port="6789"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.105" port="6789"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </source>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <auth username="openstack">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <secret type="ceph" uuid="a8557ee9-b55d-5519-942c-cf8f6172f1d8"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </auth>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <target dev="vda" bus="virtio"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <disk type="network" device="cdrom">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <driver type="raw" cache="none"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <source protocol="rbd" name="vms/90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.103" port="6789"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.104" port="6789"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <host name="172.18.0.105" port="6789"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </source>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <auth username="openstack">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:         <secret type="ceph" uuid="a8557ee9-b55d-5519-942c-cf8f6172f1d8"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       </auth>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <target dev="sda" bus="sata"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </disk>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <interface type="ethernet">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <mac address="fa:16:3e:c0:a3:f9"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <model type="virtio"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <driver name="vhost" rx_queue_size="512"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <mtu size="1442"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <target dev="tap609a0699-87"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </interface>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <serial type="pty">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <log file="/var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/console.log" append="off"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </serial>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <video>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <model type="virtio"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </video>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <input type="tablet" bus="usb"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <rng model="virtio">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <backend model="random">/dev/urandom</backend>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </rng>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="pci" model="pcie-root-port"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <controller type="usb" index="0"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     <memballoon model="virtio">
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:       <stats period="10"/>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:     </memballoon>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:   </devices>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: </domain>
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.637 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Preparing to wait for external event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.637 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.638 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.638 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.640 281292 DEBUG nova.virt.libvirt.vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-20T09:51:24Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.640 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.641 281292 DEBUG nova.network.os_vif_util [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.642 281292 DEBUG os_vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.644 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.645 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.649 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.650 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap609a0699-87, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.650 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap609a0699-87, col_values=(('external_ids', {'iface-id': '609a0699-8716-4bf8-9f50-bfeec5f65721', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:a3:f9', 'vm-uuid': '90eb8d1f-8d13-4395-9d15-67fdaa60632d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.697 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.701 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.707 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.708 281292 INFO os_vif [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87')
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.778 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.779 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.779 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] No VIF found with MAC fa:16:3e:c0:a3:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.780 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Using config drive
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.820 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:29 np0005625204.localdomain ceph-mon[301857]: osdmap e100: 6 total, 6 up, 6 in
Feb 20 09:51:29 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3117282067' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:29 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3784175978' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.948 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Creating config drive at /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config
Feb 20 09:51:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:29.954 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphx732sd1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.078 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmphx732sd1" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.119 281292 DEBUG nova.storage.rbd_utils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] rbd image 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.125 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.347 281292 DEBUG oslo_concurrency.processutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config 90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.349 281292 INFO nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Deleting local config drive /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d/disk.config because it was imported into RBD.
Feb 20 09:51:30 np0005625204.localdomain systemd[1]: Started libvirt secret daemon.
Feb 20 09:51:30 np0005625204.localdomain kernel: device tap609a0699-87 entered promiscuous mode
Feb 20 09:51:30 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581090.4548] manager: (tap609a0699-87): new Tun device (/org/freedesktop/NetworkManager/Devices/21)
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00094|binding|INFO|Claiming lport 609a0699-8716-4bf8-9f50-bfeec5f65721 for this chassis.
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00095|binding|INFO|609a0699-8716-4bf8-9f50-bfeec5f65721: Claiming fa:16:3e:c0:a3:f9 10.100.0.12
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00096|binding|INFO|Claiming lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 for this chassis.
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00097|binding|INFO|ce4822a0-5e7a-4c40-9856-6c8879a12ac7: Claiming fa:16:3e:ef:22:88 19.80.0.55
Feb 20 09:51:30 np0005625204.localdomain systemd-udevd[308456]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.459 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.474 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:a3:f9 10.100.0.12'], port_security=['fa:16:3e:c0:a3:f9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-420346976', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '90eb8d1f-8d13-4395-9d15-67fdaa60632d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-420346976', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=609a0699-8716-4bf8-9f50-bfeec5f65721) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.477 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:22:88 19.80.0.55'], port_security=['fa:16:3e:ef:22:88 19.80.0.55'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['609a0699-8716-4bf8-9f50-bfeec5f65721'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-288633192', 'neutron:cidrs': '19.80.0.55/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-288633192', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ce4822a0-5e7a-4c40-9856-6c8879a12ac7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:30 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581090.4787] device (tap609a0699-87): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00098|binding|INFO|Setting lport 609a0699-8716-4bf8-9f50-bfeec5f65721 up in Southbound
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00099|binding|INFO|Setting lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 up in Southbound
Feb 20 09:51:30 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581090.4823] device (tap609a0699-87): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.478 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 609a0699-8716-4bf8-9f50-bfeec5f65721 in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 bound to our chassis
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.482 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8d59c69c-3a69-449e-9d36-233c1f4c5c30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.483 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.488 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00100|binding|INFO|Setting lport 609a0699-8716-4bf8-9f50-bfeec5f65721 ovn-installed in OVS
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.494 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7a85a5-ace3-40f6-96c6-14fc309d89d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.495 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap51f8ae9c-11 in ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.497 162782 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap51f8ae9c-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.497 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[39a2bf12-de53-4c37-ba0e-3cf62e5b950e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.498 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[020afea6-67c4-45b0-bac2-7639e698d58b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.508 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9cc99d-ed10-4859-a9bd-745aba2d1fb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain systemd-machined[85698]: New machine qemu-3-instance-00000008.
Feb 20 09:51:30 np0005625204.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000008.
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.532 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[395f8974-e03e-4b31-9f3f-e542edecb4e7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.566 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[fdcee394-a1ec-46b7-a5d4-c7d2b1b06dfb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581090.5746] manager: (tap51f8ae9c-10): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 20 09:51:30 np0005625204.localdomain systemd-udevd[308459]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.573 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[127b08a2-a7dc-4e30-aba1-6cca0897327a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.612 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[b18b3c21-cf27-4236-8845-7c5e3bb2c50e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.617 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[909a552b-febb-4d02-8cf7-f9e128e0e395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap51f8ae9c-11: link becomes ready
Feb 20 09:51:30 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap51f8ae9c-10: link becomes ready
Feb 20 09:51:30 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581090.6464] device (tap51f8ae9c-10): carrier: link connected
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.652 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[2dcab3e8-d39b-423f-a524-96896c273395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.669 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[86a7956b-8d47-4a89-a950-61b88ebf4fb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51f8ae9c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:63:f7:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164981, 'reachable_time': 34961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308494, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.684 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[723908b4-38b3-4b96-8f11-b3090e4be4cc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe63:f7d8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1164981, 'tstamp': 1164981}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308502, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.697 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b96ee882-6b86-401e-b9ac-7c816b6c7fad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51f8ae9c-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:63:f7:d8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164981, 'reachable_time': 34961, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308511, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.730 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[de51d8e0-fdf6-4188-9307-ebc563c8159a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.793 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1bf065d6-5b6d-4fc7-ac70-eb5ace91b160]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.794 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51f8ae9c-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.794 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.795 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51f8ae9c-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain kernel: device tap51f8ae9c-10 entered promiscuous mode
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.832 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.834 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap51f8ae9c-10, col_values=(('external_ids', {'iface-id': '2b93bbc2-5aeb-49cc-b610-6f4f7708d346'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.835 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:30Z|00101|binding|INFO|Releasing lport 2b93bbc2-5aeb-49cc-b610-6f4f7708d346 from this chassis (sb_readonly=0)
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.848 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.849 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.849 162652 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.850 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[88598a3e-70f5-4e39-ac1a-abe6244f7798]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.851 162652 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: global
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     log         /dev/log local0 debug
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     log-tag     haproxy-metadata-proxy-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     user        root
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     group       root
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     maxconn     1024
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     pidfile     /var/lib/neutron/external/pids/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.pid.haproxy
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     daemon
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: defaults
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     log global
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     mode http
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     option httplog
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     option dontlognull
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     option http-server-close
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     option forwardfor
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     retries                 3
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout http-request    30s
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout connect         30s
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout client          32s
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout server          32s
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout http-keep-alive 30s
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: listen listener
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     bind 169.254.169.254:80
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     server metadata /var/lib/neutron/metadata_proxy
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:     http-request add-header X-OVN-Network-ID 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 20 09:51:30 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:30.851 162652 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'env', 'PROCESS_TAG=haproxy-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 20 09:51:30 np0005625204.localdomain ceph-mon[301857]: pgmap v105: 177 pgs: 177 active+clean; 356 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 16 MiB/s rd, 18 MiB/s wr, 555 op/s
Feb 20 09:51:30 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1957332596' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:30 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4245632626' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:51:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e101 e101: 6 total, 6 up, 6 in
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.956 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event <LifecycleEvent: 1771581090.9555645, 90eb8d1f-8d13-4395-9d15-67fdaa60632d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.956 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Started (Lifecycle Event)
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.980 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.984 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event <LifecycleEvent: 1771581090.9584138, 90eb8d1f-8d13-4395-9d15-67fdaa60632d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:51:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:30.984 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Paused (Lifecycle Event)
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.008 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.011 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.055 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 20 09:51:31 np0005625204.localdomain podman[308572]: 
Feb 20 09:51:31 np0005625204.localdomain podman[308572]: 2026-02-20 09:51:31.333086595 +0000 UTC m=+0.073969709 container create fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:51:31 np0005625204.localdomain systemd[1]: Started libpod-conmon-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19.scope.
Feb 20 09:51:31 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:51:31 np0005625204.localdomain podman[308572]: 2026-02-20 09:51:31.292752061 +0000 UTC m=+0.033635165 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:51:31 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7397e54adaacdacf436fa6c7d8a45f9bdf2c03bd965044011b1c1cebe1f2aa8f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:51:31 np0005625204.localdomain podman[308572]: 2026-02-20 09:51:31.403184348 +0000 UTC m=+0.144067472 container init fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:51:31 np0005625204.localdomain podman[308572]: 2026-02-20 09:51:31.412084074 +0000 UTC m=+0.152967188 container start fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:31 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE]   (308590) : New worker (308592) forked
Feb 20 09:51:31 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE]   (308590) : Loading success.
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.487 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ce4822a0-5e7a-4c40-9856-6c8879a12ac7 in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc unbound from our chassis
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.491 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc126e7a-67b5-4025-9da6-7c8301672033 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.491 162652 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.500 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9bfdb9-9305-47eb-bebd-b66bff1ad939]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.502 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9021dc49-71 in ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.504 162782 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9021dc49-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.504 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[48a78f7a-3c13-4323-878d-d228b30da236]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.505 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[17b124a8-1df3-44b8-bc34-b9e1c409d54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.515 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0d2956-d073-42b2-828c-2ce24a6a1c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.528 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[268fae37-d796-4372-acbe-68c78fad7a22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.555 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[684b3015-f664-4bd1-91b7-c73626286232]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.563 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[aef0e6e7-6c2c-457e-99c2-58ddc6e4d032]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain systemd-udevd[308484]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:51:31 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581091.5681] manager: (tap9021dc49-70): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.600 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[3fb7d4ec-4381-4286-acbd-ffe809fead05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.604 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[e5de424e-0375-4cf5-8b5a-7cec86115e6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9021dc49-71: link becomes ready
Feb 20 09:51:31 np0005625204.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9021dc49-70: link becomes ready
Feb 20 09:51:31 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581091.6315] device (tap9021dc49-70): carrier: link connected
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.638 162915 DEBUG oslo.privsep.daemon [-] privsep: reply[fb9d07be-04b5-49c8-8044-bddc0851db86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.655 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[edd2cdcb-a97b-4629-a64c-de384cb85ecb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9021dc49-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:cd:0a:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1165080, 'reachable_time': 24304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308611, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.675 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec1ec04-f2fa-4271-a37b-7a32a3eacc03]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:a45'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1165080, 'tstamp': 1165080}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308612, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.693 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4a022fd3-b92b-44c9-af7b-385911879ae8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9021dc49-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:cd:0a:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1165080, 'reachable_time': 24304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308613, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.724 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[04a41fa0-3c53-4455-9924-dc5758b0f400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.788 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[950f4b1a-8dbf-4505-ba5c-1c370e53d559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.791 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9021dc49-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.791 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.792 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9021dc49-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:31 np0005625204.localdomain kernel: device tap9021dc49-70 entered promiscuous mode
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.794 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.800 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9021dc49-70, col_values=(('external_ids', {'iface-id': '8069ffae-e153-4a3e-ac83-1cd290da58a3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.803 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:31 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:31Z|00102|binding|INFO|Releasing lport 8069ffae-e153-4a3e-ac83-1cd290da58a3 from this chassis (sb_readonly=0)
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.804 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.805 162652 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9021dc49-7e01-42e7-8f32-572dec89afcc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9021dc49-7e01-42e7-8f32-572dec89afcc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.806 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2ecc98-51a4-4522-b479-cb68051ef3b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.807 162652 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: global
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     log         /dev/log local0 debug
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     log-tag     haproxy-metadata-proxy-9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     user        root
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     group       root
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     maxconn     1024
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     pidfile     /var/lib/neutron/external/pids/9021dc49-7e01-42e7-8f32-572dec89afcc.pid.haproxy
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     daemon
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: defaults
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     log global
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     mode http
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     option httplog
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     option dontlognull
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     option http-server-close
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     option forwardfor
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     retries                 3
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout http-request    30s
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout connect         30s
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout client          32s
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout server          32s
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     timeout http-keep-alive 30s
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: listen listener
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     bind 169.254.169.254:80
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     server metadata /var/lib/neutron/metadata_proxy
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:     http-request add-header X-OVN-Network-ID 9021dc49-7e01-42e7-8f32-572dec89afcc
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 20 09:51:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:31.809 162652 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'env', 'PROCESS_TAG=haproxy-9021dc49-7e01-42e7-8f32-572dec89afcc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9021dc49-7e01-42e7-8f32-572dec89afcc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 20 09:51:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:31.813 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:31 np0005625204.localdomain ceph-mon[301857]: osdmap e101: 6 total, 6 up, 6 in
Feb 20 09:51:32 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:32.129 264355 INFO neutron.agent.linux.ip_lib [None req-6210fe24-3488-4e4e-8ec6-008288325c99 - - - - - -] Device tap21b010cc-c3 cannot be used as it has no MAC address
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:32 np0005625204.localdomain kernel: device tap21b010cc-c3 entered promiscuous mode
Feb 20 09:51:32 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581092.1997] manager: (tap21b010cc-c3): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.203 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:32Z|00103|binding|INFO|Claiming lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 for this chassis.
Feb 20 09:51:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:32Z|00104|binding|INFO|21b010cc-c3ff-4013-97b7-6b7eb23e47a9: Claiming unknown
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.211 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:32.225 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e7d3e1cfe9f4e4d8451c6f0b8be3a29', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04bb3800-97e8-42cd-83bb-692b59d74b62, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=21b010cc-c3ff-4013-97b7-6b7eb23e47a9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:32Z|00105|binding|INFO|Setting lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 ovn-installed in OVS
Feb 20 09:51:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:32Z|00106|binding|INFO|Setting lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 up in Southbound
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.245 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.297 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.324 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:32 np0005625204.localdomain podman[308655]: 
Feb 20 09:51:32 np0005625204.localdomain podman[308655]: 2026-02-20 09:51:32.348929209 +0000 UTC m=+0.117730377 container create 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:51:32 np0005625204.localdomain systemd[1]: Started libpod-conmon-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c.scope.
Feb 20 09:51:32 np0005625204.localdomain podman[308655]: 2026-02-20 09:51:32.29439322 +0000 UTC m=+0.063194378 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 20 09:51:32 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:51:32 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427c548c5ea3a77d1146e79412578647e7513ef27a630d63200866643b6640c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:51:32 np0005625204.localdomain podman[308655]: 2026-02-20 09:51:32.420862827 +0000 UTC m=+0.189663985 container init 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:51:32 np0005625204.localdomain podman[308655]: 2026-02-20 09:51:32.430920317 +0000 UTC m=+0.199721475 container start 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:51:32 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE]   (308683) : New worker (308688) forked
Feb 20 09:51:32 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE]   (308683) : Loading success.
Feb 20 09:51:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:32.489 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 in datapath 09ccac50-3316-4f5e-b2ff-0e97a71903d8 unbound from our chassis
Feb 20 09:51:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:32.491 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 09ccac50-3316-4f5e-b2ff-0e97a71903d8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:51:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:32.492 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3d9853-cdc8-4f21-b7a3-50b152e999ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:32 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e102 e102: 6 total, 6 up, 6 in
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.793 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.793 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.794 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.795 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.795 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Processing event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.795 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.796 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.796 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.797 281292 DEBUG oslo_concurrency.lockutils [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.797 281292 DEBUG nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.798 281292 WARNING nova.compute.manager [req-9671d10b-793e-45cb-be11-60b05572acac req-99b4123a-ff00-49a1-9c18-31c4a2c638a4 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received unexpected event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with vm_state building and task_state spawning.
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.799 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.804 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event <LifecycleEvent: 1771581092.8038256, 90eb8d1f-8d13-4395-9d15-67fdaa60632d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.804 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Resumed (Lifecycle Event)
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.808 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.813 281292 INFO nova.virt.libvirt.driver [-] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance spawned successfully.
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.813 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.826 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.836 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.842 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.842 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.843 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.844 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.844 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.845 281292 DEBUG nova.virt.libvirt.driver [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.874 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.922 281292 INFO nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Took 8.52 seconds to spawn the instance on the hypervisor.
Feb 20 09:51:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:32.923 281292 DEBUG nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:51:32 np0005625204.localdomain ceph-mon[301857]: pgmap v107: 177 pgs: 177 active+clean; 356 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 12 MiB/s rd, 13 MiB/s wr, 364 op/s
Feb 20 09:51:32 np0005625204.localdomain ceph-mon[301857]: osdmap e102: 6 total, 6 up, 6 in
Feb 20 09:51:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:33.009 281292 INFO nova.compute.manager [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Took 10.03 seconds to build instance.
Feb 20 09:51:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:33.028 281292 DEBUG oslo_concurrency.lockutils [None req-862923ea-0fe6-4422-bd43-e68b3f9fb20d ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:33 np0005625204.localdomain podman[308736]: 
Feb 20 09:51:33 np0005625204.localdomain podman[308736]: 2026-02-20 09:51:33.282148174 +0000 UTC m=+0.115715956 container create aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:51:33 np0005625204.localdomain systemd[1]: Started libpod-conmon-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19.scope.
Feb 20 09:51:33 np0005625204.localdomain podman[308736]: 2026-02-20 09:51:33.229687028 +0000 UTC m=+0.063254820 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:51:33 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:51:33 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/422ec30970010cad66130b89158fe73344869aca88877d7c2bd64592bab9a6ec/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:51:33 np0005625204.localdomain podman[308736]: 2026-02-20 09:51:33.348805995 +0000 UTC m=+0.182373777 container init aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:33 np0005625204.localdomain podman[308736]: 2026-02-20 09:51:33.355530995 +0000 UTC m=+0.189098777 container start aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:51:33 np0005625204.localdomain dnsmasq[308755]: started, version 2.85 cachesize 150
Feb 20 09:51:33 np0005625204.localdomain dnsmasq[308755]: DNS service limited to local subnets
Feb 20 09:51:33 np0005625204.localdomain dnsmasq[308755]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:51:33 np0005625204.localdomain dnsmasq[308755]: warning: no upstream servers configured
Feb 20 09:51:33 np0005625204.localdomain dnsmasq-dhcp[308755]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:51:33 np0005625204.localdomain dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 0 addresses
Feb 20 09:51:33 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host
Feb 20 09:51:33 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts
Feb 20 09:51:33 np0005625204.localdomain sudo[308756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:51:33 np0005625204.localdomain sudo[308756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:51:33 np0005625204.localdomain sudo[308756]: pam_unix(sudo:session): session closed for user root
Feb 20 09:51:33 np0005625204.localdomain sudo[308774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:51:33 np0005625204.localdomain sudo[308774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:51:33 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:33.787 264355 INFO neutron.agent.dhcp.agent [None req-e25eb512-de70-4ae6-b848-7918d42efd54 - - - - - -] DHCP configuration for ports {'cfc50a35-d356-47aa-8376-a7f780a8f1d2'} is completed
Feb 20 09:51:34 np0005625204.localdomain sudo[308774]: pam_unix(sudo:session): session closed for user root
Feb 20 09:51:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:34.698 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:34 np0005625204.localdomain sudo[308823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:51:34 np0005625204.localdomain sudo[308823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:51:34 np0005625204.localdomain sudo[308823]: pam_unix(sudo:session): session closed for user root
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: pgmap v109: 177 pgs: 177 active+clean; 356 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 12 MiB/s rd, 12 MiB/s wr, 347 op/s
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.976576) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581094976628, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2473, "num_deletes": 254, "total_data_size": 3616053, "memory_usage": 3674736, "flush_reason": "Manual Compaction"}
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581094989045, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 2342375, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14585, "largest_seqno": 17053, "table_properties": {"data_size": 2333652, "index_size": 5356, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2309, "raw_key_size": 19191, "raw_average_key_size": 20, "raw_value_size": 2315578, "raw_average_value_size": 2522, "num_data_blocks": 236, "num_entries": 918, "num_filter_entries": 918, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580912, "oldest_key_time": 1771580912, "file_creation_time": 1771581094, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 12518 microseconds, and 6189 cpu microseconds.
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.989102) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 2342375 bytes OK
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.989128) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.991091) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.991113) EVENT_LOG_v1 {"time_micros": 1771581094991107, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.991139) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 3605076, prev total WAL file size 3605076, number of live WAL files 2.
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.992138) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(2287KB)], [18(17MB)]
Feb 20 09:51:34 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581094992180, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 20810879, "oldest_snapshot_seqno": -1}
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 12289 keys, 18890509 bytes, temperature: kUnknown
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581095078718, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 18890509, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18818435, "index_size": 40229, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30789, "raw_key_size": 327851, "raw_average_key_size": 26, "raw_value_size": 18607204, "raw_average_value_size": 1514, "num_data_blocks": 1543, "num_entries": 12289, "num_filter_entries": 12289, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581094, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.079113) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 18890509 bytes
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.080843) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 240.1 rd, 218.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 17.6 +0.0 blob) out(18.0 +0.0 blob), read-write-amplify(16.9) write-amplify(8.1) OK, records in: 12819, records dropped: 530 output_compression: NoCompression
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.080876) EVENT_LOG_v1 {"time_micros": 1771581095080863, "job": 8, "event": "compaction_finished", "compaction_time_micros": 86659, "compaction_time_cpu_micros": 45281, "output_level": 6, "num_output_files": 1, "total_output_size": 18890509, "num_input_records": 12819, "num_output_records": 12289, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581095081322, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581095083753, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:34.992064) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083849) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:51:35.083856) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:51:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:36.095 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Check if temp file /var/lib/nova/instances/tmp4xew7c85 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 20 09:51:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:36.096 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4xew7c85',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='90eb8d1f-8d13-4395-9d15-67fdaa60632d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 20 09:51:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:36.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:36 np0005625204.localdomain ceph-mon[301857]: pgmap v110: 177 pgs: 177 active+clean; 318 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 20 MiB/s rd, 14 MiB/s wr, 805 op/s
Feb 20 09:51:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:37.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:37.267 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:36Z, description=, device_id=ae5f315b-79d2-4264-afec-ecf48cf37c1f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a6f6a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a6fbe0>], id=2ab664a2-00a2-4e97-877c-3854355c736b, ip_allocation=immediate, mac_address=fa:16:3e:77:fd:60, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:30Z, description=, dns_domain=, id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-43118640-network, port_security_enabled=True, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23356, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=586, status=ACTIVE, subnets=['728af7d3-4d21-4f1d-9b4d-37b28d1c9bfa'], tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:31Z, vlan_transparent=None, network_id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, port_security_enabled=False, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=633, status=DOWN, tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:37Z on network 09ccac50-3316-4f5e-b2ff-0e97a71903d8
Feb 20 09:51:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:37.478 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:51:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:37.478 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:51:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:37.484 281292 INFO nova.compute.rpcapi [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Feb 20 09:51:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:37.484 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:51:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e103 e103: 6 total, 6 up, 6 in
Feb 20 09:51:37 np0005625204.localdomain dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 1 addresses
Feb 20 09:51:37 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host
Feb 20 09:51:37 np0005625204.localdomain podman[308858]: 2026-02-20 09:51:37.609254133 +0000 UTC m=+0.079742853 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:51:37 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts
Feb 20 09:51:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:51:37 np0005625204.localdomain systemd[1]: tmp-crun.15nAlN.mount: Deactivated successfully.
Feb 20 09:51:37 np0005625204.localdomain systemd[1]: tmp-crun.FBPpLS.mount: Deactivated successfully.
Feb 20 09:51:37 np0005625204.localdomain podman[308870]: 2026-02-20 09:51:37.737077409 +0000 UTC m=+0.110561822 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true)
Feb 20 09:51:37 np0005625204.localdomain podman[308870]: 2026-02-20 09:51:37.748145999 +0000 UTC m=+0.121630392 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:37 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:51:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:37.964 264355 INFO neutron.agent.dhcp.agent [None req-c3385804-a075-451a-a30a-4dd8dfdb3504 - - - - - -] DHCP configuration for ports {'2ab664a2-00a2-4e97-877c-3854355c736b'} is completed
Feb 20 09:51:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:38.238 264355 INFO neutron.agent.linux.ip_lib [None req-4ef5b863-73bf-48af-bf17-945910944163 - - - - - -] Device tapf56de90b-39 cannot be used as it has no MAC address
Feb 20 09:51:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:38.307 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:38 np0005625204.localdomain kernel: device tapf56de90b-39 entered promiscuous mode
Feb 20 09:51:38 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581098.3170] manager: (tapf56de90b-39): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Feb 20 09:51:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:38.318 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:38 np0005625204.localdomain systemd-udevd[308904]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:51:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:38Z|00107|binding|INFO|Claiming lport f56de90b-39da-4f5e-beb2-23f63fa15081 for this chassis.
Feb 20 09:51:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:38Z|00108|binding|INFO|f56de90b-39da-4f5e-beb2-23f63fa15081: Claiming unknown
Feb 20 09:51:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:38.329 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bf3a16481834e3a81e04ea40bee1d8d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26340ad6-3c33-4a82-9f2f-3413cbaaea9f, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f56de90b-39da-4f5e-beb2-23f63fa15081) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:38.331 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f56de90b-39da-4f5e-beb2-23f63fa15081 in datapath 71b28781-95be-4ab4-86ca-7c852dd117aa bound to our chassis
Feb 20 09:51:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:38.333 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 71b28781-95be-4ab4-86ca-7c852dd117aa or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:51:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:38.334 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[83e823f5-1d64-4fd2-a6a2-680d914d826e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:38Z|00109|binding|INFO|Setting lport f56de90b-39da-4f5e-beb2-23f63fa15081 ovn-installed in OVS
Feb 20 09:51:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:38Z|00110|binding|INFO|Setting lport f56de90b-39da-4f5e-beb2-23f63fa15081 up in Southbound
Feb 20 09:51:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:38.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf56de90b-39: No such device
Feb 20 09:51:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:38.397 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:38.427 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:38 np0005625204.localdomain ceph-mon[301857]: pgmap v111: 177 pgs: 177 active+clean; 318 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 7.7 MiB/s rd, 2.4 MiB/s wr, 403 op/s
Feb 20 09:51:38 np0005625204.localdomain ceph-mon[301857]: osdmap e103: 6 total, 6 up, 6 in
Feb 20 09:51:38 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:51:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:39.182 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:36Z, description=, device_id=ae5f315b-79d2-4264-afec-ecf48cf37c1f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59ed6a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59ed550>], id=2ab664a2-00a2-4e97-877c-3854355c736b, ip_allocation=immediate, mac_address=fa:16:3e:77:fd:60, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:30Z, description=, dns_domain=, id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-SecurityGroupRulesTestJSON-43118640-network, port_security_enabled=True, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=23356, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=586, status=ACTIVE, subnets=['728af7d3-4d21-4f1d-9b4d-37b28d1c9bfa'], tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:31Z, vlan_transparent=None, network_id=09ccac50-3316-4f5e-b2ff-0e97a71903d8, port_security_enabled=False, project_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=633, status=DOWN, tags=[], tenant_id=0e7d3e1cfe9f4e4d8451c6f0b8be3a29, updated_at=2026-02-20T09:51:37Z on network 09ccac50-3316-4f5e-b2ff-0e97a71903d8
Feb 20 09:51:39 np0005625204.localdomain podman[308977]: 
Feb 20 09:51:39 np0005625204.localdomain podman[308977]: 2026-02-20 09:51:39.378238343 +0000 UTC m=+0.112781908 container create 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 20 09:51:39 np0005625204.localdomain systemd[1]: Started libpod-conmon-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f.scope.
Feb 20 09:51:39 np0005625204.localdomain podman[308977]: 2026-02-20 09:51:39.328815407 +0000 UTC m=+0.063358972 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:51:39 np0005625204.localdomain podman[309003]: 2026-02-20 09:51:39.449911714 +0000 UTC m=+0.074522227 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:51:39 np0005625204.localdomain dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 1 addresses
Feb 20 09:51:39 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host
Feb 20 09:51:39 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:51:39 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts
Feb 20 09:51:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/251cfd009f8c3881b0902462fc67fb4f6f911df6e30e6a2f4b7d252da92d4521/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:51:39 np0005625204.localdomain podman[308977]: 2026-02-20 09:51:39.473881479 +0000 UTC m=+0.208425004 container init 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:51:39 np0005625204.localdomain podman[308977]: 2026-02-20 09:51:39.480090764 +0000 UTC m=+0.214634289 container start 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:51:39 np0005625204.localdomain dnsmasq[309024]: started, version 2.85 cachesize 150
Feb 20 09:51:39 np0005625204.localdomain dnsmasq[309024]: DNS service limited to local subnets
Feb 20 09:51:39 np0005625204.localdomain dnsmasq[309024]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:51:39 np0005625204.localdomain dnsmasq[309024]: warning: no upstream servers configured
Feb 20 09:51:39 np0005625204.localdomain dnsmasq-dhcp[309024]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:51:39 np0005625204.localdomain dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 0 addresses
Feb 20 09:51:39 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host
Feb 20 09:51:39 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts
Feb 20 09:51:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:39.579 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:39 np0005625204.localdomain systemd[1]: tmp-crun.73uX7D.mount: Deactivated successfully.
Feb 20 09:51:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:39.700 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:39.774 264355 INFO neutron.agent.dhcp.agent [None req-c7498486-8851-4713-90d3-8f2a3e59c45c - - - - - -] DHCP configuration for ports {'2ab664a2-00a2-4e97-877c-3854355c736b', '92767216-7fcd-4a1f-a5c1-2c5d4ba6339b'} is completed
Feb 20 09:51:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:40.489 281292 DEBUG nova.compute.manager [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:40.490 281292 DEBUG oslo_concurrency.lockutils [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:40.490 281292 DEBUG oslo_concurrency.lockutils [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:40.491 281292 DEBUG oslo_concurrency.lockutils [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:40.491 281292 DEBUG nova.compute.manager [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:51:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:40.492 281292 DEBUG nova.compute.manager [req-c4474244-5f3c-4dee-b414-a998785ef269 req-84cac1d0-49fa-4324-a6f0-37e483dc06c5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 20 09:51:40 np0005625204.localdomain ceph-mon[301857]: pgmap v113: 177 pgs: 177 active+clean; 318 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 8.7 MiB/s rd, 2.4 MiB/s wr, 435 op/s
Feb 20 09:51:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.691 281292 INFO nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Took 4.21 seconds for pre_live_migration on destination host np0005625202.localdomain.
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.693 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.725 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4xew7c85',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='90eb8d1f-8d13-4395-9d15-67fdaa60632d',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a8047bd1-acc9-47b2-a05d-5e7eb6222d12),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.731 281292 DEBUG nova.objects.instance [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lazy-loading 'migration_context' on Instance uuid 90eb8d1f-8d13-4395-9d15-67fdaa60632d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.734 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.736 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.737 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.771 281292 DEBUG nova.virt.libvirt.vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T09:51:32Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T09:51:32Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.772 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.773 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.775 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating guest XML with vif config: <interface type="ethernet">
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]:   <mac address="fa:16:3e:c0:a3:f9"/>
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]:   <model type="virtio"/>
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]:   <driver name="vhost" rx_queue_size="512"/>
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]:   <mtu size="1442"/>
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]:   <target dev="tap609a0699-87"/>
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: </interface>
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 20 09:51:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:41.777 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.240 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.240 281292 INFO nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.274 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.321 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 20 09:51:42 np0005625204.localdomain ceph-mon[301857]: pgmap v114: 177 pgs: 177 active+clean; 318 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 7.8 MiB/s rd, 2.1 MiB/s wr, 392 op/s
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.742 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.743 281292 WARNING nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received unexpected event network-vif-plugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with vm_state active and task_state migrating.
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG nova.compute.manager [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing instance network info cache due to event network-changed-609a0699-8716-4bf8-9f50-bfeec5f65721. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.743 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.744 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquired lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.744 281292 DEBUG nova.network.neutron [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Refreshing network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.825 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.825 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 20 09:51:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:42.938 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.373 281292 DEBUG nova.virt.driver [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] Emitting event <LifecycleEvent: 1771581103.3731406, 90eb8d1f-8d13-4395-9d15-67fdaa60632d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.373 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Paused (Lifecycle Event)
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.376 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.377 281292 DEBUG nova.virt.libvirt.migration [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.399 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.404 281292 DEBUG nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.429 281292 INFO nova.compute.manager [None req-df970a63-45cf-4f35-9a6f-640bb54703e2 - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 20 09:51:43 np0005625204.localdomain kernel: device tap609a0699-87 left promiscuous mode
Feb 20 09:51:43 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581103.5514] device (tap609a0699-87): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00111|binding|INFO|Releasing lport 609a0699-8716-4bf8-9f50-bfeec5f65721 from this chassis (sb_readonly=0)
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.569 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00112|binding|INFO|Setting lport 609a0699-8716-4bf8-9f50-bfeec5f65721 down in Southbound
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00113|binding|INFO|Releasing lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 from this chassis (sb_readonly=0)
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00114|binding|INFO|Setting lport ce4822a0-5e7a-4c40-9856-6c8879a12ac7 down in Southbound
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00115|binding|INFO|Removing iface tap609a0699-87 ovn-installed in OVS
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.595 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:a3:f9 10.100.0.12'], port_security=['fa:16:3e:c0:a3:f9 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain,np0005625202.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '0a83b6be-9fe2-42ef-8768-88847d97b165'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-420346976', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '90eb8d1f-8d13-4395-9d15-67fdaa60632d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-420346976', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '8', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=609a0699-8716-4bf8-9f50-bfeec5f65721) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00116|binding|INFO|Releasing lport 2b93bbc2-5aeb-49cc-b610-6f4f7708d346 from this chassis (sb_readonly=0)
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00117|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:51:43 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:43Z|00118|binding|INFO|Releasing lport 8069ffae-e153-4a3e-ac83-1cd290da58a3 from this chassis (sb_readonly=0)
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.599 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ef:22:88 19.80.0.55'], port_security=['fa:16:3e:ef:22:88 19.80.0.55'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['609a0699-8716-4bf8-9f50-bfeec5f65721'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-288633192', 'neutron:cidrs': '19.80.0.55/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-288633192', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '6a912071-fd9c-4d5f-8453-7f993db3506d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ce4822a0-5e7a-4c40-9856-6c8879a12ac7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.602 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 609a0699-8716-4bf8-9f50-bfeec5f65721 in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 unbound from our chassis
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.608 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8d59c69c-3a69-449e-9d36-233c1f4c5c30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.608 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.609 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d3f2b38e-75fd-4e8a-b447-0b9b98d60c24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:43 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:43.610 162652 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 namespace which is not needed anymore
Feb 20 09:51:43 np0005625204.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 20 09:51:43 np0005625204.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000008.scope: Consumed 10.458s CPU time.
Feb 20 09:51:43 np0005625204.localdomain systemd-machined[85698]: Machine qemu-3-instance-00000008 terminated.
Feb 20 09:51:43 np0005625204.localdomain virtqemud[206495]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk: No such file or directory
Feb 20 09:51:43 np0005625204.localdomain virtqemud[206495]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/90eb8d1f-8d13-4395-9d15-67fdaa60632d_disk: No such file or directory
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.705 281292 DEBUG nova.network.neutron [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updated VIF entry in instance network info cache for port 609a0699-8716-4bf8-9f50-bfeec5f65721. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.706 281292 DEBUG nova.network.neutron [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Updating instance_info_cache with network_info: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005625202.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:51:43 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:43.721 2 INFO neutron.agent.securitygroups_rpc [req-bc88a03e-b48b-4063-bf3f-e91bcc37d72d req-9e63afcc-d40a-4b2c-a3aa-f230d65e4db2 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['d4aeef42-5959-493a-9cfc-ec0d9adb0b00']
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.753 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.753 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.753 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.762 281292 DEBUG oslo_concurrency.lockutils [req-aef9dfa8-f2da-4048-bb50-a0ef4e0e2f51 req-8a692af1-03ac-42ec-90df-c8d239cbdef5 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Releasing lock "refresh_cache-90eb8d1f-8d13-4395-9d15-67fdaa60632d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:51:43 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE]   (308590) : haproxy version is 2.8.14-c23fe91
Feb 20 09:51:43 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [NOTICE]   (308590) : path to executable is /usr/sbin/haproxy
Feb 20 09:51:43 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [WARNING]  (308590) : Exiting Master process...
Feb 20 09:51:43 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [ALERT]    (308590) : Current worker (308592) exited with code 143 (Terminated)
Feb 20 09:51:43 np0005625204.localdomain neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0[308586]: [WARNING]  (308590) : All workers exited. Exiting... (0)
Feb 20 09:51:43 np0005625204.localdomain systemd[1]: libpod-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19.scope: Deactivated successfully.
Feb 20 09:51:43 np0005625204.localdomain podman[309059]: 2026-02-20 09:51:43.78972011 +0000 UTC m=+0.071729233 container died fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:51:43 np0005625204.localdomain systemd[1]: tmp-crun.DZLg20.mount: Deactivated successfully.
Feb 20 09:51:43 np0005625204.localdomain podman[309059]: 2026-02-20 09:51:43.842580809 +0000 UTC m=+0.124589852 container cleanup fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 20 09:51:43 np0005625204.localdomain podman[309080]: 2026-02-20 09:51:43.860779532 +0000 UTC m=+0.063843578 container cleanup fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:51:43 np0005625204.localdomain systemd[1]: libpod-conmon-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19.scope: Deactivated successfully.
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.879 281292 DEBUG nova.virt.libvirt.guest [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '90eb8d1f-8d13-4395-9d15-67fdaa60632d' (instance-00000008) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.881 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migration operation has completed
Feb 20 09:51:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:43.882 281292 INFO nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] _post_live_migration() is started..
Feb 20 09:51:44 np0005625204.localdomain sshd[309107]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:51:44 np0005625204.localdomain sshd[309107]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:51:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:44.646 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:43Z, description=, device_id=330257d1-c627-4905-9230-185815fc6ffb, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a58eb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a58640>], id=ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3, ip_allocation=immediate, mac_address=fa:16:3e:c0:25:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:34Z, description=, dns_domain=, id=71b28781-95be-4ab4-86ca-7c852dd117aa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-478398302-network, port_security_enabled=True, project_id=3bf3a16481834e3a81e04ea40bee1d8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7731, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=623, status=ACTIVE, subnets=['0e07ae8b-e5fb-4d37-99ed-b66cb0c0c73e'], tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:36Z, vlan_transparent=None, network_id=71b28781-95be-4ab4-86ca-7c852dd117aa, port_security_enabled=False, project_id=3bf3a16481834e3a81e04ea40bee1d8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=661, status=DOWN, tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:44Z on network 71b28781-95be-4ab4-86ca-7c852dd117aa
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:44 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:44.772 2 INFO neutron.agent.securitygroups_rpc [req-d0af9ee5-c34c-498a-a79b-d6b681e80e4a req-6395f2ff-4ffd-4bf4-8fd0-e88e3c18ce7a 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['f1d2b747-b5b9-4577-9543-577b07c94aaa']
Feb 20 09:51:44 np0005625204.localdomain ceph-mon[301857]: pgmap v115: 177 pgs: 177 active+clean; 318 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 7.0 MiB/s rd, 1.9 MiB/s wr, 348 op/s
Feb 20 09:51:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-7397e54adaacdacf436fa6c7d8a45f9bdf2c03bd965044011b1c1cebe1f2aa8f-merged.mount: Deactivated successfully.
Feb 20 09:51:44 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19-userdata-shm.mount: Deactivated successfully.
Feb 20 09:51:44 np0005625204.localdomain podman[309094]: 2026-02-20 09:51:44.790614666 +0000 UTC m=+0.929954439 container remove fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.798 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[082898fc-7673-464a-bed4-e62f1309e41f]: (4, ('Fri Feb 20 09:51:43 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 (fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19)\nfa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19\nFri Feb 20 09:51:43 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 (fa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19)\nfa739159b23a9599cb2b2b3e6936a888de14434087a4397daff7494907a66c19\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.800 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac84158-22ba-41f9-b68d-3a3558805402]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.802 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51f8ae9c-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.805 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:44 np0005625204.localdomain kernel: device tap51f8ae9c-10 left promiscuous mode
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.820 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.825 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[27d52f11-5292-4b17-b2bb-65677c4405fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.846 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[95ccfdbf-8236-4b75-a0e8-be334f8f3dc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.847 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d91d2cea-ee8c-495f-9344-aac73411b212]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.861 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[6b6f9e2c-1b07-4a34-9563-9ee51f1968b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164973, 'reachable_time': 41135, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309141, 'error': None, 'target': 'ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain systemd[1]: run-netns-ovnmeta\x2d51f8ae9c\x2d1ccc\x2d4ec5\x2d8a06\x2d5c7802ad29e0.mount: Deactivated successfully.
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.863 163070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.863 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb70769-67fd-49a5-ae10-f12419da1355]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.864 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ce4822a0-5e7a-4c40-9856-6c8879a12ac7 in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc unbound from our chassis
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.868 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fc126e7a-67b5-4025-9da6-7c8301672033 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.868 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9021dc49-7e01-42e7-8f32-572dec89afcc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.869 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[44cf62c5-e5f2-430c-9d1a-3ab98419de19]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:44.870 162652 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc namespace which is not needed anymore
Feb 20 09:51:44 np0005625204.localdomain dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 1 addresses
Feb 20 09:51:44 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host
Feb 20 09:51:44 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts
Feb 20 09:51:44 np0005625204.localdomain podman[309128]: 2026-02-20 09:51:44.895366193 +0000 UTC m=+0.064979591 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.918 281292 DEBUG nova.compute.manager [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.919 281292 DEBUG oslo_concurrency.lockutils [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.920 281292 DEBUG oslo_concurrency.lockutils [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.921 281292 DEBUG oslo_concurrency.lockutils [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.921 281292 DEBUG nova.compute.manager [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] No waiting events found dispatching network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 20 09:51:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:44.921 281292 DEBUG nova.compute.manager [req-e684c1d6-26ec-4e1a-b5fa-814d456a68f2 req-66f8854d-13d7-4154-95b0-6c5c1f7ceda6 d4446b63864e47aebd18955a38393018 f6685cdb0ff24cddaeb987c63c89eafb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Received event network-vif-unplugged-609a0699-8716-4bf8-9f50-bfeec5f65721 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 20 09:51:45 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE]   (308683) : haproxy version is 2.8.14-c23fe91
Feb 20 09:51:45 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [NOTICE]   (308683) : path to executable is /usr/sbin/haproxy
Feb 20 09:51:45 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [WARNING]  (308683) : Exiting Master process...
Feb 20 09:51:45 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [WARNING]  (308683) : Exiting Master process...
Feb 20 09:51:45 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [ALERT]    (308683) : Current worker (308688) exited with code 143 (Terminated)
Feb 20 09:51:45 np0005625204.localdomain neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc[308678]: [WARNING]  (308683) : All workers exited. Exiting... (0)
Feb 20 09:51:45 np0005625204.localdomain systemd[1]: libpod-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c.scope: Deactivated successfully.
Feb 20 09:51:45 np0005625204.localdomain podman[309167]: 2026-02-20 09:51:45.064386011 +0000 UTC m=+0.073972761 container died 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.108 281292 DEBUG nova.network.neutron [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Activated binding for port 609a0699-8716-4bf8-9f50-bfeec5f65721 and host np0005625202.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.109 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.111 281292 DEBUG nova.virt.libvirt.vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-20T09:51:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-721665546',display_name='tempest-LiveMigrationTest-server-721665546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005625204.localdomain',hostname='tempest-livemigrationtest-server-721665546',id=8,image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-20T09:51:32Z,launched_on='np0005625204.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005625204.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='e704aae5b1ba49d59262f9aa0c366fb4',ramdisk_id='',reservation_id='r-erbwo03j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='06bd71fd-c415-45d9-b669-46209b7ca2f4',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-2108133970',owner_user_name='tempest-LiveMigrationTest-2108133970-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-20T09:51:35Z,user_data=None,user_id='ba15d0e9919d4594a2e6e9d6b3414a5e',uuid=90eb8d1f-8d13-4395-9d15-67fdaa60632d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.112 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converting VIF {"id": "609a0699-8716-4bf8-9f50-bfeec5f65721", "address": "fa:16:3e:c0:a3:f9", "network": {"id": "51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0", "bridge": "br-int", "label": "tempest-LiveMigrationTest-982155183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "e704aae5b1ba49d59262f9aa0c366fb4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap609a0699-87", "ovs_interfaceid": "609a0699-8716-4bf8-9f50-bfeec5f65721", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.113 281292 DEBUG nova.network.os_vif_util [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.113 281292 DEBUG os_vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 20 09:51:45 np0005625204.localdomain podman[309167]: 2026-02-20 09:51:45.11459252 +0000 UTC m=+0.124179260 container cleanup 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.116 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 18 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.116 281292 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap609a0699-87, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.118 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.120 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.124 281292 INFO os_vif [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:a3:f9,bridge_name='br-int',has_traffic_filtering=True,id=609a0699-8716-4bf8-9f50-bfeec5f65721,network=Network(51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap609a0699-87')
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.125 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.125 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.126 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.126 281292 DEBUG nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.127 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Deleting instance files /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d_del
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.127 281292 INFO nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Deletion of /var/lib/nova/instances/90eb8d1f-8d13-4395-9d15-67fdaa60632d_del complete
Feb 20 09:51:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:45.136 264355 INFO neutron.agent.dhcp.agent [None req-5f3762c0-1cb3-459e-abff-392fbcbf26c3 - - - - - -] DHCP configuration for ports {'ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3'} is completed
Feb 20 09:51:45 np0005625204.localdomain podman[309182]: 2026-02-20 09:51:45.14945183 +0000 UTC m=+0.077951798 container cleanup 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:51:45 np0005625204.localdomain systemd[1]: libpod-conmon-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c.scope: Deactivated successfully.
Feb 20 09:51:45 np0005625204.localdomain podman[309198]: 2026-02-20 09:51:45.219598775 +0000 UTC m=+0.080784773 container remove 941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.224 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[060b3405-9785-4a34-9279-55da535def80]: (4, ('Fri Feb 20 09:51:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc (941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c)\n941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c\nFri Feb 20 09:51:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc (941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c)\n941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.226 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4db536ff-5e02-4037-bcd9-0703fd85878a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.227 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9021dc49-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.230 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:45 np0005625204.localdomain kernel: device tap9021dc49-70 left promiscuous mode
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.233 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.238 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[aff20123-d610-416f-9de3-fb8208236ef2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:45.243 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.260 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[979fe4d1-9021-4f27-ab7e-9e62d607f9a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.262 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[f10fc171-c42c-44e1-b7cd-03de60fceaf4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.279 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[07623aa5-31f9-4518-af3e-1ace6320e407]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1165072, 'reachable_time': 39996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309218, 'error': None, 'target': 'ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.281 163070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9021dc49-7e01-42e7-8f32-572dec89afcc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 20 09:51:45 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:45.282 163070 DEBUG oslo.privsep.daemon [-] privsep: reply[c112abee-704b-41f9-84f4-0b56d0b5019c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-427c548c5ea3a77d1146e79412578647e7513ef27a630d63200866643b6640c1-merged.mount: Deactivated successfully.
Feb 20 09:51:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-941eb6dc5cf79525fc14f167f89b13fef62a77cc8de55938235f03449d46f02c-userdata-shm.mount: Deactivated successfully.
Feb 20 09:51:45 np0005625204.localdomain systemd[1]: run-netns-ovnmeta\x2d9021dc49\x2d7e01\x2d42e7\x2d8f32\x2d572dec89afcc.mount: Deactivated successfully.
Feb 20 09:51:46 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:46.580 2 INFO neutron.agent.securitygroups_rpc [req-d0a4da6f-de87-4709-b506-6d507f2fa68b req-a4293e8b-cdcb-4f0f-b9af-77766c0f126a 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['b0641abe-7ec2-4391-9e24-125339c7b7ee']
Feb 20 09:51:46 np0005625204.localdomain ceph-mon[301857]: pgmap v116: 177 pgs: 177 active+clean; 360 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 MiB/s rd, 4.8 MiB/s wr, 155 op/s
Feb 20 09:51:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:51:47 np0005625204.localdomain podman[309219]: 2026-02-20 09:51:47.116783425 +0000 UTC m=+0.058536798 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:51:47 np0005625204.localdomain podman[309219]: 2026-02-20 09:51:47.129065532 +0000 UTC m=+0.070818915 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:51:47 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:51:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:47.310 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:47 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:47.317 2 INFO neutron.agent.securitygroups_rpc [req-bca898f3-80d6-4116-8407-48ccb221c91a req-faa58b52-ff7a-4794-95e2-54e55cdad610 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['57ce2b3f-bfcc-424f-be8f-efa4d8d83e67']
Feb 20 09:51:47 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:47.588 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:43Z, description=, device_id=330257d1-c627-4905-9230-185815fc6ffb, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59e0700>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a85f40>], id=ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3, ip_allocation=immediate, mac_address=fa:16:3e:c0:25:a2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:51:34Z, description=, dns_domain=, id=71b28781-95be-4ab4-86ca-7c852dd117aa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-478398302-network, port_security_enabled=True, project_id=3bf3a16481834e3a81e04ea40bee1d8d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7731, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=623, status=ACTIVE, subnets=['0e07ae8b-e5fb-4d37-99ed-b66cb0c0c73e'], tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:36Z, vlan_transparent=None, network_id=71b28781-95be-4ab4-86ca-7c852dd117aa, port_security_enabled=False, project_id=3bf3a16481834e3a81e04ea40bee1d8d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=661, status=DOWN, tags=[], tenant_id=3bf3a16481834e3a81e04ea40bee1d8d, updated_at=2026-02-20T09:51:44Z on network 71b28781-95be-4ab4-86ca-7c852dd117aa
Feb 20 09:51:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:51:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:51:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:51:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162550 "" "Go-http-client/1.1"
Feb 20 09:51:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:51:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20242 "" "Go-http-client/1.1"
Feb 20 09:51:47 np0005625204.localdomain dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 1 addresses
Feb 20 09:51:47 np0005625204.localdomain podman[309259]: 2026-02-20 09:51:47.84433328 +0000 UTC m=+0.079824094 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:51:47 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host
Feb 20 09:51:47 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts
Feb 20 09:51:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:47.898 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:48.084 264355 INFO neutron.agent.dhcp.agent [None req-1ecfbbd2-7e17-47cc-a1b0-53717d9eb847 - - - - - -] DHCP configuration for ports {'ba2fda90-ec40-4c48-b4d3-77c44e1ba9a3'} is completed
Feb 20 09:51:48 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:48.224 2 INFO neutron.agent.securitygroups_rpc [req-9dee97d4-d9a8-4ee1-93af-ecec75edb6d8 req-b55f0bec-8773-478c-b205-7d4f6dd0e50e 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['35812fcc-3257-4b87-b011-2e502647727b']
Feb 20 09:51:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 09:51:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 09:51:48 np0005625204.localdomain snmpd[68593]: empty variable list in _query
Feb 20 09:51:48 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:48.489 2 INFO neutron.agent.securitygroups_rpc [req-3e7d9994-1b77-4e20-ab05-14f19dff3953 req-ef6c6054-1e67-499f-b1bf-b8ae592974a9 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['35812fcc-3257-4b87-b011-2e502647727b']
Feb 20 09:51:48 np0005625204.localdomain ceph-mon[301857]: pgmap v117: 177 pgs: 177 active+clean; 360 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 MiB/s rd, 4.8 MiB/s wr, 155 op/s
Feb 20 09:51:49 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:49.164 2 INFO neutron.agent.securitygroups_rpc [req-fdb7e361-ff4f-4f47-a1a0-e5e8ae6f1fbe req-da6cfbf4-9203-474f-ad26-64056962735b 107ab22f7bfc4441a4ab7a417331cdb2 0e7d3e1cfe9f4e4d8451c6f0b8be3a29 - - default default] Security group rule updated ['35812fcc-3257-4b87-b011-2e502647727b']
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.342 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.344 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.344 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "90eb8d1f-8d13-4395-9d15-67fdaa60632d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.407 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.407 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.408 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.408 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.409 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e104 e104: 6 total, 6 up, 6 in
Feb 20 09:51:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/523973223' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:51:49 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/552899203' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.857 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.929 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:51:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:49.930 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:51:49 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:49.948 2 INFO neutron.agent.securitygroups_rpc [None req-3fe948ea-c8e6-429c-836a-702342b0e4ac 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group rule updated ['4439e19b-bf91-4420-aff1-6854f961fef4']
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.120 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.152 281292 WARNING nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.154 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11387MB free_disk=41.42898178100586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": 
"0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.155 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.156 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.200 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Migration for instance 90eb8d1f-8d13-4395-9d15-67fdaa60632d refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.222 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.255 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.256 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Migration a8047bd1-acc9-47b2-a05d-5e7eb6222d12 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.256 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.257 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:51:50 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:50.270 2 INFO neutron.agent.securitygroups_rpc [None req-e1dc84f3-2fa9-4dac-a092-2cb427ae3321 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group rule updated ['4439e19b-bf91-4420-aff1-6854f961fef4']
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.323 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3199515391' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.828 281292 DEBUG oslo_concurrency.processutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.834 281292 DEBUG nova.compute.provider_tree [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: pgmap v118: 177 pgs: 177 active+clean; 380 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 4.3 MiB/s wr, 199 op/s
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/552899203' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: osdmap e104: 6 total, 6 up, 6 in
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2939242926' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3199515391' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.861 281292 DEBUG nova.scheduler.client.report [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.903 281292 DEBUG nova.compute.resource_tracker [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.904 281292 DEBUG oslo_concurrency.lockutils [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:50.924 281292 INFO nova.compute.manager [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Migrating instance to np0005625202.localdomain finished successfully.
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.049 281292 INFO nova.scheduler.client.report [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] Deleted allocation for migration a8047bd1-acc9-47b2-a05d-5e7eb6222d12
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.049 281292 DEBUG nova.virt.libvirt.driver [None req-33d9284d-1dc9-45ce-9d7a-b8cac683d1f9 fc37a88bcbe34be0bfdc7c0bd2787c78 65685fb154414740b6b5e1276111b8bb - - default default] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.747 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.748 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:51:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:51.748 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e105 e105: 6 total, 6 up, 6 in
Feb 20 09:51:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/997567679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4266218097' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.198 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.274 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.337 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.498 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.500 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11344MB free_disk=41.428184509277344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.501 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.501 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.563 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.563 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.564 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:51:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:52.606 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: pgmap v120: 177 pgs: 177 active+clean; 385 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 MiB/s rd, 5.1 MiB/s wr, 228 op/s
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: osdmap e105: 6 total, 6 up, 6 in
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/804935202' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4266218097' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2232832379' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e106 e106: 6 total, 6 up, 6 in
Feb 20 09:51:53 np0005625204.localdomain dnsmasq[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/addn_hosts - 0 addresses
Feb 20 09:51:53 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/host
Feb 20 09:51:53 np0005625204.localdomain dnsmasq-dhcp[309024]: read /var/lib/neutron/dhcp/71b28781-95be-4ab4-86ca-7c852dd117aa/opts
Feb 20 09:51:53 np0005625204.localdomain podman[309382]: 2026-02-20 09:51:53.025249982 +0000 UTC m=+0.059084987 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:51:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:51:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:51:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3716024828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.116 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.122 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:51:53 np0005625204.localdomain podman[309398]: 2026-02-20 09:51:53.150631265 +0000 UTC m=+0.094213694 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:51:53 np0005625204.localdomain podman[309398]: 2026-02-20 09:51:53.170165338 +0000 UTC m=+0.113723146 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.182 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.184 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:51:53 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.185 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:51:53 np0005625204.localdomain kernel: device tapf56de90b-39 left promiscuous mode
Feb 20 09:51:53 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:53Z|00119|binding|INFO|Releasing lport f56de90b-39da-4f5e-beb2-23f63fa15081 from this chassis (sb_readonly=0)
Feb 20 09:51:53 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:53Z|00120|binding|INFO|Setting lport f56de90b-39da-4f5e-beb2-23f63fa15081 down in Southbound
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.281 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:53.293 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-71b28781-95be-4ab4-86ca-7c852dd117aa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bf3a16481834e3a81e04ea40bee1d8d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26340ad6-3c33-4a82-9f2f-3413cbaaea9f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f56de90b-39da-4f5e-beb2-23f63fa15081) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:53.295 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f56de90b-39da-4f5e-beb2-23f63fa15081 in datapath 71b28781-95be-4ab4-86ca-7c852dd117aa unbound from our chassis
Feb 20 09:51:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:53.300 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 71b28781-95be-4ab4-86ca-7c852dd117aa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:53 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:53.301 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[f44705ee-999d-4348-a543-2bf332b51ec1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:53.307 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:53.666 2 INFO neutron.agent.securitygroups_rpc [None req-2028af40-368e-4b25-90de-8401d53be72c 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']
Feb 20 09:51:53 np0005625204.localdomain ceph-mon[301857]: osdmap e106: 6 total, 6 up, 6 in
Feb 20 09:51:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3716024828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:51:54 np0005625204.localdomain podman[309430]: 2026-02-20 09:51:54.155855241 +0000 UTC m=+0.086885216 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1770267347, version=9.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 20 09:51:54 np0005625204.localdomain podman[309430]: 2026-02-20 09:51:54.19801301 +0000 UTC m=+0.129042965 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:51:54 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:51:54 np0005625204.localdomain ceph-mon[301857]: pgmap v123: 177 pgs: 177 active+clean; 385 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 624 KiB/s wr, 163 op/s
Feb 20 09:51:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/151132608' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:55 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:55.103 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:51:15Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59a9100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59a97c0>], id=609a0699-8716-4bf8-9f50-bfeec5f65721, ip_allocation=immediate, mac_address=fa:16:3e:c0:a3:f9, name=tempest-parent-420346976, network_id=51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, port_security_enabled=True, project_id=e704aae5b1ba49d59262f9aa0c366fb4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=14, security_groups=['6a912071-fd9c-4d5f-8453-7f993db3506d'], standard_attr_id=521, status=DOWN, tags=[], tenant_id=e704aae5b1ba49d59262f9aa0c366fb4, trunk_details=sub_ports=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59a9f40>], trunk_id=bb723cd7-ac34-46b0-bf66-79c7ed1fe96f, updated_at=2026-02-20T09:51:52Z on network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.172 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.180 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.181 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.203 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:55 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:55.204 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.204 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:55 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:55.205 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.206 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:55 np0005625204.localdomain systemd[1]: tmp-crun.RFGBpS.mount: Deactivated successfully.
Feb 20 09:51:55 np0005625204.localdomain podman[309463]: 2026-02-20 09:51:55.347032629 +0000 UTC m=+0.061676252 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:51:55 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 2 addresses
Feb 20 09:51:55 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:51:55 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:51:55 np0005625204.localdomain dnsmasq[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/addn_hosts - 0 addresses
Feb 20 09:51:55 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/host
Feb 20 09:51:55 np0005625204.localdomain dnsmasq-dhcp[308755]: read /var/lib/neutron/dhcp/09ccac50-3316-4f5e-b2ff-0e97a71903d8/opts
Feb 20 09:51:55 np0005625204.localdomain podman[309497]: 2026-02-20 09:51:55.471819096 +0000 UTC m=+0.042592563 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:51:55 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:55.541 264355 INFO neutron.agent.dhcp.agent [None req-3217b94a-ac56-4875-a052-545d312cdafa - - - - - -] DHCP configuration for ports {'609a0699-8716-4bf8-9f50-bfeec5f65721'} is completed
Feb 20 09:51:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:55.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:56Z|00121|binding|INFO|Releasing lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 from this chassis (sb_readonly=0)
Feb 20 09:51:56 np0005625204.localdomain kernel: device tap21b010cc-c3 left promiscuous mode
Feb 20 09:51:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:56Z|00122|binding|INFO|Setting lport 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 down in Southbound
Feb 20 09:51:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:56.292 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:56.304 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-09ccac50-3316-4f5e-b2ff-0e97a71903d8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0e7d3e1cfe9f4e4d8451c6f0b8be3a29', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04bb3800-97e8-42cd-83bb-692b59d74b62, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=21b010cc-c3ff-4013-97b7-6b7eb23e47a9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:56.306 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 21b010cc-c3ff-4013-97b7-6b7eb23e47a9 in datapath 09ccac50-3316-4f5e-b2ff-0e97a71903d8 unbound from our chassis
Feb 20 09:51:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:56.311 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 09ccac50-3316-4f5e-b2ff-0e97a71903d8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:56.311 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:56.312 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bbe344-3a26-43c6-861a-d7328fd06174]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:56.488 2 INFO neutron.agent.securitygroups_rpc [req-9bfa3730-17a6-4d3e-b4cb-835cdd225f2c req-73160dbe-971e-4219-ac30-c0c28777ca1e 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group member updated ['4439e19b-bf91-4420-aff1-6854f961fef4']
Feb 20 09:51:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:51:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:51:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:51:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:51:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:51:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:51:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:56.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:56.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:51:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:56.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:51:56 np0005625204.localdomain ceph-mon[301857]: pgmap v124: 177 pgs: 177 active+clean; 307 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 8.1 MiB/s rd, 7.8 MiB/s wr, 301 op/s
Feb 20 09:51:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3016823328' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.096 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.097 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.097 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.098 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.368 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:57 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:57.444 2 INFO neutron.agent.securitygroups_rpc [None req-426d7c59-43bb-4b5f-98f0-2945e94d9430 ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']
Feb 20 09:51:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e107 e107: 6 total, 6 up, 6 in
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.912 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:51:57 np0005625204.localdomain podman[309539]: 2026-02-20 09:51:57.915746371 +0000 UTC m=+0.061414265 container kill d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:57 np0005625204.localdomain dnsmasq[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/addn_hosts - 0 addresses
Feb 20 09:51:57 np0005625204.localdomain dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/host
Feb 20 09:51:57 np0005625204.localdomain dnsmasq-dhcp[307844]: read /var/lib/neutron/dhcp/9021dc49-7e01-42e7-8f32-572dec89afcc/opts
Feb 20 09:51:57 np0005625204.localdomain systemd[1]: tmp-crun.eTQV1A.mount: Deactivated successfully.
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.961 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.962 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:51:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:57.963 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:51:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:58Z|00123|binding|INFO|Removing iface tap22bf7523-8a ovn-installed in OVS
Feb 20 09:51:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:58.335 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fc126e7a-67b5-4025-9da6-7c8301672033 with type ""
Feb 20 09:51:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:58Z|00124|binding|INFO|Removing lport 22bf7523-8a19-46b0-a0b7-53070ea1823e ovn-installed in OVS
Feb 20 09:51:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:58.337 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9021dc49-7e01-42e7-8f32-572dec89afcc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7655fb8f-4890-4990-9fdf-4d25849654f0, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=22bf7523-8a19-46b0-a0b7-53070ea1823e) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.337 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:58.339 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 22bf7523-8a19-46b0-a0b7-53070ea1823e in datapath 9021dc49-7e01-42e7-8f32-572dec89afcc unbound from our chassis
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.344 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:58.343 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9021dc49-7e01-42e7-8f32-572dec89afcc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:51:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:51:58.345 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c5273751-0973-4044-bc7e-cffbdd344cd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:51:58 np0005625204.localdomain podman[309578]: 2026-02-20 09:51:58.376430278 +0000 UTC m=+0.064273020 container kill d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:51:58 np0005625204.localdomain dnsmasq[307844]: exiting on receipt of SIGTERM
Feb 20 09:51:58 np0005625204.localdomain systemd[1]: libpod-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635.scope: Deactivated successfully.
Feb 20 09:51:58 np0005625204.localdomain podman[309592]: 2026-02-20 09:51:58.453736656 +0000 UTC m=+0.061633842 container died d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:51:58 np0005625204.localdomain podman[309592]: 2026-02-20 09:51:58.483482814 +0000 UTC m=+0.091379950 container cleanup d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:51:58 np0005625204.localdomain systemd[1]: libpod-conmon-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635.scope: Deactivated successfully.
Feb 20 09:51:58 np0005625204.localdomain podman[309599]: 2026-02-20 09:51:58.524698345 +0000 UTC m=+0.120383605 container remove d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9021dc49-7e01-42e7-8f32-572dec89afcc, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:51:58 np0005625204.localdomain ceph-mon[301857]: pgmap v125: 177 pgs: 177 active+clean; 307 MiB data, 1021 MiB used, 41 GiB / 42 GiB avail; 6.3 MiB/s rd, 6.2 MiB/s wr, 216 op/s
Feb 20 09:51:58 np0005625204.localdomain ceph-mon[301857]: osdmap e107: 6 total, 6 up, 6 in
Feb 20 09:51:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2462011471' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1662759014' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:58 np0005625204.localdomain kernel: device tap22bf7523-8a left promiscuous mode
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.579 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.589 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:58.616 264355 INFO neutron.agent.dhcp.agent [None req-a7409a55-f09d-478f-9f60-9dd13371890d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:51:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:51:58.617 264355 INFO neutron.agent.dhcp.agent [None req-a7409a55-f09d-478f-9f60-9dd13371890d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:51:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:51:58Z|00125|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.733 281292 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1771581103.7327886, 90eb8d1f-8d13-4395-9d15-67fdaa60632d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.734 281292 INFO nova.compute.manager [-] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] VM Stopped (Lifecycle Event)
Feb 20 09:51:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:51:58.751 281292 DEBUG nova.compute.manager [None req-6ef98482-7fb0-450d-9035-039a093d5d7b - - - - - -] [instance: 90eb8d1f-8d13-4395-9d15-67fdaa60632d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 20 09:51:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-faa9e2373e9edef47d940be28280fa18e93a1d836a7561ac2b42ed8a739e240e-merged.mount: Deactivated successfully.
Feb 20 09:51:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5d6c99301375bb77af7c2c52fef173fbed9a5a083d4ea63e9399e8b684ed635-userdata-shm.mount: Deactivated successfully.
Feb 20 09:51:58 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d9021dc49\x2d7e01\x2d42e7\x2d8f32\x2d572dec89afcc.mount: Deactivated successfully.
Feb 20 09:51:59 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:59.074 2 INFO neutron.agent.securitygroups_rpc [None req-bcd9a2b7-ab94-49ae-b942-9c3b757c3657 0db48d5f6f5e44fc93154cf4b34a94e0 a966116e4ddf4bdea0571a1bb751916e - - default default] Security group member updated ['07d2fe18-fbbf-4547-931e-bb55f378bade']
Feb 20 09:51:59 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:51:59.517 2 INFO neutron.agent.securitygroups_rpc [None req-da379379-3275-471e-8ade-92d9716364d1 ba15d0e9919d4594a2e6e9d6b3414a5e e704aae5b1ba49d59262f9aa0c366fb4 - - default default] Security group member updated ['6a912071-fd9c-4d5f-8453-7f993db3506d']
Feb 20 09:51:59 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2189930680' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:51:59 np0005625204.localdomain podman[309641]: 2026-02-20 09:51:59.795110469 +0000 UTC m=+0.057762255 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 20 09:51:59 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 1 addresses
Feb 20 09:51:59 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:51:59 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:51:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:51:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:51:59 np0005625204.localdomain podman[309653]: 2026-02-20 09:51:59.918943456 +0000 UTC m=+0.097311876 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:52:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:00.415 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:00 np0005625204.localdomain podman[309653]: 2026-02-20 09:52:00.459080645 +0000 UTC m=+0.637449125 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:52:00 np0005625204.localdomain systemd[1]: tmp-crun.plNHWW.mount: Deactivated successfully.
Feb 20 09:52:00 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:52:00 np0005625204.localdomain podman[309654]: 2026-02-20 09:52:00.487771072 +0000 UTC m=+0.659926096 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:52:00 np0005625204.localdomain podman[309654]: 2026-02-20 09:52:00.491151263 +0000 UTC m=+0.663306277 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 20 09:52:00 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:52:00 np0005625204.localdomain ceph-mon[301857]: pgmap v127: 177 pgs: 177 active+clean; 277 MiB data, 970 MiB used, 41 GiB / 42 GiB avail; 6.3 MiB/s rd, 8.4 MiB/s wr, 283 op/s
Feb 20 09:52:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:00.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:00.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:52:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4022390931' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:52:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:02.414 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:02 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:02Z|00126|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:02.554 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:02 np0005625204.localdomain ceph-mon[301857]: pgmap v128: 177 pgs: 177 active+clean; 273 MiB data, 960 MiB used, 41 GiB / 42 GiB avail; 6.0 MiB/s rd, 8.0 MiB/s wr, 294 op/s
Feb 20 09:52:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1831565294' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:52:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3397992773' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:52:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3397992773' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:52:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1586058909' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:52:02 np0005625204.localdomain podman[309721]: 2026-02-20 09:52:02.733955662 +0000 UTC m=+0.060992273 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:52:02 np0005625204.localdomain dnsmasq[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/addn_hosts - 0 addresses
Feb 20 09:52:02 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/host
Feb 20 09:52:02 np0005625204.localdomain dnsmasq-dhcp[307248]: read /var/lib/neutron/dhcp/51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0/opts
Feb 20 09:52:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:03.208 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:52:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:03Z|00127|binding|INFO|Releasing lport 9906e141-c388-453f-9169-7c98a351db5e from this chassis (sb_readonly=0)
Feb 20 09:52:03 np0005625204.localdomain kernel: device tap9906e141-c3 left promiscuous mode
Feb 20 09:52:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:03.278 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:03Z|00128|binding|INFO|Setting lport 9906e141-c388-453f-9169-7c98a351db5e down in Southbound
Feb 20 09:52:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:03.289 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e704aae5b1ba49d59262f9aa0c366fb4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ad9ac3f8-d9ff-4a1d-8092-e57f93de7b33, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=9906e141-c388-453f-9169-7c98a351db5e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:52:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:03.291 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 9906e141-c388-453f-9169-7c98a351db5e in datapath 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0 unbound from our chassis
Feb 20 09:52:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:03.295 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:52:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:03.296 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d759f397-0a33-40ad-b0c1-57a983e190b0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:52:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:03.309 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e108 e108: 6 total, 6 up, 6 in
Feb 20 09:52:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1024349497' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:52:03 np0005625204.localdomain dnsmasq[309024]: exiting on receipt of SIGTERM
Feb 20 09:52:03 np0005625204.localdomain podman[309761]: 2026-02-20 09:52:03.718844351 +0000 UTC m=+0.045560071 container kill 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:52:03 np0005625204.localdomain systemd[1]: libpod-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f.scope: Deactivated successfully.
Feb 20 09:52:03 np0005625204.localdomain podman[309776]: 2026-02-20 09:52:03.822254389 +0000 UTC m=+0.086568796 container died 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 09:52:03 np0005625204.localdomain systemd[1]: tmp-crun.TqJWhT.mount: Deactivated successfully.
Feb 20 09:52:03 np0005625204.localdomain podman[309776]: 2026-02-20 09:52:03.846366579 +0000 UTC m=+0.110680976 container cleanup 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:52:03 np0005625204.localdomain systemd[1]: libpod-conmon-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f.scope: Deactivated successfully.
Feb 20 09:52:03 np0005625204.localdomain podman[309775]: 2026-02-20 09:52:03.88357189 +0000 UTC m=+0.141552198 container remove 427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-71b28781-95be-4ab4-86ca-7c852dd117aa, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:52:04 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:04.142 264355 INFO neutron.agent.dhcp.agent [None req-199b7a8b-74a0-461c-94c6-8bf3935123a5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:04 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:04.623 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:04 np0005625204.localdomain ceph-mon[301857]: pgmap v129: 177 pgs: 177 active+clean; 273 MiB data, 960 MiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 6.8 MiB/s wr, 250 op/s
Feb 20 09:52:04 np0005625204.localdomain ceph-mon[301857]: osdmap e108: 6 total, 6 up, 6 in
Feb 20 09:52:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:04Z|00129|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-251cfd009f8c3881b0902462fc67fb4f6f911df6e30e6a2f4b7d252da92d4521-merged.mount: Deactivated successfully.
Feb 20 09:52:04 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-427a5c67791011878c1b9874c97665116caf603a4449068d6efa742d47e1ec8f-userdata-shm.mount: Deactivated successfully.
Feb 20 09:52:04 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d71b28781\x2d95be\x2d4ab4\x2d86ca\x2d7c852dd117aa.mount: Deactivated successfully.
Feb 20 09:52:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:04.890 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:05.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e108 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:06.015 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:52:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:06.016 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:52:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:06.016 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:52:06 np0005625204.localdomain ceph-mon[301857]: pgmap v131: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 8.5 MiB/s wr, 282 op/s
Feb 20 09:52:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:07.414 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:07.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 e109: 6 total, 6 up, 6 in
Feb 20 09:52:07 np0005625204.localdomain dnsmasq[308755]: exiting on receipt of SIGTERM
Feb 20 09:52:07 np0005625204.localdomain podman[309822]: 2026-02-20 09:52:07.998740857 +0000 UTC m=+0.062542969 container kill aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:07 np0005625204.localdomain systemd[1]: libpod-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19.scope: Deactivated successfully.
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:52:08 np0005625204.localdomain podman[309836]: 2026-02-20 09:52:08.058327276 +0000 UTC m=+0.048240771 container died aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: tmp-crun.bKKvT6.mount: Deactivated successfully.
Feb 20 09:52:08 np0005625204.localdomain podman[309836]: 2026-02-20 09:52:08.09631628 +0000 UTC m=+0.086229735 container cleanup aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: libpod-conmon-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19.scope: Deactivated successfully.
Feb 20 09:52:08 np0005625204.localdomain podman[309843]: 2026-02-20 09:52:08.104239377 +0000 UTC m=+0.080032590 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:08 np0005625204.localdomain podman[309843]: 2026-02-20 09:52:08.118972858 +0000 UTC m=+0.094766061 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true)
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:52:08 np0005625204.localdomain podman[309844]: 2026-02-20 09:52:08.17062936 +0000 UTC m=+0.140270120 container remove aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-09ccac50-3316-4f5e-b2ff-0e97a71903d8, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:52:08 np0005625204.localdomain ceph-mon[301857]: pgmap v132: 177 pgs: 177 active+clean; 354 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 6.9 MiB/s wr, 228 op/s
Feb 20 09:52:08 np0005625204.localdomain ceph-mon[301857]: osdmap e109: 6 total, 6 up, 6 in
Feb 20 09:52:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:08.601 264355 INFO neutron.agent.dhcp.agent [None req-c27b73df-3cdd-4605-8878-1cda769fd03b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:08.721 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:08.937 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-422ec30970010cad66130b89158fe73344869aca88877d7c2bd64592bab9a6ec-merged.mount: Deactivated successfully.
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aee416ade6ba15b2cb79269ab8f68e6f59ecade7a2c7d8b7c07d2c401b9b0a19-userdata-shm.mount: Deactivated successfully.
Feb 20 09:52:08 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d09ccac50\x2d3316\x2d4f5e\x2db2ff\x2d0e97a71903d8.mount: Deactivated successfully.
Feb 20 09:52:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:10.471 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:10 np0005625204.localdomain ceph-mon[301857]: pgmap v134: 177 pgs: 177 active+clean; 282 MiB data, 1017 MiB used, 41 GiB / 42 GiB avail; 8.2 MiB/s rd, 5.9 MiB/s wr, 254 op/s
Feb 20 09:52:10 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1218142157' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:11Z|00130|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:11.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:11 np0005625204.localdomain sshd[309877]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:52:12 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:11Z|00131|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:12.040 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:12.460 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:12 np0005625204.localdomain ceph-mon[301857]: pgmap v135: 177 pgs: 177 active+clean; 192 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 5.9 MiB/s wr, 376 op/s
Feb 20 09:52:12 np0005625204.localdomain podman[309896]: 2026-02-20 09:52:12.719052755 +0000 UTC m=+0.040682566 container kill a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Feb 20 09:52:12 np0005625204.localdomain dnsmasq[307248]: exiting on receipt of SIGTERM
Feb 20 09:52:12 np0005625204.localdomain systemd[1]: libpod-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7.scope: Deactivated successfully.
Feb 20 09:52:12 np0005625204.localdomain sshd[309877]: Received disconnect from 57.128.218.144 port 55762:11: Bye Bye [preauth]
Feb 20 09:52:12 np0005625204.localdomain sshd[309877]: Disconnected from authenticating user root 57.128.218.144 port 55762 [preauth]
Feb 20 09:52:12 np0005625204.localdomain podman[309910]: 2026-02-20 09:52:12.784625413 +0000 UTC m=+0.051374765 container died a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:52:12 np0005625204.localdomain systemd[1]: tmp-crun.qomzvQ.mount: Deactivated successfully.
Feb 20 09:52:12 np0005625204.localdomain podman[309910]: 2026-02-20 09:52:12.822756792 +0000 UTC m=+0.089506104 container cleanup a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:52:12 np0005625204.localdomain systemd[1]: libpod-conmon-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7.scope: Deactivated successfully.
Feb 20 09:52:12 np0005625204.localdomain podman[309911]: 2026-02-20 09:52:12.876113825 +0000 UTC m=+0.137705843 container remove a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-51f8ae9c-1ccc-4ec5-8a06-5c7802ad29e0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:52:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:12.900 264355 INFO neutron.agent.dhcp.agent [None req-c0560a25-3a64-44e8-b1f8-05a9e7a24113 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:13.212 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4b7f60ded7c720c0eef64273aac4a3860430b8d407f2c1fb2fc8c0fb6a9fb560-merged.mount: Deactivated successfully.
Feb 20 09:52:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a6fd2cb7e946adbe2fe4f436940f36c985800ddbee4d23f3e146456cba56e8a7-userdata-shm.mount: Deactivated successfully.
Feb 20 09:52:13 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d51f8ae9c\x2d1ccc\x2d4ec5\x2d8a06\x2d5c7802ad29e0.mount: Deactivated successfully.
Feb 20 09:52:14 np0005625204.localdomain ceph-mon[301857]: pgmap v136: 177 pgs: 177 active+clean; 192 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 9.2 MiB/s rd, 4.8 MiB/s wr, 308 op/s
Feb 20 09:52:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:15.474 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:15 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:15.939 2 INFO neutron.agent.securitygroups_rpc [None req-343265ee-aef8-4c0b-8b69-d5c79e80995b ad3bee90b7c843958ab29e9ae5697cd5 78fdd34f107b4ec7ac81795ecc3f677c - - default default] Security group member updated ['7f2f6730-5897-423d-9b80-6a0cf94c3a8f']
Feb 20 09:52:16 np0005625204.localdomain ceph-mon[301857]: pgmap v137: 177 pgs: 177 active+clean; 192 MiB data, 823 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 19 KiB/s wr, 168 op/s
Feb 20 09:52:17 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:17Z|00132|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:17.139 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:17.463 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:52:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:52:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:52:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:52:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:52:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18329 "" "Go-http-client/1.1"
Feb 20 09:52:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:17.828 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:17 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:17Z|00133|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:17.948 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:52:18 np0005625204.localdomain systemd[1]: tmp-crun.0Smlan.mount: Deactivated successfully.
Feb 20 09:52:18 np0005625204.localdomain podman[309938]: 2026-02-20 09:52:18.156137666 +0000 UTC m=+0.086087381 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:52:18 np0005625204.localdomain podman[309938]: 2026-02-20 09:52:18.173703881 +0000 UTC m=+0.103653576 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:52:18 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:52:18 np0005625204.localdomain ceph-mon[301857]: pgmap v138: 177 pgs: 177 active+clean; 192 MiB data, 823 MiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 19 KiB/s wr, 168 op/s
Feb 20 09:52:18 np0005625204.localdomain sshd[309961]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:52:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:19.519 2 INFO neutron.agent.securitygroups_rpc [None req-e2d6b938-f6f5-4317-a8d2-0776bdf5afe2 ad3bee90b7c843958ab29e9ae5697cd5 78fdd34f107b4ec7ac81795ecc3f677c - - default default] Security group member updated ['7f2f6730-5897-423d-9b80-6a0cf94c3a8f']
Feb 20 09:52:19 np0005625204.localdomain sshd[309961]: Invalid user max from 196.189.116.182 port 51364
Feb 20 09:52:19 np0005625204.localdomain sshd[309961]: Received disconnect from 196.189.116.182 port 51364:11: Bye Bye [preauth]
Feb 20 09:52:19 np0005625204.localdomain sshd[309961]: Disconnected from invalid user max 196.189.116.182 port 51364 [preauth]
Feb 20 09:52:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:20.477 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:20 np0005625204.localdomain ceph-mon[301857]: pgmap v139: 177 pgs: 177 active+clean; 220 MiB data, 864 MiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 1.5 MiB/s wr, 183 op/s
Feb 20 09:52:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:22.502 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:22 np0005625204.localdomain ceph-mon[301857]: pgmap v140: 177 pgs: 177 active+clean; 225 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 2.4 MiB/s rd, 2.1 MiB/s wr, 151 op/s
Feb 20 09:52:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:23.031 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:52:24 np0005625204.localdomain systemd[1]: tmp-crun.e8MLG2.mount: Deactivated successfully.
Feb 20 09:52:24 np0005625204.localdomain podman[309963]: 2026-02-20 09:52:24.165135314 +0000 UTC m=+0.095715529 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:52:24 np0005625204.localdomain podman[309963]: 2026-02-20 09:52:24.172439492 +0000 UTC m=+0.103019697 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:52:24 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:52:24 np0005625204.localdomain ceph-mon[301857]: pgmap v141: 177 pgs: 177 active+clean; 225 MiB data, 889 MiB used, 41 GiB / 42 GiB avail; 373 KiB/s rd, 2.1 MiB/s wr, 70 op/s
Feb 20 09:52:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:52:25 np0005625204.localdomain podman[309986]: 2026-02-20 09:52:25.134941583 +0000 UTC m=+0.071029883 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, version=9.7, vcs-type=git, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z)
Feb 20 09:52:25 np0005625204.localdomain podman[309986]: 2026-02-20 09:52:25.152068524 +0000 UTC m=+0.088156814 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container)
Feb 20 09:52:25 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:52:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:25.523 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:52:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:52:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:52:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:52:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:52:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:52:26 np0005625204.localdomain ceph-mon[301857]: pgmap v142: 177 pgs: 177 active+clean; 225 MiB data, 890 MiB used, 41 GiB / 42 GiB avail; 373 KiB/s rd, 2.1 MiB/s wr, 70 op/s
Feb 20 09:52:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:27.506 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:27Z|00134|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:28.000 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:28 np0005625204.localdomain ceph-mon[301857]: pgmap v143: 177 pgs: 177 active+clean; 225 MiB data, 890 MiB used, 41 GiB / 42 GiB avail; 369 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Feb 20 09:52:28 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:28.824 2 INFO neutron.agent.securitygroups_rpc [req-b5848cd8-466b-4ef0-b46b-f454b04eeec2 req-730cd1ba-0675-45ee-8c23-360f67ec8632 2188e6de9cae445dadfba1541701ebd2 f299da1b635f4dafbe62328983ad1fae - - default default] Security group member updated ['4439e19b-bf91-4420-aff1-6854f961fef4']
Feb 20 09:52:29 np0005625204.localdomain sshd[310006]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:52:29 np0005625204.localdomain sshd[310006]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:52:29 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3541994609' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:30.529 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:52:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:52:30 np0005625204.localdomain ceph-mon[301857]: pgmap v144: 177 pgs: 177 active+clean; 176 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 380 KiB/s rd, 2.1 MiB/s wr, 80 op/s
Feb 20 09:52:30 np0005625204.localdomain podman[310008]: 2026-02-20 09:52:30.881139094 +0000 UTC m=+0.093954367 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 09:52:30 np0005625204.localdomain podman[310009]: 2026-02-20 09:52:30.949857875 +0000 UTC m=+0.160174073 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 20 09:52:30 np0005625204.localdomain podman[310008]: 2026-02-20 09:52:30.979206162 +0000 UTC m=+0.192021465 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Feb 20 09:52:30 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:52:31 np0005625204.localdomain podman[310009]: 2026-02-20 09:52:31.035200984 +0000 UTC m=+0.245517192 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:52:31 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:52:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:32.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:32 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:32.577 264355 INFO neutron.agent.linux.ip_lib [None req-8dac0a0d-a09c-464f-a6ac-4c368d2bba8b - - - - - -] Device tap7fc1435c-10 cannot be used as it has no MAC address
Feb 20 09:52:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:32.603 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:32 np0005625204.localdomain kernel: device tap7fc1435c-10 entered promiscuous mode
Feb 20 09:52:32 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581152.6128] manager: (tap7fc1435c-10): new Generic device (/org/freedesktop/NetworkManager/Devices/26)
Feb 20 09:52:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:32Z|00135|binding|INFO|Claiming lport 7fc1435c-10da-4551-8776-a30225c1584b for this chassis.
Feb 20 09:52:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:32.614 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:32 np0005625204.localdomain systemd-udevd[310061]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:52:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:32Z|00136|binding|INFO|7fc1435c-10da-4551-8776-a30225c1584b: Claiming unknown
Feb 20 09:52:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:32.631 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1bfffabed04c6d8fc33cdd0ddf56a4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46ce4a9-c894-4089-9b26-e586ca861a84, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=7fc1435c-10da-4551-8776-a30225c1584b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:52:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:32.632 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fc1435c-10da-4551-8776-a30225c1584b in datapath d260fabb-b595-4411-92db-b47a732060f6 bound to our chassis
Feb 20 09:52:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:32.634 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d260fabb-b595-4411-92db-b47a732060f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:52:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:32.635 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[668a0102-af41-4b42-ae5d-5732297e91dd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:32Z|00137|binding|INFO|Setting lport 7fc1435c-10da-4551-8776-a30225c1584b ovn-installed in OVS
Feb 20 09:52:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:32Z|00138|binding|INFO|Setting lport 7fc1435c-10da-4551-8776-a30225c1584b up in Southbound
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:32.656 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fc1435c-10: No such device
Feb 20 09:52:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:32.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:32.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:32 np0005625204.localdomain ceph-mon[301857]: pgmap v145: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 177 KiB/s rd, 660 KiB/s wr, 52 op/s
Feb 20 09:52:33 np0005625204.localdomain podman[310134]: 
Feb 20 09:52:33 np0005625204.localdomain podman[310134]: 2026-02-20 09:52:33.604581165 +0000 UTC m=+0.094194164 container create ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 20 09:52:33 np0005625204.localdomain systemd[1]: Started libpod-conmon-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331.scope.
Feb 20 09:52:33 np0005625204.localdomain podman[310134]: 2026-02-20 09:52:33.559788848 +0000 UTC m=+0.049401887 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:52:33 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:52:33 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92bd17572daf82e995501baa24d56fe0cc42bf636ed181dc5240c8cecc9c8bc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:52:33 np0005625204.localdomain podman[310134]: 2026-02-20 09:52:33.679368609 +0000 UTC m=+0.168981608 container init ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:52:33 np0005625204.localdomain podman[310134]: 2026-02-20 09:52:33.692780668 +0000 UTC m=+0.182393637 container start ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:52:33 np0005625204.localdomain dnsmasq[310152]: started, version 2.85 cachesize 150
Feb 20 09:52:33 np0005625204.localdomain dnsmasq[310152]: DNS service limited to local subnets
Feb 20 09:52:33 np0005625204.localdomain dnsmasq[310152]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:52:33 np0005625204.localdomain dnsmasq[310152]: warning: no upstream servers configured
Feb 20 09:52:33 np0005625204.localdomain dnsmasq-dhcp[310152]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:52:33 np0005625204.localdomain dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 0 addresses
Feb 20 09:52:33 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host
Feb 20 09:52:33 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts
Feb 20 09:52:33 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:33Z|00139|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:33 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:33.905 264355 INFO neutron.agent.dhcp.agent [None req-dc7fdb6d-8eae-4ab0-83a1-3cd0980d6458 - - - - - -] DHCP configuration for ports {'abe0ba2a-c493-425d-8827-68be9f0f0a81'} is completed
Feb 20 09:52:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:33.914 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:34 np0005625204.localdomain ceph-mon[301857]: pgmap v146: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 20 09:52:34 np0005625204.localdomain sudo[310153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:52:34 np0005625204.localdomain sudo[310153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:52:34 np0005625204.localdomain sudo[310153]: pam_unix(sudo:session): session closed for user root
Feb 20 09:52:35 np0005625204.localdomain sudo[310171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:52:35 np0005625204.localdomain sudo[310171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:52:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:35.564 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:35.673 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:35 np0005625204.localdomain sudo[310171]: pam_unix(sudo:session): session closed for user root
Feb 20 09:52:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:52:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:52:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:52:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:52:36 np0005625204.localdomain sudo[310220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:52:36 np0005625204.localdomain sudo[310220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:52:36 np0005625204.localdomain sudo[310220]: pam_unix(sudo:session): session closed for user root
Feb 20 09:52:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:36.617 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:36Z, description=, device_id=3095f6e8-d4c1-4b47-b904-07c6a9deaaf2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62ecb20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62ecbb0>], id=5e004d3c-491a-4ae0-ad9a-f29043bf90a7, ip_allocation=immediate, mac_address=fa:16:3e:2b:6a:50, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:30Z, description=, dns_domain=, id=d260fabb-b595-4411-92db-b47a732060f6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-1855742561-network, port_security_enabled=True, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41604, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=834, status=ACTIVE, subnets=['663a7f2b-e539-4525-8f2f-a5461c1df7da'], tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:31Z, vlan_transparent=None, network_id=d260fabb-b595-4411-92db-b47a732060f6, port_security_enabled=False, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=864, status=DOWN, tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:36Z on network d260fabb-b595-4411-92db-b47a732060f6
Feb 20 09:52:36 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:36.824 2 INFO neutron.agent.securitygroups_rpc [None req-d63b0875-b3f1-4849-b165-16313644e666 eab28fccca6a48139a7d8b395d8f0b9a dc182b0a7cbb4e47b6b88befc2c48022 - - default default] Security group member updated ['8d0cb685-1e0f-43aa-973a-a081d9962496']
Feb 20 09:52:36 np0005625204.localdomain dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 1 addresses
Feb 20 09:52:36 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host
Feb 20 09:52:36 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts
Feb 20 09:52:36 np0005625204.localdomain podman[310255]: 2026-02-20 09:52:36.826746209 +0000 UTC m=+0.047926802 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:36 np0005625204.localdomain ceph-mon[301857]: pgmap v147: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 20 09:52:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:37.081 264355 INFO neutron.agent.dhcp.agent [None req-3b46eb09-90dd-425c-a00e-51b5f307d183 - - - - - -] DHCP configuration for ports {'5e004d3c-491a-4ae0-ad9a-f29043bf90a7'} is completed
Feb 20 09:52:37 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:37.385 2 INFO neutron.agent.securitygroups_rpc [None req-51ac2042-ec94-4975-95ca-42a72712c92b eab28fccca6a48139a7d8b395d8f0b9a dc182b0a7cbb4e47b6b88befc2c48022 - - default default] Security group member updated ['8d0cb685-1e0f-43aa-973a-a081d9962496']
Feb 20 09:52:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:37.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:37.704 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:36Z, description=, device_id=3095f6e8-d4c1-4b47-b904-07c6a9deaaf2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a80cd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a80520>], id=5e004d3c-491a-4ae0-ad9a-f29043bf90a7, ip_allocation=immediate, mac_address=fa:16:3e:2b:6a:50, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:30Z, description=, dns_domain=, id=d260fabb-b595-4411-92db-b47a732060f6, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-1855742561-network, port_security_enabled=True, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=41604, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=834, status=ACTIVE, subnets=['663a7f2b-e539-4525-8f2f-a5461c1df7da'], tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:31Z, vlan_transparent=None, network_id=d260fabb-b595-4411-92db-b47a732060f6, port_security_enabled=False, project_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=864, status=DOWN, tags=[], tenant_id=ca1bfffabed04c6d8fc33cdd0ddf56a4, updated_at=2026-02-20T09:52:36Z on network d260fabb-b595-4411-92db-b47a732060f6
Feb 20 09:52:37 np0005625204.localdomain dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 1 addresses
Feb 20 09:52:37 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host
Feb 20 09:52:37 np0005625204.localdomain systemd[1]: tmp-crun.m6f0tc.mount: Deactivated successfully.
Feb 20 09:52:37 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts
Feb 20 09:52:37 np0005625204.localdomain podman[310292]: 2026-02-20 09:52:37.95235449 +0000 UTC m=+0.065277840 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:52:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:38.232 264355 INFO neutron.agent.dhcp.agent [None req-e378f723-20c9-4d9f-881b-1e24f8f086d3 - - - - - -] DHCP configuration for ports {'5e004d3c-491a-4ae0-ad9a-f29043bf90a7'} is completed
Feb 20 09:52:38 np0005625204.localdomain ceph-mon[301857]: pgmap v148: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 20 09:52:38 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:52:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:52:39 np0005625204.localdomain podman[310312]: 2026-02-20 09:52:39.148120116 +0000 UTC m=+0.079418344 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:52:39 np0005625204.localdomain podman[310312]: 2026-02-20 09:52:39.158930368 +0000 UTC m=+0.090228586 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 20 09:52:39 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:52:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:40.595 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:41 np0005625204.localdomain ceph-mon[301857]: pgmap v149: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Feb 20 09:52:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:41.673 264355 INFO neutron.agent.linux.ip_lib [None req-ef006bf0-d1fe-4738-a151-08769a2e9194 - - - - - -] Device tape04a3d27-d2 cannot be used as it has no MAC address
Feb 20 09:52:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:41.736 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:41 np0005625204.localdomain kernel: device tape04a3d27-d2 entered promiscuous mode
Feb 20 09:52:41 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581161.7466] manager: (tape04a3d27-d2): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Feb 20 09:52:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:41Z|00140|binding|INFO|Claiming lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab for this chassis.
Feb 20 09:52:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:41Z|00141|binding|INFO|e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab: Claiming unknown
Feb 20 09:52:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:41.753 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:41 np0005625204.localdomain systemd-udevd[310351]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:52:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:41.764 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2039424f830e4ef5aa461223cac1ffd5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=32672433-7024-441f-825a-7707135603bd, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:52:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:41.767 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab in datapath 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9 bound to our chassis
Feb 20 09:52:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:41.774 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port d954753b-8f76-43d0-95a6-a39aa6a0330d IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:52:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:41.774 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:52:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:41.775 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8d413acf-5040-48af-ab55-74175dde4b3b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:41Z|00142|binding|INFO|Setting lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab ovn-installed in OVS
Feb 20 09:52:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:41Z|00143|binding|INFO|Setting lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab up in Southbound
Feb 20 09:52:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:41.806 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape04a3d27-d2: No such device
Feb 20 09:52:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:41.845 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:41.876 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:41 np0005625204.localdomain podman[310383]: 2026-02-20 09:52:41.922796516 +0000 UTC m=+0.059965281 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 09:52:41 np0005625204.localdomain dnsmasq[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/addn_hosts - 0 addresses
Feb 20 09:52:41 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/host
Feb 20 09:52:41 np0005625204.localdomain dnsmasq-dhcp[310152]: read /var/lib/neutron/dhcp/d260fabb-b595-4411-92db-b47a732060f6/opts
Feb 20 09:52:41 np0005625204.localdomain systemd[1]: tmp-crun.y6SinQ.mount: Deactivated successfully.
Feb 20 09:52:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:42Z|00144|binding|INFO|Releasing lport 7fc1435c-10da-4551-8776-a30225c1584b from this chassis (sb_readonly=0)
Feb 20 09:52:42 np0005625204.localdomain kernel: device tap7fc1435c-10 left promiscuous mode
Feb 20 09:52:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:42.088 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:42Z|00145|binding|INFO|Setting lport 7fc1435c-10da-4551-8776-a30225c1584b down in Southbound
Feb 20 09:52:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:42.099 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d260fabb-b595-4411-92db-b47a732060f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ca1bfffabed04c6d8fc33cdd0ddf56a4', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a46ce4a9-c894-4089-9b26-e586ca861a84, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=7fc1435c-10da-4551-8776-a30225c1584b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:52:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:42.100 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fc1435c-10da-4551-8776-a30225c1584b in datapath d260fabb-b595-4411-92db-b47a732060f6 unbound from our chassis
Feb 20 09:52:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:42.101 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d260fabb-b595-4411-92db-b47a732060f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:52:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:42.102 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5720a484-671e-4a17-b479-35da2881524b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:52:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:42.108 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:42.110 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:42.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.588464) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162588508, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1126, "num_deletes": 253, "total_data_size": 1367012, "memory_usage": 1395672, "flush_reason": "Manual Compaction"}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162595542, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 646394, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17058, "largest_seqno": 18179, "table_properties": {"data_size": 642460, "index_size": 1597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10892, "raw_average_key_size": 21, "raw_value_size": 633788, "raw_average_value_size": 1245, "num_data_blocks": 71, "num_entries": 509, "num_filter_entries": 509, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581095, "oldest_key_time": 1771581095, "file_creation_time": 1771581162, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 7127 microseconds, and 3400 cpu microseconds.
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.595591) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 646394 bytes OK
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.595615) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.597672) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.597697) EVENT_LOG_v1 {"time_micros": 1771581162597690, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.597717) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 1361433, prev total WAL file size 1361757, number of live WAL files 2.
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.599669) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373536' seq:72057594037927935, type:22 .. '6D6772737461740034303037' seq:0, type:0; will stop at (end)
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(631KB)], [21(18MB)]
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162599716, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 19536903, "oldest_snapshot_seqno": -1}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12299 keys, 17598600 bytes, temperature: kUnknown
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162681755, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 17598600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17530187, "index_size": 36568, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30789, "raw_key_size": 328436, "raw_average_key_size": 26, "raw_value_size": 17322493, "raw_average_value_size": 1408, "num_data_blocks": 1390, "num_entries": 12299, "num_filter_entries": 12299, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581162, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.682027) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 17598600 bytes
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.683693) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 237.9 rd, 214.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 18.0 +0.0 blob) out(16.8 +0.0 blob), read-write-amplify(57.5) write-amplify(27.2) OK, records in: 12798, records dropped: 499 output_compression: NoCompression
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.683715) EVENT_LOG_v1 {"time_micros": 1771581162683706, "job": 10, "event": "compaction_finished", "compaction_time_micros": 82134, "compaction_time_cpu_micros": 46366, "output_level": 6, "num_output_files": 1, "total_output_size": 17598600, "num_input_records": 12798, "num_output_records": 12299, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162683879, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581162685719, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.599564) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:52:42 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:52:42.685750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:52:42 np0005625204.localdomain podman[310451]: 
Feb 20 09:52:42 np0005625204.localdomain podman[310451]: 2026-02-20 09:52:42.71761479 +0000 UTC m=+0.086050621 container create b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:52:42 np0005625204.localdomain systemd[1]: Started libpod-conmon-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf.scope.
Feb 20 09:52:42 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:52:42 np0005625204.localdomain podman[310451]: 2026-02-20 09:52:42.675456701 +0000 UTC m=+0.043892572 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:52:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e326f6f6e12d73c52b76eeb8e2267cd2522e4c76d841349323c3f224f65b99/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:52:42 np0005625204.localdomain podman[310451]: 2026-02-20 09:52:42.786360023 +0000 UTC m=+0.154795864 container init b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 20 09:52:42 np0005625204.localdomain podman[310451]: 2026-02-20 09:52:42.796098413 +0000 UTC m=+0.164534254 container start b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:42 np0005625204.localdomain dnsmasq[310469]: started, version 2.85 cachesize 150
Feb 20 09:52:42 np0005625204.localdomain dnsmasq[310469]: DNS service limited to local subnets
Feb 20 09:52:42 np0005625204.localdomain dnsmasq[310469]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:52:42 np0005625204.localdomain dnsmasq[310469]: warning: no upstream servers configured
Feb 20 09:52:42 np0005625204.localdomain dnsmasq-dhcp[310469]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:52:42 np0005625204.localdomain dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 0 addresses
Feb 20 09:52:42 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host
Feb 20 09:52:42 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts
Feb 20 09:52:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:42.892 264355 INFO neutron.agent.dhcp.agent [None req-539f1d93-9842-489c-8a64-1b88aa6e049f - - - - - -] DHCP configuration for ports {'8089ee58-1d9f-439a-ac4b-21c4ed035ba8'} is completed
Feb 20 09:52:43 np0005625204.localdomain ceph-mon[301857]: pgmap v150: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 8.8 KiB/s rd, 341 B/s wr, 12 op/s
Feb 20 09:52:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:44.245 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:44 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:44Z|00146|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:45.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:45 np0005625204.localdomain ceph-mon[301857]: pgmap v151: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:52:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e110 e110: 6 total, 6 up, 6 in
Feb 20 09:52:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:45.597 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:45 np0005625204.localdomain systemd[1]: tmp-crun.txsIH4.mount: Deactivated successfully.
Feb 20 09:52:45 np0005625204.localdomain dnsmasq[310152]: exiting on receipt of SIGTERM
Feb 20 09:52:45 np0005625204.localdomain podman[310485]: 2026-02-20 09:52:45.651488702 +0000 UTC m=+0.072347176 container kill ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:45 np0005625204.localdomain systemd[1]: libpod-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331.scope: Deactivated successfully.
Feb 20 09:52:45 np0005625204.localdomain podman[310498]: 2026-02-20 09:52:45.719392312 +0000 UTC m=+0.056835056 container died ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:52:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:45 np0005625204.localdomain podman[310498]: 2026-02-20 09:52:45.753140517 +0000 UTC m=+0.090583261 container cleanup ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:52:45 np0005625204.localdomain systemd[1]: libpod-conmon-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331.scope: Deactivated successfully.
Feb 20 09:52:45 np0005625204.localdomain podman[310505]: 2026-02-20 09:52:45.796325746 +0000 UTC m=+0.122251150 container remove ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d260fabb-b595-4411-92db-b47a732060f6, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:52:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.033 264355 INFO neutron.agent.dhcp.agent [None req-33bf7414-f555-433c-8361-85d7e8823947 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:46 np0005625204.localdomain ceph-mon[301857]: osdmap e110: 6 total, 6 up, 6 in
Feb 20 09:52:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.174 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:46.497 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-92bd17572daf82e995501baa24d56fe0cc42bf636ed181dc5240c8cecc9c8bc8-merged.mount: Deactivated successfully.
Feb 20 09:52:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ca1bbc8d0969d571201a54d5dae485a103af622d7b9d7a471d90ddb82bb0f331-userdata-shm.mount: Deactivated successfully.
Feb 20 09:52:46 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dd260fabb\x2db595\x2d4411\x2d92db\x2db47a732060f6.mount: Deactivated successfully.
Feb 20 09:52:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.865 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:46Z, description=, device_id=d472f0b4-01df-4346-9239-5246395c8051, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a625b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a622e0>], id=ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa, ip_allocation=immediate, mac_address=fa:16:3e:9a:69:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:39Z, description=, dns_domain=, id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1575908493-network, port_security_enabled=True, project_id=2039424f830e4ef5aa461223cac1ffd5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39792, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=894, status=ACTIVE, subnets=['ddb79cd2-81f4-40d9-9ce0-203e7af8c023'], tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:39Z, vlan_transparent=None, network_id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, port_security_enabled=False, project_id=2039424f830e4ef5aa461223cac1ffd5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=953, status=DOWN, tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:46Z on network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9
Feb 20 09:52:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:46.997 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:47 np0005625204.localdomain systemd[1]: tmp-crun.X8E8QB.mount: Deactivated successfully.
Feb 20 09:52:47 np0005625204.localdomain dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 1 addresses
Feb 20 09:52:47 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host
Feb 20 09:52:47 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts
Feb 20 09:52:47 np0005625204.localdomain podman[310544]: 2026-02-20 09:52:47.09839958 +0000 UTC m=+0.076254395 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:52:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e111 e111: 6 total, 6 up, 6 in
Feb 20 09:52:47 np0005625204.localdomain ceph-mon[301857]: pgmap v153: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 3.8 KiB/s rd, 307 B/s wr, 5 op/s
Feb 20 09:52:47 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:47.375 264355 INFO neutron.agent.dhcp.agent [None req-bdd6a5c4-2441-4f1e-9251-d014d8172b09 - - - - - -] DHCP configuration for ports {'ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa'} is completed
Feb 20 09:52:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:47.622 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:52:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:52:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:52:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157080 "" "Go-http-client/1.1"
Feb 20 09:52:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:52:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18804 "" "Go-http-client/1.1"
Feb 20 09:52:48 np0005625204.localdomain ceph-mon[301857]: osdmap e111: 6 total, 6 up, 6 in
Feb 20 09:52:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:52:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:49.092 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:52:46Z, description=, device_id=d472f0b4-01df-4346-9239-5246395c8051, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59f9790>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df626c130>], id=ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa, ip_allocation=immediate, mac_address=fa:16:3e:9a:69:22, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:52:39Z, description=, dns_domain=, id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroup264TestJSON-1575908493-network, port_security_enabled=True, project_id=2039424f830e4ef5aa461223cac1ffd5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39792, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=894, status=ACTIVE, subnets=['ddb79cd2-81f4-40d9-9ce0-203e7af8c023'], tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:39Z, vlan_transparent=None, network_id=76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, port_security_enabled=False, project_id=2039424f830e4ef5aa461223cac1ffd5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=953, status=DOWN, tags=[], tenant_id=2039424f830e4ef5aa461223cac1ffd5, updated_at=2026-02-20T09:52:46Z on network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9
Feb 20 09:52:49 np0005625204.localdomain ceph-mon[301857]: pgmap v155: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 4.7 KiB/s rd, 383 B/s wr, 6 op/s
Feb 20 09:52:49 np0005625204.localdomain systemd[1]: tmp-crun.MoB5l7.mount: Deactivated successfully.
Feb 20 09:52:49 np0005625204.localdomain podman[310564]: 2026-02-20 09:52:49.170364201 +0000 UTC m=+0.104768350 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:52:49 np0005625204.localdomain podman[310564]: 2026-02-20 09:52:49.18318424 +0000 UTC m=+0.117588369 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:52:49 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:52:49 np0005625204.localdomain dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 1 addresses
Feb 20 09:52:49 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host
Feb 20 09:52:49 np0005625204.localdomain podman[310606]: 2026-02-20 09:52:49.315729031 +0000 UTC m=+0.061234029 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:52:49 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts
Feb 20 09:52:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:49.622 264355 INFO neutron.agent.dhcp.agent [None req-95e51d7c-eec2-451f-bfd2-388d45597fda - - - - - -] DHCP configuration for ports {'ffc0e5b5-3d48-4ba4-b97e-817d2cfecbaa'} is completed
Feb 20 09:52:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:49.925 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:50.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:51 np0005625204.localdomain ceph-mon[301857]: pgmap v156: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.9 KiB/s wr, 20 op/s
Feb 20 09:52:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:51.524 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/85473296' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 e112: 6 total, 6 up, 6 in
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.747 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:52:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:52.747 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:52:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:52:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2540140338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:53 np0005625204.localdomain ceph-mon[301857]: pgmap v157: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 4.1 KiB/s wr, 49 op/s
Feb 20 09:52:53 np0005625204.localdomain ceph-mon[301857]: osdmap e112: 6 total, 6 up, 6 in
Feb 20 09:52:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3931567334' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2540140338' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.207 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.274 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.495 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.497 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11366MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.497 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.498 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.602 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.603 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:52:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:53.638 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:52:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:52:54 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2299012886' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.154 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.158 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.164 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.178 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.181 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.181 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:52:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2299012886' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:52:54 np0005625204.localdomain dnsmasq[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/addn_hosts - 0 addresses
Feb 20 09:52:54 np0005625204.localdomain podman[310687]: 2026-02-20 09:52:54.227921021 +0000 UTC m=+0.044613124 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:52:54 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/host
Feb 20 09:52:54 np0005625204.localdomain dnsmasq-dhcp[310469]: read /var/lib/neutron/dhcp/76a56eb0-2df1-4460-a42f-3d7d5c92bfc9/opts
Feb 20 09:52:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:52:54 np0005625204.localdomain systemd[1]: tmp-crun.2MMR78.mount: Deactivated successfully.
Feb 20 09:52:54 np0005625204.localdomain systemd[1]: tmp-crun.hSJIzO.mount: Deactivated successfully.
Feb 20 09:52:54 np0005625204.localdomain podman[310700]: 2026-02-20 09:52:54.312038463 +0000 UTC m=+0.069935372 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:52:54 np0005625204.localdomain podman[310700]: 2026-02-20 09:52:54.320538391 +0000 UTC m=+0.078435290 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:52:54 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:52:54 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:54Z|00147|binding|INFO|Releasing lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab from this chassis (sb_readonly=0)
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.407 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:54 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:54Z|00148|binding|INFO|Setting lport e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab down in Southbound
Feb 20 09:52:54 np0005625204.localdomain kernel: device tape04a3d27-d2 left promiscuous mode
Feb 20 09:52:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:54.415 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2039424f830e4ef5aa461223cac1ffd5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=32672433-7024-441f-825a-7707135603bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:52:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:54.416 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e04a3d27-d2c5-430c-ae0a-6f9f0b1045ab in datapath 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9 unbound from our chassis
Feb 20 09:52:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:54.418 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:52:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:54.419 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[685bc5f8-ea40-4ad1-8024-976a766add66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:52:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:54.436 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:55 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:55.210 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:55 np0005625204.localdomain ceph-mon[301857]: pgmap v159: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 3.7 KiB/s wr, 43 op/s
Feb 20 09:52:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:55.671 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:52:56 np0005625204.localdomain sshd[310733]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:52:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:56.045 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:52:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:56.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:52:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:52:56.048 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:52:56 np0005625204.localdomain podman[310734]: 2026-02-20 09:52:56.144631262 +0000 UTC m=+0.084186675 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, version=9.7, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 09:52:56 np0005625204.localdomain podman[310734]: 2026-02-20 09:52:56.158453641 +0000 UTC m=+0.098009064 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7)
Feb 20 09:52:56 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:52:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:56.306 2 INFO neutron.agent.securitygroups_rpc [None req-35b4089d-d96b-4223-91a7-29363be26031 9b5edcaf5d0f48eea2ef440e3b3c2f79 85741ccf160049968710bbf0d3ed7a21 - - default default] Security group member updated ['1f0747df-ad50-4106-9a56-f1b68b2201c8']
Feb 20 09:52:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:56Z|00149|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:56.387 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:52:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:52:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:52:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:52:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:52:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:52:57 np0005625204.localdomain sshd[310733]: Invalid user oracle from 86.99.116.54 port 39228
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.182 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.184 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.185 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.185 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:57 np0005625204.localdomain ceph-mon[301857]: pgmap v160: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 3.6 KiB/s wr, 41 op/s
Feb 20 09:52:57 np0005625204.localdomain sshd[310733]: Received disconnect from 86.99.116.54 port 39228:11: Bye Bye [preauth]
Feb 20 09:52:57 np0005625204.localdomain sshd[310733]: Disconnected from invalid user oracle 86.99.116.54 port 39228 [preauth]
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.628 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:57 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:57Z|00150|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.687 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:57.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:58Z|00151|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.070 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:52:58Z|00152|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.341 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:52:58 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:52:58.354 2 INFO neutron.agent.securitygroups_rpc [None req-3bf8b391-96d6-4728-ae92-83d8f7b4ba3a 9b5edcaf5d0f48eea2ef440e3b3c2f79 85741ccf160049968710bbf0d3ed7a21 - - default default] Security group member updated ['1f0747df-ad50-4106-9a56-f1b68b2201c8']
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:52:58 np0005625204.localdomain dnsmasq[310469]: exiting on receipt of SIGTERM
Feb 20 09:52:58 np0005625204.localdomain podman[310774]: 2026-02-20 09:52:58.801168819 +0000 UTC m=+0.060013312 container kill b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:52:58 np0005625204.localdomain systemd[1]: libpod-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf.scope: Deactivated successfully.
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.809 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:52:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:58.810 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:52:58 np0005625204.localdomain podman[310789]: 2026-02-20 09:52:58.870107831 +0000 UTC m=+0.049159303 container died b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:52:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf-userdata-shm.mount: Deactivated successfully.
Feb 20 09:52:58 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-46e326f6f6e12d73c52b76eeb8e2267cd2522e4c76d841349323c3f224f65b99-merged.mount: Deactivated successfully.
Feb 20 09:52:58 np0005625204.localdomain podman[310789]: 2026-02-20 09:52:58.918030894 +0000 UTC m=+0.097082316 container remove b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-76a56eb0-2df1-4460-a42f-3d7d5c92bfc9, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:52:58 np0005625204.localdomain systemd[1]: libpod-conmon-b2f9a2eecb7321741cf9d8213e5fc94aa6bf501a8e0730cf1b7aa115e38949bf.scope: Deactivated successfully.
Feb 20 09:52:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:58.963 264355 INFO neutron.agent.dhcp.agent [None req-f9b829a4-b8c5-467b-8847-f002c25f37b2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:58 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d76a56eb0\x2d2df1\x2d4460\x2da42f\x2d3d7d5c92bfc9.mount: Deactivated successfully.
Feb 20 09:52:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:52:59.147 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:52:59 np0005625204.localdomain ceph-mon[301857]: pgmap v161: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 3.0 KiB/s wr, 34 op/s
Feb 20 09:52:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:59.432 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:52:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:59.464 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:52:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:52:59.465 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:53:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4201398583' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:00.709 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:00.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:00.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:53:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:53:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:53:01 np0005625204.localdomain podman[310815]: 2026-02-20 09:53:01.159685853 +0000 UTC m=+0.085570426 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:53:01 np0005625204.localdomain podman[310815]: 2026-02-20 09:53:01.199071358 +0000 UTC m=+0.124955891 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:53:01 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:53:01 np0005625204.localdomain podman[310814]: 2026-02-20 09:53:01.211011881 +0000 UTC m=+0.143634379 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:53:01 np0005625204.localdomain podman[310814]: 2026-02-20 09:53:01.246410115 +0000 UTC m=+0.179032623 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_controller)
Feb 20 09:53:01 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:53:01 np0005625204.localdomain ceph-mon[301857]: pgmap v162: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1023 B/s wr, 23 op/s
Feb 20 09:53:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3538498167' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4284690376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4284690376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:53:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:02.187 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: pgmap v163: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4284690376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:53:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4284690376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:53:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:02.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:04 np0005625204.localdomain ceph-mon[301857]: pgmap v164: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:05.051 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:53:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:05.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:06 np0005625204.localdomain sshd[310855]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:06.016 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:53:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:53:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:53:06 np0005625204.localdomain ceph-mon[301857]: pgmap v165: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:06 np0005625204.localdomain sshd[310857]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:07 np0005625204.localdomain sshd[310857]: Received disconnect from 18.221.252.160 port 41602:11: Bye Bye [preauth]
Feb 20 09:53:07 np0005625204.localdomain sshd[310857]: Disconnected from authenticating user root 18.221.252.160 port 41602 [preauth]
Feb 20 09:53:07 np0005625204.localdomain sshd[310855]: Invalid user sol from 45.148.10.240 port 54686
Feb 20 09:53:07 np0005625204.localdomain sshd[310855]: Connection closed by invalid user sol 45.148.10.240 port 54686 [preauth]
Feb 20 09:53:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:07.679 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:08 np0005625204.localdomain ceph-mon[301857]: pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:10 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:53:10 np0005625204.localdomain podman[310859]: 2026-02-20 09:53:10.156723403 +0000 UTC m=+0.094751176 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:53:10 np0005625204.localdomain podman[310859]: 2026-02-20 09:53:10.172929945 +0000 UTC m=+0.110957728 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:53:10 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:53:10 np0005625204.localdomain ceph-mon[301857]: pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:10.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:11 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:11.447 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:12 np0005625204.localdomain sshd[310878]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:12 np0005625204.localdomain sshd[310878]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:53:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:12.683 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:12 np0005625204.localdomain ceph-mon[301857]: pgmap v168: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:14.368 2 INFO neutron.agent.securitygroups_rpc [None req-8db2cda0-f70c-405a-8e32-bbd09e8f7101 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']
Feb 20 09:53:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:14.445 2 INFO neutron.agent.securitygroups_rpc [None req-8db2cda0-f70c-405a-8e32-bbd09e8f7101 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']
Feb 20 09:53:14 np0005625204.localdomain ceph-mon[301857]: pgmap v169: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:14.933 2 INFO neutron.agent.securitygroups_rpc [None req-ea37899b-0895-4039-936c-a92dc4af71cc 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']
Feb 20 09:53:15 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:15.277 2 INFO neutron.agent.securitygroups_rpc [None req-6181ff08-47fa-4ccb-88a6-fd4810762b1a 253c13edd6b940a9b5cd64cc7bfa0ff5 bae77758d77d4d43af7ac10744892742 - - default default] Security group member updated ['f7e09389-cc97-415e-b643-e34abd10c9b7']
Feb 20 09:53:15 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:15.313 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:15.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:16 np0005625204.localdomain sshd[310880]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:16 np0005625204.localdomain ceph-mon[301857]: pgmap v170: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:16 np0005625204.localdomain sshd[310880]: Received disconnect from 154.91.170.41 port 43918:11: Bye Bye [preauth]
Feb 20 09:53:16 np0005625204.localdomain sshd[310880]: Disconnected from authenticating user root 154.91.170.41 port 43918 [preauth]
Feb 20 09:53:17 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:17Z|00153|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:53:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:17.460 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:17.685 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:53:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:53:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:53:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:53:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:53:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18338 "" "Go-http-client/1.1"
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.209 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.209 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.209 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.215 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85cddcab-e1ba-4038-8de8-3a88f2ce8995', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.210353', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fb9be9be-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'ec8bf3898737f735819f5db2fffcd7b1cef436b8da5111d5343b5388f0163bb2'}]}, 'timestamp': '2026-02-20 09:53:18.216140', '_unique_id': 'f9220cc9dcaf4bedbb201fd4203e388f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.217 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.219 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ea0b5f5-d6e2-440d-95b8-2a9f6776a2db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.219165', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fb9c78ca-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '0dedeb6f160609512b7914c61c4300fe74db374d331378eca39b439140ac4835'}]}, 'timestamp': '2026-02-20 09:53:18.219840', '_unique_id': '0bc49c2bea0543ccb83ee6f31a2a1bc2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.220 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.254 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.255 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c2c45bc-ed93-4641-b579-1294869be61a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.222152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fba1e364-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '1bf9f50a05e1746c237d63c3c69a328e5d366319834db50c5dddfebcb68ae0a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.222152', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fba1f8f4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'e42fb933d7779b7147f878dea45cb254bee717a0bd6b22f4d1a15f3263af9bd2'}]}, 'timestamp': '2026-02-20 09:53:18.255857', '_unique_id': '7f53ff5bef6246d4b5c3c730185fbfb7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.257 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.258 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 17020000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dd8b2d02-e4c4-495a-b2b9-55f5435517c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17020000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:53:18.258530', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'fba4f946-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.514018839, 'message_signature': 'df1ef83cc1684faccd8605f907629be6c001aad8c4a4fa9e8923099591f95fbd'}]}, 'timestamp': '2026-02-20 09:53:18.275443', '_unique_id': 'dbb20c1906a14f3b827193e07031ba90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.277 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.277 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ef533af-af0f-4e25-8cf8-5e338f170601', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:53:18.277874', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'fba56aac-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.514018839, 'message_signature': 'b721388681561c72bfe9ba88155dad8c74a2e075fe40ff63210e38aad8b6811d'}]}, 'timestamp': '2026-02-20 09:53:18.278336', '_unique_id': 'e931144974234ea29000b2386dcb38c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.279 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.280 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.280 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4fa010f-b7c4-4225-b2ae-b615a33f470d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.280705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fba5db36-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '73537861ef1090a9df195dee265a1250c6f189ac533046a907ec4ebd8897575b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.280705', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fba5ec3e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'c0db6ae4c5e421cd4ff187a2d9ad3698cccd156c274aed84d9145e45c5851e77'}]}, 'timestamp': '2026-02-20 09:53:18.281667', '_unique_id': 'f22991d8cdaf47868cc7c68bf4f3a3c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.284 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd307503f-3c91-4ff8-b50e-f969d51c6198', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.283990', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fba659e4-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'f2b2447a9e91c2a71defcb61139c7fe11f93a740d93b7824c93483dbcc9479f9'}]}, 'timestamp': '2026-02-20 09:53:18.284473', '_unique_id': '558f5cb1a1a7448b9cf75cf5bfed0817'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.285 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.287 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5268b6d8-b743-4a71-9f8b-20d21884d5c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.287083', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fba6d266-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'b6520196223df5e0f543ef08f6afad0ead823ae41cbbdb6fabc4aa220ebef70d'}]}, 'timestamp': '2026-02-20 09:53:18.287561', '_unique_id': '9d1816720a47461cb95ec87612a97fe7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.288 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.289 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.289 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '68599abd-d603-465d-8431-7afccb8e1a4b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.289787', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fba73bde-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'a30b2b0f901c1c65584a8e4d93445e2080a03dedf16a9e84b2b83987400e23fe'}]}, 'timestamp': '2026-02-20 09:53:18.290255', '_unique_id': 'a8b1c1b12545456694aa54158c324b11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.291 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bbb12f8a-4e29-4391-bba0-be94fe59193d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.292512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fba9730e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '02b1cdbee1e4ea7777f95c66368d3cfdc7b6152cd49bbb6cb939cada2bad08c6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.292512', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fba98e2a-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '91fd42f9192411ae6fba0e9dbef4dae4b52a8c53d57e7a1220a62ee802c98e30'}]}, 'timestamp': '2026-02-20 09:53:18.305561', '_unique_id': '629cfc0e1f694b999d0c83f614321673'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.308 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.309 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b83e493a-a202-46bc-a838-5e3f73f39da8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.309118', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbaa338e-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '78a8f7da5b12390c8e030950bf4b112c70cc55a3d63ee2cffc76fcaaa83ca456'}]}, 'timestamp': '2026-02-20 09:53:18.309854', '_unique_id': '01e859984deb4cdfa5e4e963c1ecaa95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.311 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38a52775-8893-4a95-961e-af422b28c926', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.312965', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbaac95c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '77ff1392e57a5ac9d529cfcd7977f27696b89c8ab970853976924796aae37889'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.312965', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbaae2de-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': 'cb286e6a78437a62746219f0c8e5b121f0b09d1d96ad5630b4af8e1cda189f66'}]}, 'timestamp': '2026-02-20 09:53:18.314274', '_unique_id': '01a55539d2804ace84ae5fff71aacd4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.317 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dcd4fa4-3c24-4026-a756-14ce0d02655c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.317620', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbab81c6-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '1475d9c436cc2d054b481cc40e6645bd6ec8f541e3078616d314a6df1535537a'}]}, 'timestamp': '2026-02-20 09:53:18.318370', '_unique_id': 'b1e2c5016071465bb8c66f8a7fa9fd1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.319 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.322 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7349302d-2bce-4b7a-bb34-d02612d57054', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.321482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbac17a8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '0e742abaf9f7d3bdd4dd13df30bbbf7770bc1a71ba3a3c38bc535a55f82bb927'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.321482', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbac2ffe-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '257e2a370ced78e7b31eef770ca3c83c10f8add5c313cab02d1c418e4cc12d1f'}]}, 'timestamp': '2026-02-20 09:53:18.322812', '_unique_id': '1bcea503a3834a948383da9c0576dc68'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd92dafe8-51cb-47a0-aa11-faae5e7ba786', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.325571', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbacb6d6-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'e634c65c5c6d84667c738ca6939bf2ac41663e55d5c11dcfe954649b787bff3d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.325571', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbaccdba-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '00b82c3e770a946cf6482421293c9ee40c41c7a34d19d8cb6148f554fb550c67'}]}, 'timestamp': '2026-02-20 09:53:18.326879', '_unique_id': '9b41e0686aa94f6494818509c2ad36c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc3ffec8-432e-44a6-b28f-aae7d213a014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.329307', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbad41fa-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'b5bdd9ac208a5e9e32641e2e0922892b52ac5300f96b2ed7ca22e6d8c58e6558'}]}, 'timestamp': '2026-02-20 09:53:18.329712', '_unique_id': 'e4e93e7b88fc41f79a1e03b08035d698'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3852c963-af1f-4647-8dd0-4370211f7714', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.331519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbad98ee-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '77102fccf1a9ee2ea964e690ee698c1f53dfaf5da22173d7796aa7b6a5651b90'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.331519', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbada7f8-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '3f2796e8765632aa5ee0d8e3ef8176e606bed23d4b4c8a3875b357bdac4f65e5'}]}, 'timestamp': '2026-02-20 09:53:18.332293', '_unique_id': '99512252946d46829ae6f1ccb6c4befd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.333 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.334 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.334 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '414acb6d-44d8-411d-b824-ae94bed388ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.334213', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbae070c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': 'd8248b47590ac660e28b6345ce9ec6007b678c03ee3d60b12232df571e3367ca'}]}, 'timestamp': '2026-02-20 09:53:18.334768', '_unique_id': '43c305aca04c49358f994a2fcc7e584a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.335 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.336 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.336 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.336 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd47adfac-9b9e-4e9e-a759-817126384ebe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.336328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbae53ec-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': 'b245e35a6facc9bc943024e6a6b8d5877997d459152435736c6056321de71a1f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.336328', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbae5f40-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.461376872, 'message_signature': '5f4f25f9788d274201f0deb104015658512facd7a232eaa2279f602ec6918beb'}]}, 'timestamp': '2026-02-20 09:53:18.336928', '_unique_id': '386b57fead55499a82356e464939fecc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.337 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.338 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.338 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9360518f-f609-4a5d-a2df-31059eb52a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:53:18.338269', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'fbae9e92-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.449578504, 'message_signature': '4eab9125a22dabf1173bf4a27feeab46384bc1fc01e7477eed28623dde72b665'}]}, 'timestamp': '2026-02-20 09:53:18.338576', '_unique_id': 'bb7e0563df5e4f68bf3d149787426264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.339 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.340 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c22ad54-6342-43b1-bed6-538b47cbfece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:53:18.339940', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'fbaedf4c-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '71095806b1157a0a0e36f87f03a8530def284fe994a9f0c8db2e997fc43a37c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:53:18.339940', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'fbaee8fc-0e41-11f1-9294-fa163ef029e2', 'monotonic_time': 11757.531733097, 'message_signature': '3ec6a8390e73d67dffd5868fba5dce4fbb4b4008dc9d3523e4858b9eab818d11'}]}, 'timestamp': '2026-02-20 09:53:18.340453', '_unique_id': '1effd05e8ddd4b339f1a2abef4e9a899'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:53:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:53:18.341 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:53:18 np0005625204.localdomain ceph-mon[301857]: pgmap v171: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:20 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:53:20 np0005625204.localdomain systemd[1]: tmp-crun.xDbH2D.mount: Deactivated successfully.
Feb 20 09:53:20 np0005625204.localdomain podman[310882]: 2026-02-20 09:53:20.153393612 +0000 UTC m=+0.090496896 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:53:20 np0005625204.localdomain podman[310882]: 2026-02-20 09:53:20.16615841 +0000 UTC m=+0.103261634 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:53:20 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:53:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:20 np0005625204.localdomain ceph-mon[301857]: pgmap v172: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:20.874 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.619402) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202619444, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 752, "num_deletes": 256, "total_data_size": 682253, "memory_usage": 696848, "flush_reason": "Manual Compaction"}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202625012, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 444548, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18184, "largest_seqno": 18931, "table_properties": {"data_size": 441222, "index_size": 1181, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7843, "raw_average_key_size": 18, "raw_value_size": 434335, "raw_average_value_size": 1049, "num_data_blocks": 52, "num_entries": 414, "num_filter_entries": 414, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581162, "oldest_key_time": 1771581162, "file_creation_time": 1771581202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 5662 microseconds, and 2083 cpu microseconds.
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.625065) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 444548 bytes OK
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.625088) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.626858) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.626881) EVENT_LOG_v1 {"time_micros": 1771581202626875, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.626902) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 678233, prev total WAL file size 678233, number of live WAL files 2.
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.627563) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373731' seq:72057594037927935, type:22 .. '6C6F676D0034303232' seq:0, type:0; will stop at (end)
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(434KB)], [24(16MB)]
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202628242, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 18043148, "oldest_snapshot_seqno": -1}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12184 keys, 17901858 bytes, temperature: kUnknown
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202716036, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 17901858, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17833220, "index_size": 37113, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30469, "raw_key_size": 327028, "raw_average_key_size": 26, "raw_value_size": 17626568, "raw_average_value_size": 1446, "num_data_blocks": 1409, "num_entries": 12184, "num_filter_entries": 12184, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581202, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.716386) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 17901858 bytes
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.718049) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 205.5 rd, 203.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 16.8 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(80.9) write-amplify(40.3) OK, records in: 12713, records dropped: 529 output_compression: NoCompression
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.718082) EVENT_LOG_v1 {"time_micros": 1771581202718067, "job": 12, "event": "compaction_finished", "compaction_time_micros": 87800, "compaction_time_cpu_micros": 50830, "output_level": 6, "num_output_files": 1, "total_output_size": 17901858, "num_input_records": 12713, "num_output_records": 12184, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202718307, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581202720678, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.627485) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720756) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720763) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:53:22.720766) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:53:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:22.730 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:22 np0005625204.localdomain ceph-mon[301857]: pgmap v173: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:23 np0005625204.localdomain sshd[310906]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:24 np0005625204.localdomain sshd[310906]: Invalid user peng from 182.93.7.194 port 62646
Feb 20 09:53:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:53:24 np0005625204.localdomain podman[310908]: 2026-02-20 09:53:24.53282162 +0000 UTC m=+0.069846820 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:53:24 np0005625204.localdomain podman[310908]: 2026-02-20 09:53:24.541947537 +0000 UTC m=+0.078972747 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:53:24 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:53:24 np0005625204.localdomain sshd[310906]: Received disconnect from 182.93.7.194 port 62646:11: Bye Bye [preauth]
Feb 20 09:53:24 np0005625204.localdomain sshd[310906]: Disconnected from invalid user peng 182.93.7.194 port 62646 [preauth]
Feb 20 09:53:24 np0005625204.localdomain ceph-mon[301857]: pgmap v174: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:25.029 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:25 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:25.074 2 INFO neutron.agent.securitygroups_rpc [None req-9d454723-199e-4c87-997c-435a75780787 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:25.324 264355 INFO neutron.agent.linux.ip_lib [None req-13bbaa62-f04c-4d9b-baf6-53ade5713af0 - - - - - -] Device tapf687f8e2-05 cannot be used as it has no MAC address
Feb 20 09:53:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:25.391 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:25 np0005625204.localdomain kernel: device tapf687f8e2-05 entered promiscuous mode
Feb 20 09:53:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:25.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:25 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581205.4026] manager: (tapf687f8e2-05): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Feb 20 09:53:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:25Z|00154|binding|INFO|Claiming lport f687f8e2-05bc-41c6-b5b4-d21133776b71 for this chassis.
Feb 20 09:53:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:25Z|00155|binding|INFO|f687f8e2-05bc-41c6-b5b4-d21133776b71: Claiming unknown
Feb 20 09:53:25 np0005625204.localdomain systemd-udevd[310942]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:53:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:25.416 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5a2540adf694dd98037b7689be10187', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dffa584b-3b39-44ca-bfd8-0760f34a6a59, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f687f8e2-05bc-41c6-b5b4-d21133776b71) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:53:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:25.419 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f687f8e2-05bc-41c6-b5b4-d21133776b71 in datapath a25517b2-5049-4a57-ad98-549dad6f59bf bound to our chassis
Feb 20 09:53:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:25.422 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 7e882f4a-c254-4aea-aa42-5ef07e3c29fe IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:53:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:25.422 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a25517b2-5049-4a57-ad98-549dad6f59bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:53:25 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:25.424 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ab932b0c-eac9-4e06-82d4-835fcd089d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:53:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:25Z|00156|binding|INFO|Setting lport f687f8e2-05bc-41c6-b5b4-d21133776b71 ovn-installed in OVS
Feb 20 09:53:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:25Z|00157|binding|INFO|Setting lport f687f8e2-05bc-41c6-b5b4-d21133776b71 up in Southbound
Feb 20 09:53:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:25.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:25.499 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:25.532 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:25 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:25.544 2 INFO neutron.agent.securitygroups_rpc [None req-1234b8d9-654d-451e-95cb-316b1fc4ede0 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:25.876 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:26 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:26.315 2 INFO neutron.agent.securitygroups_rpc [None req-2cdd2daf-d30d-4deb-a790-e995ba310f91 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:26 np0005625204.localdomain podman[310997]: 
Feb 20 09:53:26 np0005625204.localdomain podman[310997]: 2026-02-20 09:53:26.358142509 +0000 UTC m=+0.088521276 container create e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 20 09:53:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:53:26 np0005625204.localdomain systemd[1]: Started libpod-conmon-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981.scope.
Feb 20 09:53:26 np0005625204.localdomain podman[310997]: 2026-02-20 09:53:26.316199696 +0000 UTC m=+0.046578513 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:53:26 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:53:26 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8f122d768a0c33de3c5f1477629b4a40cd49a429ab5028fb1c39b3e5c6badc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:53:26 np0005625204.localdomain podman[310997]: 2026-02-20 09:53:26.438766865 +0000 UTC m=+0.169145632 container init e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:53:26 np0005625204.localdomain dnsmasq[311027]: started, version 2.85 cachesize 150
Feb 20 09:53:26 np0005625204.localdomain dnsmasq[311027]: DNS service limited to local subnets
Feb 20 09:53:26 np0005625204.localdomain dnsmasq[311027]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:53:26 np0005625204.localdomain dnsmasq[311027]: warning: no upstream servers configured
Feb 20 09:53:26 np0005625204.localdomain dnsmasq-dhcp[311027]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:53:26 np0005625204.localdomain dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 0 addresses
Feb 20 09:53:26 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host
Feb 20 09:53:26 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts
Feb 20 09:53:26 np0005625204.localdomain podman[311012]: 2026-02-20 09:53:26.4850933 +0000 UTC m=+0.084735241 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, config_id=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, distribution-scope=public, version=9.7)
Feb 20 09:53:26 np0005625204.localdomain podman[310997]: 2026-02-20 09:53:26.505992605 +0000 UTC m=+0.236371322 container start e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 20 09:53:26 np0005625204.localdomain podman[311012]: 2026-02-20 09:53:26.526240509 +0000 UTC m=+0.125882510 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7)
Feb 20 09:53:26 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:53:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:53:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:53:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:53:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:53:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:53:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:53:26 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:26.718 264355 INFO neutron.agent.dhcp.agent [None req-43377fec-b530-4ba7-bacd-9799b71bae15 - - - - - -] DHCP configuration for ports {'533631ef-8ab5-4d0a-a021-5d8f4521578b'} is completed
Feb 20 09:53:26 np0005625204.localdomain ceph-mon[301857]: pgmap v175: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:27.236 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:27 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:27.302 2 INFO neutron.agent.securitygroups_rpc [None req-170d638f-d647-4f80-a7a3-f133bd9dbf7c 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:27 np0005625204.localdomain systemd[1]: tmp-crun.Ua84ZA.mount: Deactivated successfully.
Feb 20 09:53:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:27.762 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:28.422 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:53:28Z, description=, device_id=c60b906a-b861-42f1-98c4-c7541cbc3cf5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5aaa730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df6300970>], id=70f9ae8f-4400-4da9-92b5-3befab79f396, ip_allocation=immediate, mac_address=fa:16:3e:33:28:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:53:23Z, description=, dns_domain=, id=a25517b2-5049-4a57-ad98-549dad6f59bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-1446003171-network, port_security_enabled=True, project_id=c5a2540adf694dd98037b7689be10187, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12102, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1222, status=ACTIVE, subnets=['a7e1a7af-4593-4a29-a16f-95ab101a15e7'], tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:23Z, vlan_transparent=None, network_id=a25517b2-5049-4a57-ad98-549dad6f59bf, port_security_enabled=False, project_id=c5a2540adf694dd98037b7689be10187, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1258, status=DOWN, tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:28Z on network a25517b2-5049-4a57-ad98-549dad6f59bf
Feb 20 09:53:28 np0005625204.localdomain podman[311054]: 2026-02-20 09:53:28.648540839 +0000 UTC m=+0.062962612 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:53:28 np0005625204.localdomain dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 1 addresses
Feb 20 09:53:28 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host
Feb 20 09:53:28 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts
Feb 20 09:53:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:28.867 264355 INFO neutron.agent.dhcp.agent [None req-8949ddb7-38c0-4438-b691-395532cd4240 - - - - - -] DHCP configuration for ports {'70f9ae8f-4400-4da9-92b5-3befab79f396'} is completed
Feb 20 09:53:28 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:28.909 2 INFO neutron.agent.securitygroups_rpc [None req-c8a66b7a-94e3-4469-9bdc-a709861d759e 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:28 np0005625204.localdomain ceph-mon[301857]: pgmap v176: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:29.296 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:53:28Z, description=, device_id=c60b906a-b861-42f1-98c4-c7541cbc3cf5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5b0f4f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5b0fa60>], id=70f9ae8f-4400-4da9-92b5-3befab79f396, ip_allocation=immediate, mac_address=fa:16:3e:33:28:3c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:53:23Z, description=, dns_domain=, id=a25517b2-5049-4a57-ad98-549dad6f59bf, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-DeleteServersTestJSON-1446003171-network, port_security_enabled=True, project_id=c5a2540adf694dd98037b7689be10187, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12102, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1222, status=ACTIVE, subnets=['a7e1a7af-4593-4a29-a16f-95ab101a15e7'], tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:23Z, vlan_transparent=None, network_id=a25517b2-5049-4a57-ad98-549dad6f59bf, port_security_enabled=False, project_id=c5a2540adf694dd98037b7689be10187, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1258, status=DOWN, tags=[], tenant_id=c5a2540adf694dd98037b7689be10187, updated_at=2026-02-20T09:53:28Z on network a25517b2-5049-4a57-ad98-549dad6f59bf
Feb 20 09:53:29 np0005625204.localdomain dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 1 addresses
Feb 20 09:53:29 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host
Feb 20 09:53:29 np0005625204.localdomain podman[311093]: 2026-02-20 09:53:29.497743292 +0000 UTC m=+0.063865859 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:53:29 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts
Feb 20 09:53:29 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:29.554 2 INFO neutron.agent.securitygroups_rpc [None req-47b66a4e-d860-4539-a526-725ae67efd11 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:29.722 264355 INFO neutron.agent.dhcp.agent [None req-54eea36c-7112-4fb3-897a-62ef0cd96ce8 - - - - - -] DHCP configuration for ports {'70f9ae8f-4400-4da9-92b5-3befab79f396'} is completed
Feb 20 09:53:30 np0005625204.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 20 09:53:30 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:30.457 2 INFO neutron.agent.securitygroups_rpc [None req-11ba55d2-9392-4bf8-866e-0f6b7421a111 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:30 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:30.868 2 INFO neutron.agent.securitygroups_rpc [None req-813a578a-3c42-4faa-a8fa-18fef9f75e4f 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:30.877 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:30 np0005625204.localdomain ceph-mon[301857]: pgmap v177: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:31 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:31.469 2 INFO neutron.agent.securitygroups_rpc [None req-cf8e4aa2-945c-4732-b1d9-06cd52c9d8b9 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:53:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:53:32 np0005625204.localdomain podman[311115]: 2026-02-20 09:53:32.153375 +0000 UTC m=+0.090781995 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ovn_controller)
Feb 20 09:53:32 np0005625204.localdomain podman[311115]: 2026-02-20 09:53:32.203177912 +0000 UTC m=+0.140584947 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:53:32 np0005625204.localdomain podman[311116]: 2026-02-20 09:53:32.216860697 +0000 UTC m=+0.150901830 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:53:32 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:53:32 np0005625204.localdomain podman[311116]: 2026-02-20 09:53:32.231104849 +0000 UTC m=+0.165145982 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:53:32 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:53:32 np0005625204.localdomain sshd[311158]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:32.765 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:32 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:32.884 2 INFO neutron.agent.securitygroups_rpc [None req-0ba4bd77-300a-4aef-8bf9-70b27ff0d0d5 eedc91db7da847aab912b3b8401d5b18 8d5c2f81bbf4423c8ccdbeb44081c499 - - default default] Security group member updated ['943c86ba-7264-4974-89ae-938b95d72620']
Feb 20 09:53:32 np0005625204.localdomain ceph-mon[301857]: pgmap v178: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:33.258 2 INFO neutron.agent.securitygroups_rpc [None req-e49f1726-acc6-4366-8506-477a12f2a7e4 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:33.259 2 INFO neutron.agent.securitygroups_rpc [None req-bbdf9d9d-afcb-4396-b80a-79eb3001d8e5 eedc91db7da847aab912b3b8401d5b18 8d5c2f81bbf4423c8ccdbeb44081c499 - - default default] Security group member updated ['943c86ba-7264-4974-89ae-938b95d72620']
Feb 20 09:53:33 np0005625204.localdomain sshd[311158]: Invalid user x from 188.166.218.64 port 49060
Feb 20 09:53:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:33.963 2 INFO neutron.agent.securitygroups_rpc [None req-55d03934-9517-419a-915b-7eb31a90c9a3 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:34 np0005625204.localdomain sshd[311158]: Received disconnect from 188.166.218.64 port 49060:11: Bye Bye [preauth]
Feb 20 09:53:34 np0005625204.localdomain sshd[311158]: Disconnected from invalid user x 188.166.218.64 port 49060 [preauth]
Feb 20 09:53:34 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:34.366 2 INFO neutron.agent.securitygroups_rpc [None req-b58a03ae-8801-426a-8262-b9afab11fa37 945c7903a05c4b26bd0f26bad77773be 75007688d77c439d8ee3fe7c58acf581 - - default default] Security group member updated ['9ad83b0d-3fc1-4df1-be0c-b38363c67626']
Feb 20 09:53:34 np0005625204.localdomain dnsmasq[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/addn_hosts - 0 addresses
Feb 20 09:53:34 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/host
Feb 20 09:53:34 np0005625204.localdomain dnsmasq-dhcp[311027]: read /var/lib/neutron/dhcp/a25517b2-5049-4a57-ad98-549dad6f59bf/opts
Feb 20 09:53:34 np0005625204.localdomain podman[311177]: 2026-02-20 09:53:34.688981038 +0000 UTC m=+0.060674022 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:53:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:34Z|00158|binding|INFO|Releasing lport f687f8e2-05bc-41c6-b5b4-d21133776b71 from this chassis (sb_readonly=0)
Feb 20 09:53:34 np0005625204.localdomain kernel: device tapf687f8e2-05 left promiscuous mode
Feb 20 09:53:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:34Z|00159|binding|INFO|Setting lport f687f8e2-05bc-41c6-b5b4-d21133776b71 down in Southbound
Feb 20 09:53:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:34.863 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:34.877 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a25517b2-5049-4a57-ad98-549dad6f59bf', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c5a2540adf694dd98037b7689be10187', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dffa584b-3b39-44ca-bfd8-0760f34a6a59, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f687f8e2-05bc-41c6-b5b4-d21133776b71) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:53:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:34.879 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f687f8e2-05bc-41c6-b5b4-d21133776b71 in datapath a25517b2-5049-4a57-ad98-549dad6f59bf unbound from our chassis
Feb 20 09:53:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:34.880 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:34.882 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:34.883 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a25517b2-5049-4a57-ad98-549dad6f59bf, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:53:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:34.884 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[75771f41-cb0f-4d14-8be3-57319846f24a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:53:34 np0005625204.localdomain ceph-mon[301857]: pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:35.913 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:36.131 264355 INFO neutron.agent.linux.ip_lib [None req-d35a7a42-6de1-4598-9fef-1364e6632d43 - - - - - -] Device tap5a613b4f-1d cannot be used as it has no MAC address
Feb 20 09:53:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:36.159 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:36 np0005625204.localdomain kernel: device tap5a613b4f-1d entered promiscuous mode
Feb 20 09:53:36 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581216.1711] manager: (tap5a613b4f-1d): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Feb 20 09:53:36 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:36Z|00160|binding|INFO|Claiming lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 for this chassis.
Feb 20 09:53:36 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:36Z|00161|binding|INFO|5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422: Claiming unknown
Feb 20 09:53:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:36.173 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:36 np0005625204.localdomain systemd-udevd[311211]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:53:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:36.182 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c31c2cb7-7585-4255-8195-898589cc1c5d, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:53:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:36.184 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 in datapath 4d3d4b22-89ae-4b72-8269-db16bc023693 bound to our chassis
Feb 20 09:53:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:36.186 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4d3d4b22-89ae-4b72-8269-db16bc023693 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:53:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:36.188 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1d846e24-d841-4d57-aa5f-5822655873cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:36Z|00162|binding|INFO|Setting lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 ovn-installed in OVS
Feb 20 09:53:36 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:36Z|00163|binding|INFO|Setting lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 up in Southbound
Feb 20 09:53:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:36.209 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap5a613b4f-1d: No such device
Feb 20 09:53:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:36.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:36 np0005625204.localdomain sudo[311214]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:53:36 np0005625204.localdomain sudo[311214]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:53:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:36.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:36 np0005625204.localdomain sudo[311214]: pam_unix(sudo:session): session closed for user root
Feb 20 09:53:36 np0005625204.localdomain sudo[311256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:53:36 np0005625204.localdomain sudo[311256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:53:36 np0005625204.localdomain ceph-mon[301857]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:37 np0005625204.localdomain sudo[311256]: pam_unix(sudo:session): session closed for user root
Feb 20 09:53:37 np0005625204.localdomain podman[311350]: 
Feb 20 09:53:37 np0005625204.localdomain podman[311350]: 2026-02-20 09:53:37.152436195 +0000 UTC m=+0.094227349 container create e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:53:37 np0005625204.localdomain systemd[1]: Started libpod-conmon-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0.scope.
Feb 20 09:53:37 np0005625204.localdomain podman[311350]: 2026-02-20 09:53:37.100738067 +0000 UTC m=+0.042529311 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:53:37 np0005625204.localdomain systemd[1]: tmp-crun.yFDNIv.mount: Deactivated successfully.
Feb 20 09:53:37 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:53:37 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f09de51b9fc12b68b5d61385aaa744ed437b92490600eef9db4abfcb8fb69f0d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:53:37 np0005625204.localdomain podman[311350]: 2026-02-20 09:53:37.238112365 +0000 UTC m=+0.179903519 container init e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:53:37 np0005625204.localdomain podman[311350]: 2026-02-20 09:53:37.24523183 +0000 UTC m=+0.187023014 container start e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:53:37 np0005625204.localdomain dnsmasq[311368]: started, version 2.85 cachesize 150
Feb 20 09:53:37 np0005625204.localdomain dnsmasq[311368]: DNS service limited to local subnets
Feb 20 09:53:37 np0005625204.localdomain dnsmasq[311368]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:53:37 np0005625204.localdomain dnsmasq[311368]: warning: no upstream servers configured
Feb 20 09:53:37 np0005625204.localdomain dnsmasq-dhcp[311368]: DHCPv6, static leases only on 2001:db8:0:ffff::, lease time 1d
Feb 20 09:53:37 np0005625204.localdomain dnsmasq[311368]: read /var/lib/neutron/dhcp/4d3d4b22-89ae-4b72-8269-db16bc023693/addn_hosts - 0 addresses
Feb 20 09:53:37 np0005625204.localdomain dnsmasq-dhcp[311368]: read /var/lib/neutron/dhcp/4d3d4b22-89ae-4b72-8269-db16bc023693/host
Feb 20 09:53:37 np0005625204.localdomain dnsmasq-dhcp[311368]: read /var/lib/neutron/dhcp/4d3d4b22-89ae-4b72-8269-db16bc023693/opts
Feb 20 09:53:37 np0005625204.localdomain sudo[311369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:53:37 np0005625204.localdomain sudo[311369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:53:37 np0005625204.localdomain sudo[311369]: pam_unix(sudo:session): session closed for user root
Feb 20 09:53:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:37.408 264355 INFO neutron.agent.dhcp.agent [None req-337407bb-513c-414c-ae67-9214459e5909 - - - - - -] DHCP configuration for ports {'ff22c793-cbfd-4069-ba90-fc7d2d6ce756'} is completed
Feb 20 09:53:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:37.808 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:37 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:53:37 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:53:37 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:53:37 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:53:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:38Z|00164|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:53:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:38.296 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:39 np0005625204.localdomain podman[311402]: 2026-02-20 09:53:39.035345761 +0000 UTC m=+0.066020814 container kill e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:53:39 np0005625204.localdomain dnsmasq[311027]: exiting on receipt of SIGTERM
Feb 20 09:53:39 np0005625204.localdomain systemd[1]: tmp-crun.XgagWJ.mount: Deactivated successfully.
Feb 20 09:53:39 np0005625204.localdomain systemd[1]: libpod-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981.scope: Deactivated successfully.
Feb 20 09:53:39 np0005625204.localdomain podman[311418]: 2026-02-20 09:53:39.117967248 +0000 UTC m=+0.063372424 container died e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:53:39 np0005625204.localdomain podman[311418]: 2026-02-20 09:53:39.149452343 +0000 UTC m=+0.094857519 container cleanup e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:53:39 np0005625204.localdomain systemd[1]: libpod-conmon-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981.scope: Deactivated successfully.
Feb 20 09:53:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d8f122d768a0c33de3c5f1477629b4a40cd49a429ab5028fb1c39b3e5c6badc0-merged.mount: Deactivated successfully.
Feb 20 09:53:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981-userdata-shm.mount: Deactivated successfully.
Feb 20 09:53:39 np0005625204.localdomain ceph-mon[301857]: pgmap v181: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:53:39 np0005625204.localdomain podman[311419]: 2026-02-20 09:53:39.188142847 +0000 UTC m=+0.131377307 container remove e281d6cff31c17c6dfe11cc321dc3b10473d71c6031474e62e7238f9cf4ab981 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a25517b2-5049-4a57-ad98-549dad6f59bf, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:53:39 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2da25517b2\x2d5049\x2d4a57\x2dad98\x2d549dad6f59bf.mount: Deactivated successfully.
Feb 20 09:53:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:39.226 264355 INFO neutron.agent.dhcp.agent [None req-cae3916f-00a6-4868-8b5d-8a4a4171ef81 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:39.549 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:40.948 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:53:41 np0005625204.localdomain podman[311446]: 2026-02-20 09:53:41.147979086 +0000 UTC m=+0.084633069 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:53:41 np0005625204.localdomain podman[311446]: 2026-02-20 09:53:41.160193956 +0000 UTC m=+0.096847959 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 20 09:53:41 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:53:41 np0005625204.localdomain ceph-mon[301857]: pgmap v182: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:41.199 264355 INFO neutron.agent.linux.ip_lib [None req-af1bf122-f499-4e2b-a4de-2605dede576f - - - - - -] Device tap470cc6c5-ef cannot be used as it has no MAC address
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.226 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain kernel: device tap470cc6c5-ef entered promiscuous mode
Feb 20 09:53:41 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581221.2359] manager: (tap470cc6c5-ef): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.236 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:41Z|00165|binding|INFO|Claiming lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 for this chassis.
Feb 20 09:53:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:41Z|00166|binding|INFO|470cc6c5-ef78-4a24-869b-d34965cd09b3: Claiming unknown
Feb 20 09:53:41 np0005625204.localdomain systemd-udevd[311476]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.249 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46b3b5fc-1b08-4417-b622-f776fc49a8cc, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=470cc6c5-ef78-4a24-869b-d34965cd09b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.251 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 470cc6c5-ef78-4a24-869b-d34965cd09b3 in datapath 567afd59-5f4e-4b3f-84e1-87f98341e0f7 bound to our chassis
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.252 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 567afd59-5f4e-4b3f-84e1-87f98341e0f7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.253 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[65e924c6-3284-456c-9e75-182c962f5cc7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:41Z|00167|binding|INFO|Setting lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 ovn-installed in OVS
Feb 20 09:53:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:41Z|00168|binding|INFO|Setting lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 up in Southbound
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.281 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap470cc6c5-ef: No such device
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.319 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.348 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:41Z|00169|binding|INFO|Removing iface tap470cc6c5-ef ovn-installed in OVS
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.881 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 947ddfc5-c53d-4d85-b4de-5bbea365f011 with type ""
Feb 20 09:53:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:41Z|00170|binding|INFO|Removing lport 470cc6c5-ef78-4a24-869b-d34965cd09b3 ovn-installed in OVS
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.882 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-567afd59-5f4e-4b3f-84e1-87f98341e0f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=46b3b5fc-1b08-4417-b622-f776fc49a8cc, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=470cc6c5-ef78-4a24-869b-d34965cd09b3) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.882 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.885 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 470cc6c5-ef78-4a24-869b-d34965cd09b3 in datapath 567afd59-5f4e-4b3f-84e1-87f98341e0f7 unbound from our chassis
Feb 20 09:53:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:41.885 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.887 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 567afd59-5f4e-4b3f-84e1-87f98341e0f7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:53:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:41.887 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b3ff1356-1a35-4cef-896b-c14614e5605c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:53:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:42Z|00171|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:53:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:42.093 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:42 np0005625204.localdomain podman[311545]: 
Feb 20 09:53:42 np0005625204.localdomain podman[311545]: 2026-02-20 09:53:42.117416388 +0000 UTC m=+0.119959041 container create dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:53:42 np0005625204.localdomain podman[311545]: 2026-02-20 09:53:42.046056282 +0000 UTC m=+0.048598965 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:53:42 np0005625204.localdomain systemd[1]: Started libpod-conmon-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347.scope.
Feb 20 09:53:42 np0005625204.localdomain systemd[1]: tmp-crun.yAAPuR.mount: Deactivated successfully.
Feb 20 09:53:42 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:53:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eb973c786a4a0e773abfd175e84104553d0c9274294a3974cf56eb42cc926635/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:53:42 np0005625204.localdomain podman[311545]: 2026-02-20 09:53:42.18377077 +0000 UTC m=+0.186313433 container init dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:53:42 np0005625204.localdomain podman[311545]: 2026-02-20 09:53:42.193766794 +0000 UTC m=+0.196309447 container start dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:53:42 np0005625204.localdomain dnsmasq[311563]: started, version 2.85 cachesize 150
Feb 20 09:53:42 np0005625204.localdomain dnsmasq[311563]: DNS service limited to local subnets
Feb 20 09:53:42 np0005625204.localdomain dnsmasq[311563]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:53:42 np0005625204.localdomain dnsmasq[311563]: warning: no upstream servers configured
Feb 20 09:53:42 np0005625204.localdomain dnsmasq-dhcp[311563]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:53:42 np0005625204.localdomain dnsmasq[311563]: read /var/lib/neutron/dhcp/567afd59-5f4e-4b3f-84e1-87f98341e0f7/addn_hosts - 0 addresses
Feb 20 09:53:42 np0005625204.localdomain dnsmasq-dhcp[311563]: read /var/lib/neutron/dhcp/567afd59-5f4e-4b3f-84e1-87f98341e0f7/host
Feb 20 09:53:42 np0005625204.localdomain dnsmasq-dhcp[311563]: read /var/lib/neutron/dhcp/567afd59-5f4e-4b3f-84e1-87f98341e0f7/opts
Feb 20 09:53:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:42.267 264355 INFO neutron.agent.dhcp.agent [None req-fd94e593-7a9b-43c1-b2e6-a3e510d8b525 - - - - - -] DHCP configuration for ports {'dd020ec0-dd49-42b0-8f23-d87283043b3d'} is completed
Feb 20 09:53:42 np0005625204.localdomain dnsmasq[311563]: exiting on receipt of SIGTERM
Feb 20 09:53:42 np0005625204.localdomain podman[311581]: 2026-02-20 09:53:42.422861965 +0000 UTC m=+0.058640920 container kill dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:53:42 np0005625204.localdomain systemd[1]: libpod-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347.scope: Deactivated successfully.
Feb 20 09:53:42 np0005625204.localdomain podman[311594]: 2026-02-20 09:53:42.490457396 +0000 UTC m=+0.054997520 container died dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:53:42 np0005625204.localdomain podman[311594]: 2026-02-20 09:53:42.520257079 +0000 UTC m=+0.084797163 container cleanup dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:53:42 np0005625204.localdomain systemd[1]: libpod-conmon-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347.scope: Deactivated successfully.
Feb 20 09:53:42 np0005625204.localdomain podman[311596]: 2026-02-20 09:53:42.567695448 +0000 UTC m=+0.123612561 container remove dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-567afd59-5f4e-4b3f-84e1-87f98341e0f7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:53:42 np0005625204.localdomain kernel: device tap470cc6c5-ef left promiscuous mode
Feb 20 09:53:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:42.579 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:42.593 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:42.619 264355 INFO neutron.agent.dhcp.agent [None req-a584c831-5291-46d0-94ac-116dd4400218 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:42.620 264355 INFO neutron.agent.dhcp.agent [None req-a584c831-5291-46d0-94ac-116dd4400218 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:42.811 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-eb973c786a4a0e773abfd175e84104553d0c9274294a3974cf56eb42cc926635-merged.mount: Deactivated successfully.
Feb 20 09:53:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dc51fbe9d047cd2846ac1b07fccc10e30fa37b25e6fa81171dca5d4c048c1347-userdata-shm.mount: Deactivated successfully.
Feb 20 09:53:43 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d567afd59\x2d5f4e\x2d4b3f\x2d84e1\x2d87f98341e0f7.mount: Deactivated successfully.
Feb 20 09:53:43 np0005625204.localdomain ceph-mon[301857]: pgmap v183: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:43.546 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.495 264355 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 20 09:53:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.617 264355 INFO neutron.agent.dhcp.agent [None req-8fbabde5-8c48-45b8-bf8d-fe8f256e795b - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:53:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.618 264355 INFO neutron.agent.dhcp.agent [-] Starting network a6178b53-adde-4e45-a5fb-ba3e8333d1f7 dhcp configuration
Feb 20 09:53:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.618 264355 INFO neutron.agent.dhcp.agent [-] Finished network a6178b53-adde-4e45-a5fb-ba3e8333d1f7 dhcp configuration
Feb 20 09:53:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.619 264355 INFO neutron.agent.dhcp.agent [None req-8fbabde5-8c48-45b8-bf8d-fe8f256e795b - - - - - -] Synchronizing state complete
Feb 20 09:53:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:44.619 264355 INFO neutron.agent.dhcp.agent [None req-757a0fa7-eac3-47ea-9d4e-70afd5a945ff - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:45 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:45.179 2 INFO neutron.agent.securitygroups_rpc [None req-a4da6bb1-700c-4d71-a646-fe34335ad1c4 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:45.184 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:45 np0005625204.localdomain ceph-mon[301857]: pgmap v184: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:45.783 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:45.950 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:46 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:46.461 2 INFO neutron.agent.securitygroups_rpc [None req-82d4853b-8792-42bf-a9bd-621206147606 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:47 np0005625204.localdomain ceph-mon[301857]: pgmap v185: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:47 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:47.540 2 INFO neutron.agent.securitygroups_rpc [None req-2045a801-a884-4a06-b206-987ac9e8d82c 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:53:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:53:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:53:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 20 09:53:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:53:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18808 "" "Go-http-client/1.1"
Feb 20 09:53:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:47.813 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:47 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:47Z|00172|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:53:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:47.961 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e113 e113: 6 total, 6 up, 6 in
Feb 20 09:53:48 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:48.550 2 INFO neutron.agent.securitygroups_rpc [None req-34460107-5767-4790-bb51-43f170627a06 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:53:48Z|00173|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:53:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:49.010 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:49 np0005625204.localdomain ceph-mon[301857]: pgmap v186: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:53:49 np0005625204.localdomain ceph-mon[301857]: osdmap e113: 6 total, 6 up, 6 in
Feb 20 09:53:49 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:49.997 2 INFO neutron.agent.securitygroups_rpc [None req-f7428e6a-a4fd-4f95-a528-55a12406007a 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:50.995 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:51 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:53:51 np0005625204.localdomain podman[311624]: 2026-02-20 09:53:51.152056509 +0000 UTC m=+0.086570038 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:53:51 np0005625204.localdomain podman[311624]: 2026-02-20 09:53:51.166126375 +0000 UTC m=+0.100639964 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:53:51 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:53:51 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:51.242 2 INFO neutron.agent.securitygroups_rpc [None req-e18dac8b-7691-49db-a6f9-a9ef86b93bee 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:51 np0005625204.localdomain ceph-mon[301857]: pgmap v188: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 1.5 KiB/s wr, 14 op/s
Feb 20 09:53:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.791 264355 INFO neutron.agent.dhcp.agent [None req-8fbabde5-8c48-45b8-bf8d-fe8f256e795b - - - - - -] Synchronizing state
Feb 20 09:53:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.939 264355 INFO neutron.agent.dhcp.agent [None req-99c0b8ff-a1dd-4d14-8ec8-9a6b4f78d738 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:53:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.940 264355 INFO neutron.agent.dhcp.agent [-] Starting network cd099dbb-ee85-46d6-aee0-e12b432c9b7f dhcp configuration
Feb 20 09:53:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.941 264355 INFO neutron.agent.dhcp.agent [-] Finished network cd099dbb-ee85-46d6-aee0-e12b432c9b7f dhcp configuration
Feb 20 09:53:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.942 264355 INFO neutron.agent.dhcp.agent [None req-99c0b8ff-a1dd-4d14-8ec8-9a6b4f78d738 - - - - - -] Synchronizing state complete
Feb 20 09:53:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:51.942 264355 INFO neutron.agent.dhcp.agent [None req-56328f20-7238-4e1a-b62d-244eb3392de4 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e114 e114: 6 total, 6 up, 6 in
Feb 20 09:53:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:52.515 2 INFO neutron.agent.securitygroups_rpc [None req-debccdd3-a1fb-4577-8737-97107297a2b7 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:52 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:53:52.531 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:53:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:52.817 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:53.059 2 INFO neutron.agent.securitygroups_rpc [None req-f48de32a-1488-41ca-9ac7-21eba1b907c6 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:53 np0005625204.localdomain ceph-mon[301857]: pgmap v189: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 1.6 KiB/s wr, 14 op/s
Feb 20 09:53:53 np0005625204.localdomain ceph-mon[301857]: osdmap e114: 6 total, 6 up, 6 in
Feb 20 09:53:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/870495445' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e115 e115: 6 total, 6 up, 6 in
Feb 20 09:53:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:53.497 2 INFO neutron.agent.securitygroups_rpc [None req-2540619c-1d57-4e86-a386-76258833753f 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:54 np0005625204.localdomain ceph-mon[301857]: osdmap e115: 6 total, 6 up, 6 in
Feb 20 09:53:54 np0005625204.localdomain ceph-mon[301857]: pgmap v192: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 2.7 KiB/s wr, 24 op/s
Feb 20 09:53:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4028388329' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:54 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:54.638 2 INFO neutron.agent.securitygroups_rpc [None req-6b4ba7bd-a2d9-4ae3-ba6a-627aca1feb8f 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:54.751 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:53:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:54.751 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:53:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:54.751 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:53:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:54.752 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:53:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:54.752 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:53:55 np0005625204.localdomain sshd[311667]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:53:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:53:55 np0005625204.localdomain systemd[1]: tmp-crun.mYSorS.mount: Deactivated successfully.
Feb 20 09:53:55 np0005625204.localdomain podman[311669]: 2026-02-20 09:53:55.144543866 +0000 UTC m=+0.085529156 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:53:55 np0005625204.localdomain podman[311669]: 2026-02-20 09:53:55.152366544 +0000 UTC m=+0.093351844 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:53:55 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:53:55 np0005625204.localdomain sshd[311667]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:53:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:53:55 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2307489764' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.262 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.332 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.333 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:53:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e116 e116: 6 total, 6 up, 6 in
Feb 20 09:53:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2307489764' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.568 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.569 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11353MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.570 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.570 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:53:55 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:55.612 2 INFO neutron.agent.securitygroups_rpc [None req-11ea8750-a0ca-4484-988a-d6e41cea3e7c 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.680 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.680 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.681 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:53:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e116 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:53:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:55.736 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.030 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:53:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2163954098' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.224 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.230 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.415 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.418 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.418 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.848s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:53:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e117 e117: 6 total, 6 up, 6 in
Feb 20 09:53:56 np0005625204.localdomain ceph-mon[301857]: osdmap e116: 6 total, 6 up, 6 in
Feb 20 09:53:56 np0005625204.localdomain ceph-mon[301857]: pgmap v194: 177 pgs: 177 active+clean; 257 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 19 MiB/s wr, 50 op/s
Feb 20 09:53:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2163954098' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:53:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:56.506 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:53:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:56.507 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:53:56.507 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:53:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:53:56.515 2 INFO neutron.agent.securitygroups_rpc [None req-2195fa93-d724-491c-94cb-1a1a7da48d3e 1b2d25a511a14396a23c89999045cc74 c91094550bca41b8ac81e1aef3ebc3a4 - - default default] Security group member updated ['075aba65-9a9e-4a7a-9bd7-67cbc5ab58ab']
Feb 20 09:53:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:53:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:53:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:53:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:53:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:53:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:53:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:53:57 np0005625204.localdomain podman[311718]: 2026-02-20 09:53:57.14540514 +0000 UTC m=+0.084107523 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, release=1770267347, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, distribution-scope=public, build-date=2026-02-05T04:57:10Z, managed_by=edpm_ansible, vcs-type=git)
Feb 20 09:53:57 np0005625204.localdomain podman[311718]: 2026-02-20 09:53:57.162141008 +0000 UTC m=+0.100843411 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.openshift.expose-services=, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, vendor=Red Hat, Inc.)
Feb 20 09:53:57 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:53:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:57.415 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:57 np0005625204.localdomain ceph-mon[301857]: osdmap e117: 6 total, 6 up, 6 in
Feb 20 09:53:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e118 e118: 6 total, 6 up, 6 in
Feb 20 09:53:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:57.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:57.485 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e119 e119: 6 total, 6 up, 6 in
Feb 20 09:53:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:57.819 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:53:58 np0005625204.localdomain ceph-mon[301857]: osdmap e118: 6 total, 6 up, 6 in
Feb 20 09:53:58 np0005625204.localdomain ceph-mon[301857]: pgmap v197: 177 pgs: 177 active+clean; 257 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 46 KiB/s rd, 27 MiB/s wr, 71 op/s
Feb 20 09:53:58 np0005625204.localdomain ceph-mon[301857]: osdmap e119: 6 total, 6 up, 6 in
Feb 20 09:53:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:53:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:53:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:00 np0005625204.localdomain ceph-mon[301857]: pgmap v199: 177 pgs: 177 active+clean; 169 MiB data, 855 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 7.4 KiB/s wr, 88 op/s
Feb 20 09:54:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:00.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:00.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:54:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:00.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:54:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.092 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.094 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:54:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:54:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:54:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:04Z|00174|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3721321372' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:04 np0005625204.localdomain podman[311738]: 2026-02-20 09:54:04.153876462 +0000 UTC m=+0.047138764 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:54:04 np0005625204.localdomain podman[311738]: 2026-02-20 09:54:04.202049396 +0000 UTC m=+0.095311678 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Feb 20 09:54:04 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:54:04 np0005625204.localdomain podman[311739]: 2026-02-20 09:54:04.21303661 +0000 UTC m=+0.103877438 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:54:04 np0005625204.localdomain podman[311739]: 2026-02-20 09:54:04.219870947 +0000 UTC m=+0.110711765 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 20 09:54:04 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:54:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:04Z|00175|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:04Z|00176|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.677 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:04Z|00177|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.920 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.920 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.921 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.921 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:54:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:04.922 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:05.509 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:54:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:54:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:54:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:54:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:06Z|00178|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:06.284 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: pgmap v200: 177 pgs: 177 active+clean; 145 MiB data, 751 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 7.7 KiB/s wr, 99 op/s
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4199280483' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4199280483' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1336582611' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: pgmap v201: 177 pgs: 177 active+clean; 145 MiB data, 751 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 6.5 KiB/s wr, 85 op/s
Feb 20 09:54:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:06Z|00179|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:06.398 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1081899576' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:54:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1081899576' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:06Z|00180|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:06.885 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:07.036 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:54:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:07.056 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:54:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:07.057 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:54:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:07.057 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:07.057 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: pgmap v202: 177 pgs: 177 active+clean; 145 MiB data, 751 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 6.0 KiB/s wr, 92 op/s
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1081899576' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1081899576' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4199115342' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4199115342' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e120 e120: 6 total, 6 up, 6 in
Feb 20 09:54:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e121 e121: 6 total, 6 up, 6 in
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: osdmap e120: 6 total, 6 up, 6 in
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 751 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 4.9 KiB/s wr, 75 op/s
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: osdmap e121: 6 total, 6 up, 6 in
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2750035915' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2750035915' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:08 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:08.496 2 INFO neutron.agent.securitygroups_rpc [None req-b9c4f92c-e0aa-4ddd-a393-48b8bc5d6b0b 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['b7daa996-a450-47f2-a46b-44613b415203']
Feb 20 09:54:08 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:08.646 2 INFO neutron.agent.securitygroups_rpc [None req-fa8e1861-ba97-4550-97ae-0d61a37c286d 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['b7daa996-a450-47f2-a46b-44613b415203']
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e122 e122: 6 total, 6 up, 6 in
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/523875761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:54:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/523875761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:09.095 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:09 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:09.142 2 INFO neutron.agent.securitygroups_rpc [None req-60615bbe-69b8-4a6d-a777-f3532d76e589 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:09 np0005625204.localdomain ceph-mon[301857]: osdmap e122: 6 total, 6 up, 6 in
Feb 20 09:54:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/523875761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/523875761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:10 np0005625204.localdomain ceph-mon[301857]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 751 MiB used, 41 GiB / 42 GiB avail; 106 KiB/s rd, 7.2 KiB/s wr, 143 op/s
Feb 20 09:54:10 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:10.726 2 INFO neutron.agent.securitygroups_rpc [None req-05c5180a-791e-4a36-b283-1b3700162f32 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:11.072 2 INFO neutron.agent.securitygroups_rpc [None req-fda66745-7058-43a6-bd22-7e541948feca 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:11.376 2 INFO neutron.agent.securitygroups_rpc [None req-484ad5d4-98d6-44d1-ade8-e83a00451c3f 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:11.848 2 INFO neutron.agent.securitygroups_rpc [None req-d28a91c5-19c2-44e5-9f1d-5b67d2d402bb 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:12.048 2 INFO neutron.agent.securitygroups_rpc [None req-afec8fb8-e434-475a-b1f9-4fa41fdb543f 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:54:12 np0005625204.localdomain podman[311784]: 2026-02-20 09:54:12.140437625 +0000 UTC m=+0.081466812 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 20 09:54:12 np0005625204.localdomain podman[311784]: 2026-02-20 09:54:12.148953664 +0000 UTC m=+0.089982821 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:54:12 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:54:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:12.313 2 INFO neutron.agent.securitygroups_rpc [None req-ce85a640-3696-4e3b-b081-e77c6a0d5165 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:12.475 2 INFO neutron.agent.securitygroups_rpc [None req-2e25a288-cc52-4176-8755-11e2b4f58624 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e123 e123: 6 total, 6 up, 6 in
Feb 20 09:54:12 np0005625204.localdomain ceph-mon[301857]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 110 KiB/s rd, 6.8 KiB/s wr, 151 op/s
Feb 20 09:54:12 np0005625204.localdomain ceph-mon[301857]: osdmap e123: 6 total, 6 up, 6 in
Feb 20 09:54:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:12.772 2 INFO neutron.agent.securitygroups_rpc [None req-8a952836-35e7-4a3b-8a56-14552dc3796b 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:13.303 264355 INFO neutron.agent.linux.ip_lib [None req-6eb1cc23-1640-4bc7-902e-ac5ab4859a19 - - - - - -] Device tapc66aa099-67 cannot be used as it has no MAC address
Feb 20 09:54:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:13.326 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:13 np0005625204.localdomain kernel: device tapc66aa099-67 entered promiscuous mode
Feb 20 09:54:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:13.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:13 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581253.3365] manager: (tapc66aa099-67): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Feb 20 09:54:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:13Z|00181|binding|INFO|Claiming lport c66aa099-6743-4204-8797-c164cf332f51 for this chassis.
Feb 20 09:54:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:13Z|00182|binding|INFO|c66aa099-6743-4204-8797-c164cf332f51: Claiming unknown
Feb 20 09:54:13 np0005625204.localdomain systemd-udevd[311814]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:54:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:13.346 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e50d36cf-02a1-45d9-8e27-acc0fe1139fb, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c66aa099-6743-4204-8797-c164cf332f51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:13.349 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c66aa099-6743-4204-8797-c164cf332f51 in datapath ce344b2b-0116-41ad-9d2e-ee513171891f bound to our chassis
Feb 20 09:54:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:13.351 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ce344b2b-0116-41ad-9d2e-ee513171891f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:13.352 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5221dac8-6f79-4f99-bb52-31a6304a0e92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:13Z|00183|binding|INFO|Setting lport c66aa099-6743-4204-8797-c164cf332f51 ovn-installed in OVS
Feb 20 09:54:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:13Z|00184|binding|INFO|Setting lport c66aa099-6743-4204-8797-c164cf332f51 up in Southbound
Feb 20 09:54:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:13.374 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc66aa099-67: No such device
Feb 20 09:54:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:13.419 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:13.448 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:13.692 2 INFO neutron.agent.securitygroups_rpc [None req-72cb9bc1-a00e-467b-ba2c-70207cf7bcb5 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['5237cd22-9777-4b5a-aa35-83bccb0fdfdc']
Feb 20 09:54:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:14.098 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:14 np0005625204.localdomain podman[311885]: 
Feb 20 09:54:14 np0005625204.localdomain podman[311885]: 2026-02-20 09:54:14.205029043 +0000 UTC m=+0.066740686 container create 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:14 np0005625204.localdomain systemd[1]: Started libpod-conmon-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb.scope.
Feb 20 09:54:14 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:54:14 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d89d075bd4bd3762074982c4b1f354c4380264ee115e953111bb781925418ea1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:54:14 np0005625204.localdomain podman[311885]: 2026-02-20 09:54:14.165876795 +0000 UTC m=+0.027588528 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:54:14 np0005625204.localdomain podman[311885]: 2026-02-20 09:54:14.271768348 +0000 UTC m=+0.133480021 container init 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:14 np0005625204.localdomain podman[311885]: 2026-02-20 09:54:14.281421321 +0000 UTC m=+0.143132994 container start 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:54:14 np0005625204.localdomain dnsmasq[311904]: started, version 2.85 cachesize 150
Feb 20 09:54:14 np0005625204.localdomain dnsmasq[311904]: DNS service limited to local subnets
Feb 20 09:54:14 np0005625204.localdomain dnsmasq[311904]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:54:14 np0005625204.localdomain dnsmasq[311904]: warning: no upstream servers configured
Feb 20 09:54:14 np0005625204.localdomain dnsmasq-dhcp[311904]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:54:14 np0005625204.localdomain dnsmasq[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/addn_hosts - 0 addresses
Feb 20 09:54:14 np0005625204.localdomain dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/host
Feb 20 09:54:14 np0005625204.localdomain dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/opts
Feb 20 09:54:14 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:14.409 264355 INFO neutron.agent.dhcp.agent [None req-31651160-d023-4725-9c26-9d793ef8969e - - - - - -] DHCP configuration for ports {'c5fe307a-fee2-41f6-aab2-23f1423edaff'} is completed
Feb 20 09:54:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:14.479 2 INFO neutron.agent.securitygroups_rpc [None req-0e571d4b-2938-4d22-abb1-e6b913186df6 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['6b2659dc-8adf-40b4-b971-7bc179be3dc5']
Feb 20 09:54:14 np0005625204.localdomain dnsmasq[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/addn_hosts - 0 addresses
Feb 20 09:54:14 np0005625204.localdomain dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/host
Feb 20 09:54:14 np0005625204.localdomain dnsmasq-dhcp[311904]: read /var/lib/neutron/dhcp/ce344b2b-0116-41ad-9d2e-ee513171891f/opts
Feb 20 09:54:14 np0005625204.localdomain podman[311921]: 2026-02-20 09:54:14.589730294 +0000 UTC m=+0.060887748 container kill 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:14 np0005625204.localdomain ceph-mon[301857]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 110 KiB/s rd, 6.8 KiB/s wr, 151 op/s
Feb 20 09:54:14 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:14.911 264355 INFO neutron.agent.dhcp.agent [None req-17912c8e-9f91-4766-bff5-70038d6a8bd5 - - - - - -] DHCP configuration for ports {'c66aa099-6743-4204-8797-c164cf332f51', 'c5fe307a-fee2-41f6-aab2-23f1423edaff'} is completed
Feb 20 09:54:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:15.348 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port fdc01f3b-a923-4f35-aaea-9adaa7d8d882 with type ""
Feb 20 09:54:15 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:15Z|00185|binding|INFO|Removing iface tapc66aa099-67 ovn-installed in OVS
Feb 20 09:54:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:15.350 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce344b2b-0116-41ad-9d2e-ee513171891f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e50d36cf-02a1-45d9-8e27-acc0fe1139fb, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c66aa099-6743-4204-8797-c164cf332f51) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:15.352 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c66aa099-6743-4204-8797-c164cf332f51 in datapath ce344b2b-0116-41ad-9d2e-ee513171891f unbound from our chassis
Feb 20 09:54:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:15.354 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce344b2b-0116-41ad-9d2e-ee513171891f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:54:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:15.355 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8b4aa644-422d-4835-819f-2c0853129461]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:15 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:15Z|00186|binding|INFO|Removing lport c66aa099-6743-4204-8797-c164cf332f51 ovn-installed in OVS
Feb 20 09:54:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:15.385 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:15 np0005625204.localdomain dnsmasq[311904]: exiting on receipt of SIGTERM
Feb 20 09:54:15 np0005625204.localdomain podman[311959]: 2026-02-20 09:54:15.489296316 +0000 UTC m=+0.061033953 container kill 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:54:15 np0005625204.localdomain systemd[1]: tmp-crun.bDNkfZ.mount: Deactivated successfully.
Feb 20 09:54:15 np0005625204.localdomain systemd[1]: libpod-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb.scope: Deactivated successfully.
Feb 20 09:54:15 np0005625204.localdomain podman[311973]: 2026-02-20 09:54:15.565276451 +0000 UTC m=+0.054420891 container died 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:54:15 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb-userdata-shm.mount: Deactivated successfully.
Feb 20 09:54:15 np0005625204.localdomain podman[311973]: 2026-02-20 09:54:15.66347487 +0000 UTC m=+0.152619250 container remove 7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ce344b2b-0116-41ad-9d2e-ee513171891f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:54:15 np0005625204.localdomain systemd[1]: libpod-conmon-7f6e3fdcecad44e6cb9c7ce1d8b56210af9d6a1e233db7f27b0a4772d7e8aacb.scope: Deactivated successfully.
Feb 20 09:54:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:15.677 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:15 np0005625204.localdomain kernel: device tapc66aa099-67 left promiscuous mode
Feb 20 09:54:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:15.693 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:15 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:15.917 264355 INFO neutron.agent.dhcp.agent [None req-99c0b8ff-a1dd-4d14-8ec8-9a6b4f78d738 - - - - - -] Synchronizing state
Feb 20 09:54:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:16.036 2 INFO neutron.agent.securitygroups_rpc [None req-9065899c-9819-48e8-b360-946a69906bd9 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['56cc5e29-9f6f-4f35-9ade-42d618bdd35b']
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.087 264355 INFO neutron.agent.dhcp.agent [None req-1a2e3763-4058-4706-bf05-99d8b392f202 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.088 264355 INFO neutron.agent.dhcp.agent [-] Starting network 9e8d3a83-6496-4dd1-b203-39e31016ea09 dhcp configuration
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.089 264355 INFO neutron.agent.dhcp.agent [-] Finished network 9e8d3a83-6496-4dd1-b203-39e31016ea09 dhcp configuration
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.089 264355 INFO neutron.agent.dhcp.agent [-] Starting network ce344b2b-0116-41ad-9d2e-ee513171891f dhcp configuration
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.090 264355 INFO neutron.agent.dhcp.agent [-] Finished network ce344b2b-0116-41ad-9d2e-ee513171891f dhcp configuration
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.091 264355 INFO neutron.agent.dhcp.agent [None req-1a2e3763-4058-4706-bf05-99d8b392f202 - - - - - -] Synchronizing state complete
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.133 264355 INFO neutron.agent.dhcp.agent [None req-4e0c3fb0-1208-45a7-b513-5624e7f180c6 - - - - - -] DHCP configuration for ports {'c5fe307a-fee2-41f6-aab2-23f1423edaff'} is completed
Feb 20 09:54:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-d89d075bd4bd3762074982c4b1f354c4380264ee115e953111bb781925418ea1-merged.mount: Deactivated successfully.
Feb 20 09:54:16 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dce344b2b\x2d0116\x2d41ad\x2d9d2e\x2dee513171891f.mount: Deactivated successfully.
Feb 20 09:54:16 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:16Z|00187|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:16.248 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:16.283 2 INFO neutron.agent.securitygroups_rpc [None req-00b37041-3488-4088-b5e4-b424dd1f63aa 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['56cc5e29-9f6f-4f35-9ade-42d618bdd35b']
Feb 20 09:54:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:16.696 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:16 np0005625204.localdomain ceph-mon[301857]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 5.2 KiB/s wr, 116 op/s
Feb 20 09:54:17 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:17.514 2 INFO neutron.agent.securitygroups_rpc [None req-bee57eed-2319-4f09-8acf-ebe401de5df5 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['39c9ea95-070c-4bc4-9287-e329c91de991']
Feb 20 09:54:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:54:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:54:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:54:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 20 09:54:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:54:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18808 "" "Go-http-client/1.1"
Feb 20 09:54:18 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:18.291 2 INFO neutron.agent.securitygroups_rpc [None req-63f4c14d-d8d8-4887-9902-5c1bd910d46b 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['39c9ea95-070c-4bc4-9287-e329c91de991']
Feb 20 09:54:18 np0005625204.localdomain ceph-mon[301857]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 75 KiB/s rd, 4.7 KiB/s wr, 103 op/s
Feb 20 09:54:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:19.102 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:19.287 2 INFO neutron.agent.securitygroups_rpc [None req-0eb3d32b-3601-4bfa-a4ec-07263fd65bfe 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']
Feb 20 09:54:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:19.697 2 INFO neutron.agent.securitygroups_rpc [None req-c48149ca-5e2e-4ce1-9578-04651a281b9c 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']
Feb 20 09:54:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2049357571' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2049357571' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:20.116 2 INFO neutron.agent.securitygroups_rpc [None req-db7ac1b3-e635-4b44-b582-fc808315e3e2 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']
Feb 20 09:54:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:20.437 2 INFO neutron.agent.securitygroups_rpc [None req-59de0fbc-5d34-4b72-b4ef-ac2a7c9b1e9d 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']
Feb 20 09:54:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:20.728 2 INFO neutron.agent.securitygroups_rpc [None req-cd9b335d-ac52-4aea-badd-19c0c11ff6a7 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']
Feb 20 09:54:20 np0005625204.localdomain ceph-mon[301857]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 409 B/s wr, 20 op/s
Feb 20 09:54:21 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:21.033 2 INFO neutron.agent.securitygroups_rpc [None req-eaa2444c-9904-4bf6-912c-ce544aec0944 a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']
Feb 20 09:54:21 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:21.193 2 INFO neutron.agent.securitygroups_rpc [None req-eef0b483-52c5-455d-8e50-fdf7323c6cd9 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['92902041-7601-4e4f-984f-f40e84e5d87a']
Feb 20 09:54:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:54:22 np0005625204.localdomain systemd[1]: tmp-crun.vtyI4Y.mount: Deactivated successfully.
Feb 20 09:54:22 np0005625204.localdomain podman[311999]: 2026-02-20 09:54:22.14051521 +0000 UTC m=+0.082671220 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:54:22 np0005625204.localdomain podman[311999]: 2026-02-20 09:54:22.150116621 +0000 UTC m=+0.092272681 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:54:22 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:54:22 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:22.463 2 INFO neutron.agent.securitygroups_rpc [None req-5a5ae7af-13a6-42c5-a0fc-7ba1957a4294 b9d64681c327441a81dfa771b4b413f6 ce97c44a73f94ada962654654798a4af - - default default] Security group member updated ['203b95e6-8f62-4037-821a-d64a45daeaf8']
Feb 20 09:54:22 np0005625204.localdomain ceph-mon[301857]: pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 716 B/s wr, 16 op/s
Feb 20 09:54:22 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:22.961 2 INFO neutron.agent.securitygroups_rpc [None req-ec64d548-962f-41dd-a02f-4a25f3310f4f a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']
Feb 20 09:54:23 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:23.156 2 INFO neutron.agent.securitygroups_rpc [None req-a78e376a-5faf-4597-9e34-68a60251f328 124fec5084164515a4a8079d3fba8fba 0d4c2b96a324436e80a9b37c9d4a15eb - - default default] Security group rule updated ['11123030-cb07-4b38-85fd-08bf79b16579']
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.107 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.108 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.127 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.127 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:54:24 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:24.356 2 INFO neutron.agent.securitygroups_rpc [None req-f54c4838-5ce8-45ff-b090-12b4bbbb882f a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']
Feb 20 09:54:24 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:24.569 2 INFO neutron.agent.securitygroups_rpc [None req-5c523cd2-f346-4abd-990f-316e1d877f9a b9d64681c327441a81dfa771b4b413f6 ce97c44a73f94ada962654654798a4af - - default default] Security group member updated ['203b95e6-8f62-4037-821a-d64a45daeaf8']
Feb 20 09:54:24 np0005625204.localdomain ceph-mon[301857]: pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 663 B/s wr, 15 op/s
Feb 20 09:54:24 np0005625204.localdomain dnsmasq[311368]: exiting on receipt of SIGTERM
Feb 20 09:54:24 np0005625204.localdomain systemd[1]: tmp-crun.zm6qgB.mount: Deactivated successfully.
Feb 20 09:54:24 np0005625204.localdomain podman[312038]: 2026-02-20 09:54:24.899510434 +0000 UTC m=+0.057893537 container kill e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:24 np0005625204.localdomain systemd[1]: libpod-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0.scope: Deactivated successfully.
Feb 20 09:54:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:24Z|00188|binding|INFO|Removing iface tap5a613b4f-1d ovn-installed in OVS
Feb 20 09:54:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:24.906 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port f8795842-3b76-4096-9eb1-555fb2e17873 with type ""
Feb 20 09:54:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:24Z|00189|binding|INFO|Removing lport 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 ovn-installed in OVS
Feb 20 09:54:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:24.908 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:ffff::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4d3d4b22-89ae-4b72-8269-db16bc023693', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b7ef526d9e1e4325afc5a8eb7c2f52b5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c31c2cb7-7585-4255-8195-898589cc1c5d, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.909 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:24.910 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 5a613b4f-1d3f-4bb3-9024-fbd0dbe4f422 in datapath 4d3d4b22-89ae-4b72-8269-db16bc023693 unbound from our chassis
Feb 20 09:54:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:24.911 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4d3d4b22-89ae-4b72-8269-db16bc023693 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:24.912 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e62f88-0a85-4ea3-b308-33a6e69228b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:24.916 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:24 np0005625204.localdomain podman[312051]: 2026-02-20 09:54:24.972459957 +0000 UTC m=+0.060730143 container died e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:54:25 np0005625204.localdomain podman[312051]: 2026-02-20 09:54:25.003247971 +0000 UTC m=+0.091518107 container cleanup e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 20 09:54:25 np0005625204.localdomain systemd[1]: libpod-conmon-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0.scope: Deactivated successfully.
Feb 20 09:54:25 np0005625204.localdomain podman[312059]: 2026-02-20 09:54:25.057185268 +0000 UTC m=+0.130084538 container remove e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4d3d4b22-89ae-4b72-8269-db16bc023693, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:54:25 np0005625204.localdomain kernel: device tap5a613b4f-1d left promiscuous mode
Feb 20 09:54:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:25.069 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:25.084 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:25.119 264355 INFO neutron.agent.dhcp.agent [None req-0316d5ad-e94a-4888-8259-441d1361f8b8 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:25 np0005625204.localdomain sshd[312084]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:54:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:25.301 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:25 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:25.529 2 INFO neutron.agent.securitygroups_rpc [None req-1b48fcb2-1507-433d-8e69-fc9c0e8a60aa a7bf5bd2a08b40838819ba2189f49015 8affd803090342c7a0b4a8c10fbcda95 - - default default] Security group member updated ['863be2ee-f2c8-49ce-9c0f-4d58bca5441c']
Feb 20 09:54:25 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:25Z|00190|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:25.747 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:54:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e124 e124: 6 total, 6 up, 6 in
Feb 20 09:54:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f09de51b9fc12b68b5d61385aaa744ed437b92490600eef9db4abfcb8fb69f0d-merged.mount: Deactivated successfully.
Feb 20 09:54:25 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e0cfd31d66548a38a60d1c008853443294f52c55628903b073609350dd705fd0-userdata-shm.mount: Deactivated successfully.
Feb 20 09:54:25 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d4d3d4b22\x2d89ae\x2d4b72\x2d8269\x2ddb16bc023693.mount: Deactivated successfully.
Feb 20 09:54:25 np0005625204.localdomain podman[312086]: 2026-02-20 09:54:25.90341975 +0000 UTC m=+0.086939658 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:54:25 np0005625204.localdomain podman[312086]: 2026-02-20 09:54:25.941401422 +0000 UTC m=+0.124921350 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:54:25 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:54:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e124 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:54:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:54:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:54:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:54:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:54:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:54:26 np0005625204.localdomain sshd[312084]: Invalid user n8n from 103.191.14.210 port 46040
Feb 20 09:54:26 np0005625204.localdomain ceph-mon[301857]: pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 597 B/s wr, 14 op/s
Feb 20 09:54:26 np0005625204.localdomain ceph-mon[301857]: osdmap e124: 6 total, 6 up, 6 in
Feb 20 09:54:26 np0005625204.localdomain sshd[312084]: Received disconnect from 103.191.14.210 port 46040:11: Bye Bye [preauth]
Feb 20 09:54:26 np0005625204.localdomain sshd[312084]: Disconnected from invalid user n8n 103.191.14.210 port 46040 [preauth]
Feb 20 09:54:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e125 e125: 6 total, 6 up, 6 in
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.687077) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267687211, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1164, "num_deletes": 254, "total_data_size": 1413785, "memory_usage": 1439328, "flush_reason": "Manual Compaction"}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267697112, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 923135, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18936, "largest_seqno": 20095, "table_properties": {"data_size": 918254, "index_size": 2416, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11360, "raw_average_key_size": 20, "raw_value_size": 908237, "raw_average_value_size": 1666, "num_data_blocks": 106, "num_entries": 545, "num_filter_entries": 545, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581202, "oldest_key_time": 1771581202, "file_creation_time": 1771581267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 10083 microseconds, and 5136 cpu microseconds.
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.697174) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 923135 bytes OK
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.697209) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.699147) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.699172) EVENT_LOG_v1 {"time_micros": 1771581267699166, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.699204) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 1408047, prev total WAL file size 1408047, number of live WAL files 2.
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.700133) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(901KB)], [27(17MB)]
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267700199, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 18824993, "oldest_snapshot_seqno": -1}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12202 keys, 15913972 bytes, temperature: kUnknown
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267774019, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 15913972, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15845959, "index_size": 36424, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30533, "raw_key_size": 328079, "raw_average_key_size": 26, "raw_value_size": 15639612, "raw_average_value_size": 1281, "num_data_blocks": 1377, "num_entries": 12202, "num_filter_entries": 12202, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.774332) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 15913972 bytes
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.776060) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 254.6 rd, 215.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 17.1 +0.0 blob) out(15.2 +0.0 blob), read-write-amplify(37.6) write-amplify(17.2) OK, records in: 12729, records dropped: 527 output_compression: NoCompression
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.776088) EVENT_LOG_v1 {"time_micros": 1771581267776075, "job": 14, "event": "compaction_finished", "compaction_time_micros": 73936, "compaction_time_cpu_micros": 42823, "output_level": 6, "num_output_files": 1, "total_output_size": 15913972, "num_input_records": 12729, "num_output_records": 12202, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267776348, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581267778837, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.700021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:54:27.778959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: osdmap e125: 6 total, 6 up, 6 in
Feb 20 09:54:27 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e126 e126: 6 total, 6 up, 6 in
Feb 20 09:54:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:54:28 np0005625204.localdomain podman[312109]: 2026-02-20 09:54:28.161747585 +0000 UTC m=+0.092809846 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, version=9.7, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, container_name=openstack_network_exporter)
Feb 20 09:54:28 np0005625204.localdomain podman[312109]: 2026-02-20 09:54:28.179982728 +0000 UTC m=+0.111044999 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, config_id=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.7, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9)
Feb 20 09:54:28 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:54:28 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:28.781 2 INFO neutron.agent.securitygroups_rpc [None req-3a0ac74b-e97f-4840-a2be-cf3db56b29ba eed45d0e6e9a4013a0e822ffa85bb5cb 13f7a9ed49974d1596cd7746bdf2e7c4 - - default default] Security group rule updated ['92258b95-63d5-4c8a-9734-555bdc627d97']
Feb 20 09:54:28 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e127 e127: 6 total, 6 up, 6 in
Feb 20 09:54:28 np0005625204.localdomain ceph-mon[301857]: pgmap v219: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 14 KiB/s rd, 383 B/s wr, 18 op/s
Feb 20 09:54:28 np0005625204.localdomain ceph-mon[301857]: osdmap e126: 6 total, 6 up, 6 in
Feb 20 09:54:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:29.163 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:29 np0005625204.localdomain ceph-mon[301857]: osdmap e127: 6 total, 6 up, 6 in
Feb 20 09:54:29 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e128 e128: 6 total, 6 up, 6 in
Feb 20 09:54:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e129 e129: 6 total, 6 up, 6 in
Feb 20 09:54:31 np0005625204.localdomain ceph-mon[301857]: pgmap v222: 177 pgs: 177 active+clean; 145 MiB data, 759 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 5.7 KiB/s wr, 63 op/s
Feb 20 09:54:31 np0005625204.localdomain ceph-mon[301857]: osdmap e128: 6 total, 6 up, 6 in
Feb 20 09:54:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:31 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:31.622 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:32 np0005625204.localdomain ceph-mon[301857]: osdmap e129: 6 total, 6 up, 6 in
Feb 20 09:54:32 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e130 e130: 6 total, 6 up, 6 in
Feb 20 09:54:33 np0005625204.localdomain ceph-mon[301857]: pgmap v225: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 121 KiB/s rd, 22 KiB/s wr, 175 op/s
Feb 20 09:54:33 np0005625204.localdomain ceph-mon[301857]: osdmap e130: 6 total, 6 up, 6 in
Feb 20 09:54:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e131 e131: 6 total, 6 up, 6 in
Feb 20 09:54:34 np0005625204.localdomain ceph-mon[301857]: osdmap e131: 6 total, 6 up, 6 in
Feb 20 09:54:34 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:34.083 264355 INFO neutron.agent.linux.ip_lib [None req-09b59930-cf6c-44f0-b792-6736fd28ff7b - - - - - -] Device tapf0ad1ac2-f9 cannot be used as it has no MAC address
Feb 20 09:54:34 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e132 e132: 6 total, 6 up, 6 in
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:34 np0005625204.localdomain kernel: device tapf0ad1ac2-f9 entered promiscuous mode
Feb 20 09:54:34 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581274.1500] manager: (tapf0ad1ac2-f9): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.153 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:34 np0005625204.localdomain systemd-udevd[312138]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:54:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:34Z|00191|binding|INFO|Claiming lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa for this chassis.
Feb 20 09:54:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:34Z|00192|binding|INFO|f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa: Claiming unknown
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.167 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:34Z|00193|binding|INFO|Setting lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa ovn-installed in OVS
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.198 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:34Z|00194|binding|INFO|Setting lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa up in Southbound
Feb 20 09:54:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:34.210 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189489f9-7d29-4419-9e31-fd1fec55fc46, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:34.213 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa in datapath d6b1eef5-3137-454f-8164-8278293c350a bound to our chassis
Feb 20 09:54:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:34.215 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network d6b1eef5-3137-454f-8164-8278293c350a or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:34.216 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[03d05a97-e110-4289-88ad-b7a02bb85872]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.244 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:34.274 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:54:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:54:35 np0005625204.localdomain ceph-mon[301857]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 760 MiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 16 KiB/s wr, 111 op/s
Feb 20 09:54:35 np0005625204.localdomain ceph-mon[301857]: osdmap e132: 6 total, 6 up, 6 in
Feb 20 09:54:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e133 e133: 6 total, 6 up, 6 in
Feb 20 09:54:35 np0005625204.localdomain podman[312186]: 2026-02-20 09:54:35.182209698 +0000 UTC m=+0.115101973 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:35 np0005625204.localdomain podman[312186]: 2026-02-20 09:54:35.272967481 +0000 UTC m=+0.205859706 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller)
Feb 20 09:54:35 np0005625204.localdomain podman[312213]: 
Feb 20 09:54:35 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:54:35 np0005625204.localdomain podman[312213]: 2026-02-20 09:54:35.283669555 +0000 UTC m=+0.129385626 container create 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:35 np0005625204.localdomain podman[312187]: 2026-02-20 09:54:35.237153265 +0000 UTC m=+0.165152962 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 09:54:35 np0005625204.localdomain podman[312187]: 2026-02-20 09:54:35.323072091 +0000 UTC m=+0.251071808 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:54:35 np0005625204.localdomain podman[312213]: 2026-02-20 09:54:35.236181315 +0000 UTC m=+0.081897386 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:54:35 np0005625204.localdomain systemd[1]: Started libpod-conmon-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c.scope.
Feb 20 09:54:35 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:54:35 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:54:35 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58eee122d4e0dd04c061f43cc5e2ecc11b5cf7d4c3051551afe09a5a3c5f2a8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:54:35 np0005625204.localdomain podman[312213]: 2026-02-20 09:54:35.375693548 +0000 UTC m=+0.221409599 container init 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:54:35 np0005625204.localdomain podman[312213]: 2026-02-20 09:54:35.38469155 +0000 UTC m=+0.230407601 container start 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:54:35 np0005625204.localdomain dnsmasq[312254]: started, version 2.85 cachesize 150
Feb 20 09:54:35 np0005625204.localdomain dnsmasq[312254]: DNS service limited to local subnets
Feb 20 09:54:35 np0005625204.localdomain dnsmasq[312254]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:54:35 np0005625204.localdomain dnsmasq[312254]: warning: no upstream servers configured
Feb 20 09:54:35 np0005625204.localdomain dnsmasq-dhcp[312254]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:54:35 np0005625204.localdomain dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 0 addresses
Feb 20 09:54:35 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host
Feb 20 09:54:35 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts
Feb 20 09:54:35 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:35.593 264355 INFO neutron.agent.dhcp.agent [None req-ec2a78f5-7302-40f5-82f5-46c38fc7596e - - - - - -] DHCP configuration for ports {'bb02ac54-616f-471c-81a6-165ac4648db9'} is completed
Feb 20 09:54:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e134 e134: 6 total, 6 up, 6 in
Feb 20 09:54:36 np0005625204.localdomain ceph-mon[301857]: osdmap e133: 6 total, 6 up, 6 in
Feb 20 09:54:36 np0005625204.localdomain systemd[1]: tmp-crun.BhWUNY.mount: Deactivated successfully.
Feb 20 09:54:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:37 np0005625204.localdomain ceph-mon[301857]: pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 190 KiB/s rd, 19 KiB/s wr, 262 op/s
Feb 20 09:54:37 np0005625204.localdomain ceph-mon[301857]: osdmap e134: 6 total, 6 up, 6 in
Feb 20 09:54:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e135 e135: 6 total, 6 up, 6 in
Feb 20 09:54:37 np0005625204.localdomain sudo[312255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:54:37 np0005625204.localdomain sudo[312255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:54:37 np0005625204.localdomain sudo[312255]: pam_unix(sudo:session): session closed for user root
Feb 20 09:54:37 np0005625204.localdomain sudo[312273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 09:54:37 np0005625204.localdomain sudo[312273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:54:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:37.638 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e136 e136: 6 total, 6 up, 6 in
Feb 20 09:54:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:37.683 264355 INFO neutron.agent.linux.ip_lib [None req-0019394c-92b8-4fed-9379-791875aaf1b9 - - - - - -] Device tap626e3303-a1 cannot be used as it has no MAC address
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.751 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:37 np0005625204.localdomain kernel: device tap626e3303-a1 entered promiscuous mode
Feb 20 09:54:37 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581277.7623] manager: (tap626e3303-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Feb 20 09:54:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:37Z|00195|binding|INFO|Claiming lport 626e3303-a120-4467-a686-694c8014af7a for this chassis.
Feb 20 09:54:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:37Z|00196|binding|INFO|626e3303-a120-4467-a686-694c8014af7a: Claiming unknown
Feb 20 09:54:37 np0005625204.localdomain systemd-udevd[312301]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:37Z|00197|binding|INFO|Setting lport 626e3303-a120-4467-a686-694c8014af7a ovn-installed in OVS
Feb 20 09:54:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:37Z|00198|binding|INFO|Setting lport 626e3303-a120-4467-a686-694c8014af7a up in Southbound
Feb 20 09:54:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:37.778 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d454f9be-83f3-4c60-8509-6d14a02fd7f5, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=626e3303-a120-4467-a686-694c8014af7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.780 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.781 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:37.782 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 626e3303-a120-4467-a686-694c8014af7a in datapath 2f1c353d-8de5-4616-8b20-8c686b261d9f bound to our chassis
Feb 20 09:54:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:37.785 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 9fb30275-dfb9-4635-9229-a9a2263cae49 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:54:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:37.786 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f1c353d-8de5-4616-8b20-8c686b261d9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:54:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:37.790 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8b584219-7a80-432b-aae2-b4b948bff54f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.807 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.858 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:37.889 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain sshd[312330]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:54:38 np0005625204.localdomain sudo[312273]: pam_unix(sudo:session): session closed for user root
Feb 20 09:54:38 np0005625204.localdomain ceph-mon[301857]: osdmap e135: 6 total, 6 up, 6 in
Feb 20 09:54:38 np0005625204.localdomain ceph-mon[301857]: osdmap e136: 6 total, 6 up, 6 in
Feb 20 09:54:38 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:38 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:38 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:38 np0005625204.localdomain sshd[312330]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:54:38 np0005625204.localdomain sudo[312347]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:54:38 np0005625204.localdomain sudo[312347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:54:38 np0005625204.localdomain sudo[312347]: pam_unix(sudo:session): session closed for user root
Feb 20 09:54:38 np0005625204.localdomain sudo[312370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:54:38 np0005625204.localdomain sudo[312370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:54:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:38.564 264355 INFO neutron.agent.linux.ip_lib [None req-5286dd56-94be-439a-8d96-c43df7ef8ddc - - - - - -] Device tap090441cf-8d cannot be used as it has no MAC address
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.586 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain kernel: device tap090441cf-8d entered promiscuous mode
Feb 20 09:54:38 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581278.5943] manager: (tap090441cf-8d): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.596 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:38Z|00199|binding|INFO|Claiming lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 for this chassis.
Feb 20 09:54:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:38Z|00200|binding|INFO|090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7: Claiming unknown
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:38.615 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a64fc1d2-8a52-4707-811c-fcf1e047c8d6, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:38.617 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 in datapath 5bd7ee4d-003e-46a4-8831-dc1b39078c68 bound to our chassis
Feb 20 09:54:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:38.619 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bd7ee4d-003e-46a4-8831-dc1b39078c68 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:38 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:38.620 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[435c0c98-7843-406c-969c-a4e9032df93b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:38Z|00201|binding|INFO|Setting lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 ovn-installed in OVS
Feb 20 09:54:38 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:38Z|00202|binding|INFO|Setting lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 up in Southbound
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.641 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.701 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e137 e137: 6 total, 6 up, 6 in
Feb 20 09:54:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:38.729 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:38Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5b03850>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5b03d00>], id=bc970bd1-c7c2-4901-b5ca-da9674328188, ip_allocation=immediate, mac_address=fa:16:3e:4d:17:27, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:30Z, description=, dns_domain=, id=d6b1eef5-3137-454f-8164-8278293c350a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-959522799, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20486, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1744, status=ACTIVE, subnets=['35b82aef-e2db-412e-8a11-c85033ae6cf6'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:33Z, vlan_transparent=None, network_id=d6b1eef5-3137-454f-8164-8278293c350a, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1823, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:38Z on network d6b1eef5-3137-454f-8164-8278293c350a
Feb 20 09:54:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:38.739 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:38 np0005625204.localdomain podman[312446]: 
Feb 20 09:54:38 np0005625204.localdomain sshd[312482]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:54:38 np0005625204.localdomain podman[312446]: 2026-02-20 09:54:38.923120723 +0000 UTC m=+0.151346503 container create 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:54:38 np0005625204.localdomain podman[312446]: 2026-02-20 09:54:38.82775297 +0000 UTC m=+0.055978760 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:54:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86.scope.
Feb 20 09:54:38 np0005625204.localdomain dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 1 addresses
Feb 20 09:54:38 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host
Feb 20 09:54:38 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts
Feb 20 09:54:38 np0005625204.localdomain podman[312484]: 2026-02-20 09:54:38.979261896 +0000 UTC m=+0.070264133 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:39 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:54:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58f04d840f8baaa0f71bc8773e36a918813b860415af52dc0d268b64c80757aa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:54:39 np0005625204.localdomain podman[312446]: 2026-02-20 09:54:39.013924608 +0000 UTC m=+0.242150378 container init 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:54:39 np0005625204.localdomain podman[312446]: 2026-02-20 09:54:39.025703696 +0000 UTC m=+0.253929446 container start 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312526]: started, version 2.85 cachesize 150
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312526]: DNS service limited to local subnets
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312526]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312526]: warning: no upstream servers configured
Feb 20 09:54:39 np0005625204.localdomain dnsmasq-dhcp[312526]: DHCP, static leases only on 10.101.0.0, lease time 1d
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 0 addresses
Feb 20 09:54:39 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host
Feb 20 09:54:39 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts
Feb 20 09:54:39 np0005625204.localdomain sudo[312370]: pam_unix(sudo:session): session closed for user root
Feb 20 09:54:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:39.204 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 778 MiB used, 41 GiB / 42 GiB avail; 190 KiB/s rd, 19 KiB/s wr, 262 op/s
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: osdmap e137: 6 total, 6 up, 6 in
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:54:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:39.211 264355 INFO neutron.agent.dhcp.agent [None req-3e8063b1-0647-422d-99c5-413d5e36a215 - - - - - -] DHCP configuration for ports {'ba50db7a-b774-4bda-9089-f43c5b7188aa'} is completed
Feb 20 09:54:39 np0005625204.localdomain sudo[312545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:54:39 np0005625204.localdomain sudo[312545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:54:39 np0005625204.localdomain sudo[312545]: pam_unix(sudo:session): session closed for user root
Feb 20 09:54:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:39.464 264355 INFO neutron.agent.dhcp.agent [None req-a611ad1f-9bb6-4df4-aa6f-29453444ecfc - - - - - -] DHCP configuration for ports {'bc970bd1-c7c2-4901-b5ca-da9674328188'} is completed
Feb 20 09:54:39 np0005625204.localdomain podman[312585]: 
Feb 20 09:54:39 np0005625204.localdomain podman[312585]: 2026-02-20 09:54:39.66758687 +0000 UTC m=+0.092525739 container create f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 20 09:54:39 np0005625204.localdomain systemd[1]: Started libpod-conmon-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e.scope.
Feb 20 09:54:39 np0005625204.localdomain podman[312585]: 2026-02-20 09:54:39.624704029 +0000 UTC m=+0.049642898 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:54:39 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:54:39 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/827134bce1983498aa5eb328de0e6b4c29e8835a5007ea6118894d8813f2ece8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:54:39 np0005625204.localdomain podman[312585]: 2026-02-20 09:54:39.741907565 +0000 UTC m=+0.166846394 container init f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:39 np0005625204.localdomain podman[312585]: 2026-02-20 09:54:39.751428623 +0000 UTC m=+0.176367442 container start f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312603]: started, version 2.85 cachesize 150
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312603]: DNS service limited to local subnets
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312603]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312603]: warning: no upstream servers configured
Feb 20 09:54:39 np0005625204.localdomain dnsmasq-dhcp[312603]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:54:39 np0005625204.localdomain dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 0 addresses
Feb 20 09:54:39 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host
Feb 20 09:54:39 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts
Feb 20 09:54:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:39.921 264355 INFO neutron.agent.dhcp.agent [None req-09be6d0d-2407-4c31-a111-375f4e7015bf - - - - - -] DHCP configuration for ports {'99bc2d50-f443-4862-87f7-b71e29668ed0'} is completed
Feb 20 09:54:39 np0005625204.localdomain sshd[312482]: Invalid user superuser from 203.228.30.198 port 47780
Feb 20 09:54:40 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:40.155 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:39Z, description=, device_id=112ff0d9-32cb-4021-a1c4-20c4dbfa4300, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59de730>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59deca0>], id=d763ba6c-a99a-441a-a009-8982c2eb3ef5, ip_allocation=immediate, mac_address=fa:16:3e:33:11:a0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:35Z, description=, dns_domain=, id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1153125537, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1798, status=ACTIVE, subnets=['3c327061-4348-4983-bd3a-abe5ea80ff0e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:37Z, vlan_transparent=None, network_id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1832, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:39Z on network 5bd7ee4d-003e-46a4-8831-dc1b39078c68
Feb 20 09:54:40 np0005625204.localdomain sshd[312482]: Received disconnect from 203.228.30.198 port 47780:11: Bye Bye [preauth]
Feb 20 09:54:40 np0005625204.localdomain sshd[312482]: Disconnected from invalid user superuser 203.228.30.198 port 47780 [preauth]
Feb 20 09:54:40 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1736566385' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:40 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1736566385' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:40 np0005625204.localdomain dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 1 addresses
Feb 20 09:54:40 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host
Feb 20 09:54:40 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts
Feb 20 09:54:40 np0005625204.localdomain podman[312620]: 2026-02-20 09:54:40.355701066 +0000 UTC m=+0.067193229 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:54:40 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:40.576 264355 INFO neutron.agent.dhcp.agent [None req-0a4937a7-4a0d-44d6-9a86-965f292ccade - - - - - -] DHCP configuration for ports {'d763ba6c-a99a-441a-a009-8982c2eb3ef5'} is completed
Feb 20 09:54:40 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:40.743 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:38Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a02400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a02460>], id=bc970bd1-c7c2-4901-b5ca-da9674328188, ip_allocation=immediate, mac_address=fa:16:3e:4d:17:27, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:30Z, description=, dns_domain=, id=d6b1eef5-3137-454f-8164-8278293c350a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-959522799, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=20486, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1744, status=ACTIVE, subnets=['35b82aef-e2db-412e-8a11-c85033ae6cf6'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:33Z, vlan_transparent=None, network_id=d6b1eef5-3137-454f-8164-8278293c350a, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1823, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:38Z on network d6b1eef5-3137-454f-8164-8278293c350a
Feb 20 09:54:40 np0005625204.localdomain podman[312657]: 2026-02-20 09:54:40.994495447 +0000 UTC m=+0.063127057 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:40 np0005625204.localdomain dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 1 addresses
Feb 20 09:54:40 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host
Feb 20 09:54:40 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts
Feb 20 09:54:41 np0005625204.localdomain ceph-mon[301857]: pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 782 MiB used, 41 GiB / 42 GiB avail; 145 KiB/s rd, 8.7 KiB/s wr, 188 op/s
Feb 20 09:54:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:41.261 264355 INFO neutron.agent.dhcp.agent [None req-086cf761-f22a-46ed-9f9f-3a174ca6583d - - - - - -] DHCP configuration for ports {'bc970bd1-c7c2-4901-b5ca-da9674328188'} is completed
Feb 20 09:54:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:41.499 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:39Z, description=, device_id=112ff0d9-32cb-4021-a1c4-20c4dbfa4300, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59aa130>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59aa370>], id=d763ba6c-a99a-441a-a009-8982c2eb3ef5, ip_allocation=immediate, mac_address=fa:16:3e:33:11:a0, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:35Z, description=, dns_domain=, id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1153125537, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55649, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1798, status=ACTIVE, subnets=['3c327061-4348-4983-bd3a-abe5ea80ff0e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:37Z, vlan_transparent=None, network_id=5bd7ee4d-003e-46a4-8831-dc1b39078c68, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1832, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:39Z on network 5bd7ee4d-003e-46a4-8831-dc1b39078c68
Feb 20 09:54:41 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:41.593 2 INFO neutron.agent.securitygroups_rpc [None req-eb743b9b-8319-4b6c-9522-759dec99d8f5 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:54:41 np0005625204.localdomain dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 1 addresses
Feb 20 09:54:41 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host
Feb 20 09:54:41 np0005625204.localdomain podman[312695]: 2026-02-20 09:54:41.697414722 +0000 UTC m=+0.065968983 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:54:41 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts
Feb 20 09:54:41 np0005625204.localdomain systemd[1]: tmp-crun.rEv27L.mount: Deactivated successfully.
Feb 20 09:54:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:41.919 264355 INFO neutron.agent.dhcp.agent [None req-134f6b14-a764-4e56-9a96-d6340411d015 - - - - - -] DHCP configuration for ports {'d763ba6c-a99a-441a-a009-8982c2eb3ef5'} is completed
Feb 20 09:54:42 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:42.259 2 INFO neutron.agent.securitygroups_rpc [None req-b2b645bc-d759-4ba7-b30a-1010fa24d49e 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:54:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:42.259 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:41Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59ae160>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59aab50>], id=931018ff-773f-42ec-b764-15a4ad18e505, ip_allocation=immediate, mac_address=fa:16:3e:7e:b4:55, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:31Z, description=, dns_domain=, id=2f1c353d-8de5-4616-8b20-8c686b261d9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-571545747, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38343, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1765, status=ACTIVE, subnets=['2b7f7d2f-8d57-4c58-a9a5-42e3b231614b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:35Z, vlan_transparent=None, network_id=2f1c353d-8de5-4616-8b20-8c686b261d9f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:42Z on network 2f1c353d-8de5-4616-8b20-8c686b261d9f
Feb 20 09:54:42 np0005625204.localdomain podman[312733]: 2026-02-20 09:54:42.466895818 +0000 UTC m=+0.057869747 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 20 09:54:42 np0005625204.localdomain dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 1 addresses
Feb 20 09:54:42 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host
Feb 20 09:54:42 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts
Feb 20 09:54:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:54:42 np0005625204.localdomain podman[312748]: 2026-02-20 09:54:42.579793903 +0000 UTC m=+0.085753313 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute)
Feb 20 09:54:42 np0005625204.localdomain podman[312748]: 2026-02-20 09:54:42.615172416 +0000 UTC m=+0.121131846 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:54:42 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:54:42 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e138 e138: 6 total, 6 up, 6 in
Feb 20 09:54:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:42.761 264355 INFO neutron.agent.dhcp.agent [None req-d780b5c2-caee-45a1-9727-7cc1f4f29673 - - - - - -] DHCP configuration for ports {'931018ff-773f-42ec-b764-15a4ad18e505'} is completed
Feb 20 09:54:43 np0005625204.localdomain ceph-mon[301857]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 217 KiB/s rd, 18 KiB/s wr, 300 op/s
Feb 20 09:54:43 np0005625204.localdomain ceph-mon[301857]: osdmap e138: 6 total, 6 up, 6 in
Feb 20 09:54:43 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:43.822 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:41Z, description=, device_id=53f382fa-9b2a-4c2c-b53d-7f58d82124d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a58ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a02c10>], id=931018ff-773f-42ec-b764-15a4ad18e505, ip_allocation=immediate, mac_address=fa:16:3e:7e:b4:55, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:31Z, description=, dns_domain=, id=2f1c353d-8de5-4616-8b20-8c686b261d9f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-571545747, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=38343, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1765, status=ACTIVE, subnets=['2b7f7d2f-8d57-4c58-a9a5-42e3b231614b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:35Z, vlan_transparent=None, network_id=2f1c353d-8de5-4616-8b20-8c686b261d9f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1844, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:54:42Z on network 2f1c353d-8de5-4616-8b20-8c686b261d9f
Feb 20 09:54:44 np0005625204.localdomain dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 1 addresses
Feb 20 09:54:44 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host
Feb 20 09:54:44 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts
Feb 20 09:54:44 np0005625204.localdomain podman[312788]: 2026-02-20 09:54:44.061762995 +0000 UTC m=+0.069383676 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:54:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:44.206 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:54:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:44.208 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:54:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:44.208 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:54:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:44.209 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:54:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:44.241 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:44.241 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:54:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:44.304 264355 INFO neutron.agent.dhcp.agent [None req-95e5c5c7-e3a2-4786-bb3a-add350ccd39a - - - - - -] DHCP configuration for ports {'931018ff-773f-42ec-b764-15a4ad18e505'} is completed
Feb 20 09:54:44 np0005625204.localdomain ceph-mon[301857]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 194 KiB/s rd, 16 KiB/s wr, 268 op/s
Feb 20 09:54:44 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:54:45 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e139 e139: 6 total, 6 up, 6 in
Feb 20 09:54:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:46 np0005625204.localdomain dnsmasq[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/addn_hosts - 0 addresses
Feb 20 09:54:46 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/host
Feb 20 09:54:46 np0005625204.localdomain dnsmasq-dhcp[312526]: read /var/lib/neutron/dhcp/2f1c353d-8de5-4616-8b20-8c686b261d9f/opts
Feb 20 09:54:46 np0005625204.localdomain podman[312826]: 2026-02-20 09:54:46.4255235 +0000 UTC m=+0.060058433 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true)
Feb 20 09:54:46 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:46.612 264355 INFO neutron.agent.linux.ip_lib [None req-bd819d36-1606-418f-9829-aaeaae7eb8a0 - - - - - -] Device tap3f9955f7-a5 cannot be used as it has no MAC address
Feb 20 09:54:46 np0005625204.localdomain ceph-mon[301857]: pgmap v241: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 153 KiB/s rd, 13 KiB/s wr, 213 op/s
Feb 20 09:54:46 np0005625204.localdomain ceph-mon[301857]: osdmap e139: 6 total, 6 up, 6 in
Feb 20 09:54:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:46.637 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:46 np0005625204.localdomain kernel: device tap3f9955f7-a5 entered promiscuous mode
Feb 20 09:54:46 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:46Z|00203|binding|INFO|Claiming lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 for this chassis.
Feb 20 09:54:46 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:46Z|00204|binding|INFO|3f9955f7-a558-438e-b36a-2dfdb0fa6f03: Claiming unknown
Feb 20 09:54:46 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581286.6460] manager: (tap3f9955f7-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Feb 20 09:54:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:46.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:46 np0005625204.localdomain systemd-udevd[312858]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:54:46 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:46.657 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7365eb83b07c4401a5a58afb3f122ce5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5653766c-f9f7-4ea8-b60b-d59b52335179, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=3f9955f7-a558-438e-b36a-2dfdb0fa6f03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:46 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:46.659 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 in datapath c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 bound to our chassis
Feb 20 09:54:46 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:46.661 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:46 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:46.662 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c4de329f-6eb6-46bf-827e-1913efa5eb92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:46Z|00205|binding|INFO|Setting lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 ovn-installed in OVS
Feb 20 09:54:46 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:46Z|00206|binding|INFO|Setting lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 up in Southbound
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:46.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap3f9955f7-a5: No such device
Feb 20 09:54:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:46.735 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:46.763 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:46 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:46.912 2 INFO neutron.agent.securitygroups_rpc [None req-bce28f21-3f23-462f-a383-948742167547 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:54:47 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:47Z|00207|binding|INFO|Releasing lport 626e3303-a120-4467-a686-694c8014af7a from this chassis (sb_readonly=0)
Feb 20 09:54:47 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:47Z|00208|binding|INFO|Setting lport 626e3303-a120-4467-a686-694c8014af7a down in Southbound
Feb 20 09:54:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:47.067 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:47 np0005625204.localdomain kernel: device tap626e3303-a1 left promiscuous mode
Feb 20 09:54:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:47.075 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2f1c353d-8de5-4616-8b20-8c686b261d9f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d454f9be-83f3-4c60-8509-6d14a02fd7f5, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=626e3303-a120-4467-a686-694c8014af7a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:47.077 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 626e3303-a120-4467-a686-694c8014af7a in datapath 2f1c353d-8de5-4616-8b20-8c686b261d9f unbound from our chassis
Feb 20 09:54:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:47.080 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2f1c353d-8de5-4616-8b20-8c686b261d9f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:54:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:47.081 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3bf18d-8d9b-4e0e-bc55-f74a528fb939]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:47.090 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:47 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:47.554 2 INFO neutron.agent.securitygroups_rpc [None req-4cd6b0db-c578-42d5-a167-b93bcbfe0117 d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']
Feb 20 09:54:47 np0005625204.localdomain podman[312930]: 
Feb 20 09:54:47 np0005625204.localdomain podman[312930]: 2026-02-20 09:54:47.581615644 +0000 UTC m=+0.066932841 container create 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:54:47 np0005625204.localdomain systemd[1]: Started libpod-conmon-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6.scope.
Feb 20 09:54:47 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:54:47 np0005625204.localdomain podman[312930]: 2026-02-20 09:54:47.555880974 +0000 UTC m=+0.041198151 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:54:47 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49439000619e910f9d57f4aa05bb363f13b2b24373c80bde35cfff75a4113f51/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:54:47 np0005625204.localdomain podman[312930]: 2026-02-20 09:54:47.667107419 +0000 UTC m=+0.152424626 container init 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:54:47 np0005625204.localdomain podman[312930]: 2026-02-20 09:54:47.676792902 +0000 UTC m=+0.162110099 container start 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:47 np0005625204.localdomain dnsmasq[312949]: started, version 2.85 cachesize 150
Feb 20 09:54:47 np0005625204.localdomain dnsmasq[312949]: DNS service limited to local subnets
Feb 20 09:54:47 np0005625204.localdomain dnsmasq[312949]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:54:47 np0005625204.localdomain dnsmasq[312949]: warning: no upstream servers configured
Feb 20 09:54:47 np0005625204.localdomain dnsmasq-dhcp[312949]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:54:47 np0005625204.localdomain dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 0 addresses
Feb 20 09:54:47 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host
Feb 20 09:54:47 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts
Feb 20 09:54:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:54:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:54:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:54:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162538 "" "Go-http-client/1.1"
Feb 20 09:54:47 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:47.740 264355 INFO neutron.agent.dhcp.agent [None req-bd819d36-1606-418f-9829-aaeaae7eb8a0 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:46Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59aed30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59ae970>], id=21b880c8-0671-49bf-908b-cd75a9b606ab, ip_allocation=immediate, mac_address=fa:16:3e:07:f9:a2, name=tempest-ExtraDHCPOptionsIpV6TestJSON-528365005, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:44Z, description=, dns_domain=, id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1165484479, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42841, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1866, status=ACTIVE, subnets=['271aec3b-f42d-4679-a514-3cc525446f17'], tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:45Z, vlan_transparent=None, network_id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['921ea913-835d-427d-a3f2-d35699dcd043'], standard_attr_id=1888, status=DOWN, tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:47Z on network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1
Feb 20 09:54:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:54:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20237 "" "Go-http-client/1.1"
Feb 20 09:54:47 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:47.895 264355 INFO neutron.agent.dhcp.agent [None req-2b239739-ba0e-4120-bee5-726985ba6e06 - - - - - -] DHCP configuration for ports {'728721aa-7c0e-449a-a140-7666ef1d7539'} is completed
Feb 20 09:54:47 np0005625204.localdomain dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 1 addresses
Feb 20 09:54:47 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host
Feb 20 09:54:47 np0005625204.localdomain podman[312970]: 2026-02-20 09:54:47.995572994 +0000 UTC m=+0.116290170 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:54:47 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts
Feb 20 09:54:48 np0005625204.localdomain dnsmasq[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/addn_hosts - 0 addresses
Feb 20 09:54:48 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/host
Feb 20 09:54:48 np0005625204.localdomain podman[312994]: 2026-02-20 09:54:48.058719959 +0000 UTC m=+0.098445848 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:54:48 np0005625204.localdomain dnsmasq-dhcp[312603]: read /var/lib/neutron/dhcp/5bd7ee4d-003e-46a4-8831-dc1b39078c68/opts
Feb 20 09:54:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:48.256 264355 INFO neutron.agent.dhcp.agent [None req-a6bdfe51-e495-4b8f-a2e2-1d39131de0e8 - - - - - -] DHCP configuration for ports {'21b880c8-0671-49bf-908b-cd75a9b606ab'} is completed
Feb 20 09:54:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:48Z|00209|binding|INFO|Releasing lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 from this chassis (sb_readonly=0)
Feb 20 09:54:48 np0005625204.localdomain kernel: device tap090441cf-8d left promiscuous mode
Feb 20 09:54:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:48Z|00210|binding|INFO|Setting lport 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 down in Southbound
Feb 20 09:54:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:48.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:48.305 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bd7ee4d-003e-46a4-8831-dc1b39078c68', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a64fc1d2-8a52-4707-811c-fcf1e047c8d6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:48.307 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 090441cf-8d73-45d9-9ed4-4c2bb3a5aeb7 in datapath 5bd7ee4d-003e-46a4-8831-dc1b39078c68 unbound from our chassis
Feb 20 09:54:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:48.309 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 5bd7ee4d-003e-46a4-8831-dc1b39078c68 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:48.310 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a9a5b5-53f4-4f3c-9f94-601e3833dc04]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:48.313 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:48 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:48.441 2 INFO neutron.agent.securitygroups_rpc [None req-8cc87b89-3410-419f-80d0-39ae3addedda f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:54:48 np0005625204.localdomain dnsmasq[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/addn_hosts - 0 addresses
Feb 20 09:54:48 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/host
Feb 20 09:54:48 np0005625204.localdomain podman[313043]: 2026-02-20 09:54:48.671236033 +0000 UTC m=+0.060619981 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:48 np0005625204.localdomain dnsmasq-dhcp[312254]: read /var/lib/neutron/dhcp/d6b1eef5-3137-454f-8164-8278293c350a/opts
Feb 20 09:54:48 np0005625204.localdomain ceph-mon[301857]: pgmap v243: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 8.2 KiB/s wr, 113 op/s
Feb 20 09:54:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:48Z|00211|binding|INFO|Releasing lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa from this chassis (sb_readonly=0)
Feb 20 09:54:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:48Z|00212|binding|INFO|Setting lport f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa down in Southbound
Feb 20 09:54:48 np0005625204.localdomain kernel: device tapf0ad1ac2-f9 left promiscuous mode
Feb 20 09:54:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:48.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:49.009 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:49.075 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6b1eef5-3137-454f-8164-8278293c350a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=189489f9-7d29-4419-9e31-fd1fec55fc46, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:49.077 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f0ad1ac2-f9d8-4766-a0cf-b4605d3747aa in datapath d6b1eef5-3137-454f-8164-8278293c350a unbound from our chassis
Feb 20 09:54:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:49.081 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6b1eef5-3137-454f-8164-8278293c350a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:54:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:49.082 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[05e91604-94f4-48b9-844e-9d095f79daff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:49.242 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:49.244 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:49 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:49.522 2 INFO neutron.agent.securitygroups_rpc [None req-fe10c928-22e1-439f-ab59-773382d09580 d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']
Feb 20 09:54:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.858 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:48Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5aa2970>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df594c460>, <neutron.agent.linux.dhcp.DictModel object at 0x7f3df62ba940>, <neutron.agent.linux.dhcp.DictModel object at 0x7f3df5aa2790>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df594c670>], id=4d0355d6-dfb2-4565-94cc-eebabb872f93, ip_allocation=immediate, mac_address=fa:16:3e:eb:1d:72, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1482054331, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:44Z, description=, dns_domain=, id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1165484479, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=42841, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1866, status=ACTIVE, subnets=['271aec3b-f42d-4679-a514-3cc525446f17'], tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:45Z, vlan_transparent=None, network_id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['921ea913-835d-427d-a3f2-d35699dcd043'], standard_attr_id=1891, 
status=DOWN, tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:48Z on network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1
Feb 20 09:54:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.883 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Feb 20 09:54:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.883 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Feb 20 09:54:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:49.884 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Feb 20 09:54:50 np0005625204.localdomain dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 2 addresses
Feb 20 09:54:50 np0005625204.localdomain podman[313082]: 2026-02-20 09:54:50.052701974 +0000 UTC m=+0.058815365 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:54:50 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host
Feb 20 09:54:50 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts
Feb 20 09:54:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:50.373 264355 INFO neutron.agent.dhcp.agent [None req-dcc37af5-f4a6-45fc-a6d2-2db6a24de490 - - - - - -] DHCP configuration for ports {'4d0355d6-dfb2-4565-94cc-eebabb872f93'} is completed
Feb 20 09:54:50 np0005625204.localdomain ceph-mon[301857]: pgmap v244: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 5.6 KiB/s rd, 1.4 KiB/s wr, 9 op/s
Feb 20 09:54:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3040086044' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3040086044' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:51 np0005625204.localdomain podman[313119]: 2026-02-20 09:54:51.244286726 +0000 UTC m=+0.063482697 container kill f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:54:51 np0005625204.localdomain dnsmasq[312603]: exiting on receipt of SIGTERM
Feb 20 09:54:51 np0005625204.localdomain systemd[1]: libpod-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e.scope: Deactivated successfully.
Feb 20 09:54:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:51 np0005625204.localdomain podman[313134]: 2026-02-20 09:54:51.322911432 +0000 UTC m=+0.059752145 container died f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:54:51 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e-userdata-shm.mount: Deactivated successfully.
Feb 20 09:54:51 np0005625204.localdomain podman[313134]: 2026-02-20 09:54:51.355328915 +0000 UTC m=+0.092169588 container cleanup f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:54:51 np0005625204.localdomain systemd[1]: libpod-conmon-f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e.scope: Deactivated successfully.
Feb 20 09:54:51 np0005625204.localdomain podman[313133]: 2026-02-20 09:54:51.394593846 +0000 UTC m=+0.128502970 container remove f3e20ebc48ac59f5b0ae63424e43661d80ec51f8d6f43c52b5680c3d06c6be0e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bd7ee4d-003e-46a4-8831-dc1b39078c68, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e140 e140: 6 total, 6 up, 6 in
Feb 20 09:54:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:51.943 264355 INFO neutron.agent.dhcp.agent [None req-78dd64e3-d426-4875-b095-eae114cb9051 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:52.142 2 INFO neutron.agent.securitygroups_rpc [None req-ef251e71-dc68-4712-81f3-aa7e64c344fa d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']
Feb 20 09:54:52 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-827134bce1983498aa5eb328de0e6b4c29e8835a5007ea6118894d8813f2ece8-merged.mount: Deactivated successfully.
Feb 20 09:54:52 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d5bd7ee4d\x2d003e\x2d46a4\x2d8831\x2ddc1b39078c68.mount: Deactivated successfully.
Feb 20 09:54:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:54:52 np0005625204.localdomain podman[313160]: 2026-02-20 09:54:52.362366838 +0000 UTC m=+0.088181087 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:54:52 np0005625204.localdomain podman[313160]: 2026-02-20 09:54:52.37033263 +0000 UTC m=+0.096146889 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:54:52 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:54:52 np0005625204.localdomain dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 1 addresses
Feb 20 09:54:52 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host
Feb 20 09:54:52 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts
Feb 20 09:54:52 np0005625204.localdomain podman[313200]: 2026-02-20 09:54:52.518766063 +0000 UTC m=+0.062867989 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:54:52 np0005625204.localdomain ceph-mon[301857]: pgmap v245: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 41 KiB/s rd, 2.3 KiB/s wr, 54 op/s
Feb 20 09:54:52 np0005625204.localdomain ceph-mon[301857]: osdmap e140: 6 total, 6 up, 6 in
Feb 20 09:54:52 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:52.925 2 INFO neutron.agent.securitygroups_rpc [None req-ffde2a61-a3bf-4d03-bb6c-67c1d37d15bf f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:54:53 np0005625204.localdomain dnsmasq[312526]: exiting on receipt of SIGTERM
Feb 20 09:54:53 np0005625204.localdomain podman[313237]: 2026-02-20 09:54:53.024360442 +0000 UTC m=+0.046409620 container kill 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: libpod-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86.scope: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain podman[313251]: 2026-02-20 09:54:53.09813421 +0000 UTC m=+0.058575269 container died 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:54:53 np0005625204.localdomain podman[313251]: 2026-02-20 09:54:53.126957105 +0000 UTC m=+0.087398124 container cleanup 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: libpod-conmon-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86.scope: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.152 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:53 np0005625204.localdomain podman[313253]: 2026-02-20 09:54:53.181110397 +0000 UTC m=+0.133380058 container remove 5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2f1c353d-8de5-4616-8b20-8c686b261d9f, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-58f04d840f8baaa0f71bc8773e36a918813b860415af52dc0d268b64c80757aa-merged.mount: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5de81b9d372a0271474e0fe3febb6adf88e5c1ab7ce8585c7575c2218373bb86-userdata-shm.mount: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:53Z|00213|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.485 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:46Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a41490>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a41ee0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a41310>, <neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a41550>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a41a00>], id=21b880c8-0671-49bf-908b-cd75a9b606ab, ip_allocation=immediate, mac_address=fa:16:3e:07:f9:a2, name=tempest-new-port-name-1177023162, network_id=c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, port_security_enabled=True, project_id=7365eb83b07c4401a5a58afb3f122ce5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['921ea913-835d-427d-a3f2-d35699dcd043'], standard_attr_id=1888, status=DOWN, tags=[], tenant_id=7365eb83b07c4401a5a58afb3f122ce5, updated_at=2026-02-20T09:54:53Z on network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.503 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.503 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.504 264355 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d2f1c353d\x2d8de5\x2d4616\x2d8b20\x2d8c686b261d9f.mount: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.519 264355 INFO neutron.agent.dhcp.agent [None req-015c8f29-66f1-4cdf-a119-07fa3585e028 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:53.520 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:53 np0005625204.localdomain dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 1 addresses
Feb 20 09:54:53 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host
Feb 20 09:54:53 np0005625204.localdomain podman[313299]: 2026-02-20 09:54:53.696278577 +0000 UTC m=+0.065260300 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:54:53 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts
Feb 20 09:54:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2344507398' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:54:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2344507398' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:54:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:54:53 np0005625204.localdomain dnsmasq[312254]: exiting on receipt of SIGTERM
Feb 20 09:54:53 np0005625204.localdomain podman[313332]: 2026-02-20 09:54:53.856953231 +0000 UTC m=+0.068469877 container kill 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: libpod-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c.scope: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain podman[313349]: 2026-02-20 09:54:53.930164152 +0000 UTC m=+0.056433573 container died 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:53 np0005625204.localdomain podman[313349]: 2026-02-20 09:54:53.96304799 +0000 UTC m=+0.089317421 container cleanup 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 20 09:54:53 np0005625204.localdomain systemd[1]: libpod-conmon-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c.scope: Deactivated successfully.
Feb 20 09:54:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:53.980 264355 INFO neutron.agent.dhcp.agent [None req-3cf73f7f-151c-4906-9f4f-8a81f24b7384 - - - - - -] DHCP configuration for ports {'21b880c8-0671-49bf-908b-cd75a9b606ab'} is completed
Feb 20 09:54:54 np0005625204.localdomain podman[313351]: 2026-02-20 09:54:54.01048619 +0000 UTC m=+0.129995685 container remove 1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-d6b1eef5-3137-454f-8164-8278293c350a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:54:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-58eee122d4e0dd04c061f43cc5e2ecc11b5cf7d4c3051551afe09a5a3c5f2a8d-merged.mount: Deactivated successfully.
Feb 20 09:54:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1af24634416e4978c9213b9e1b51b1f33ede6079f8631253d23625063bb5182c-userdata-shm.mount: Deactivated successfully.
Feb 20 09:54:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:54.245 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:54.248 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:54 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dd6b1eef5\x2d3137\x2d454f\x2d8164\x2d8278293c350a.mount: Deactivated successfully.
Feb 20 09:54:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:54.339 264355 INFO neutron.agent.dhcp.agent [None req-df89c7ed-7efc-4420-a977-ea6a863dcdba - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:54 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:54.597 2 INFO neutron.agent.securitygroups_rpc [None req-3a014f90-3d7f-4958-9a67-4182f974566f 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']
Feb 20 09:54:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:54.766 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3ad5ffb2-84e1-4f79-a933-80d3c5a63164", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:54:54 np0005625204.localdomain ceph-mon[301857]: pgmap v247: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 1.6 KiB/s wr, 52 op/s
Feb 20 09:54:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3ad5ffb2-84e1-4f79-a933-80d3c5a63164", "format": "json"}]: dispatch
Feb 20 09:54:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3514695046' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:54 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:54.960 2 INFO neutron.agent.securitygroups_rpc [None req-208eb96e-53cb-409e-b54b-a8915a18b91f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:54:55 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:55.016 2 INFO neutron.agent.securitygroups_rpc [None req-d0404fc6-f217-496c-b11d-08ac05ffdfb7 d89d4d1c90df43a18f75b47e31c78f62 7365eb83b07c4401a5a58afb3f122ce5 - - default default] Security group member updated ['921ea913-835d-427d-a3f2-d35699dcd043']
Feb 20 09:54:55 np0005625204.localdomain systemd[1]: tmp-crun.UXs7YT.mount: Deactivated successfully.
Feb 20 09:54:55 np0005625204.localdomain dnsmasq[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/addn_hosts - 0 addresses
Feb 20 09:54:55 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/host
Feb 20 09:54:55 np0005625204.localdomain podman[313393]: 2026-02-20 09:54:55.367092898 +0000 UTC m=+0.073074448 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:54:55 np0005625204.localdomain dnsmasq-dhcp[312949]: read /var/lib/neutron/dhcp/c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1/opts
Feb 20 09:54:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:55.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:55.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:54:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:55.760 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:54:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:55.760 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:54:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:55.761 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:54:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:55.762 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:54:55 np0005625204.localdomain ceph-mon[301857]: mgrmap e47: np0005625202.arwxwo(active, since 6m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:54:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3800725616' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:54:56 np0005625204.localdomain podman[313433]: 2026-02-20 09:54:56.143349329 +0000 UTC m=+0.082795063 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:54:56 np0005625204.localdomain podman[313433]: 2026-02-20 09:54:56.178299349 +0000 UTC m=+0.117745023 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:54:56 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3685988020' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.236 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.317 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.321 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:54:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:56.501 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:56 np0005625204.localdomain dnsmasq[312949]: exiting on receipt of SIGTERM
Feb 20 09:54:56 np0005625204.localdomain systemd[1]: tmp-crun.XUDdAV.mount: Deactivated successfully.
Feb 20 09:54:56 np0005625204.localdomain podman[313475]: 2026-02-20 09:54:56.513957703 +0000 UTC m=+0.081180064 container kill 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 20 09:54:56 np0005625204.localdomain systemd[1]: libpod-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6.scope: Deactivated successfully.
Feb 20 09:54:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:54:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:54:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:54:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:54:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:54:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.596 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:54:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:56.601 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.601 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11357MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": 
null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.603 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:54:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:56.603 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.603 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.605 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:56 np0005625204.localdomain podman[313495]: 2026-02-20 09:54:56.606591614 +0000 UTC m=+0.065832619 container died 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 20 09:54:56 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6-userdata-shm.mount: Deactivated successfully.
Feb 20 09:54:56 np0005625204.localdomain podman[313495]: 2026-02-20 09:54:56.65162792 +0000 UTC m=+0.110868875 container remove 087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:54:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:56Z|00214|binding|INFO|Releasing lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 from this chassis (sb_readonly=0)
Feb 20 09:54:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:56Z|00215|binding|INFO|Setting lport 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 down in Southbound
Feb 20 09:54:56 np0005625204.localdomain kernel: device tap3f9955f7-a5 left promiscuous mode
Feb 20 09:54:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:56.664 2 INFO neutron.agent.securitygroups_rpc [None req-36db6997-ce64-4f72-86bb-117bda3b0094 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:56.679 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7365eb83b07c4401a5a58afb3f122ce5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5653766c-f9f7-4ea8-b60b-d59b52335179, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=3f9955f7-a558-438e-b36a-2dfdb0fa6f03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:56.681 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 3f9955f7-a558-438e-b36a-2dfdb0fa6f03 in datapath c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 unbound from our chassis
Feb 20 09:54:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:56.684 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c4f5fcc5-eb5a-46dc-b828-35cec3b2f1e1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:56.685 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[65ffcdf2-71f4-4ef8-8cb9-c9003a38422b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.691 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.693 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:56 np0005625204.localdomain systemd[1]: libpod-conmon-087aef38a29d6a0b941f4183a9c329df7500780eb794d0c2d5ce56840b90faf6.scope: Deactivated successfully.
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.707 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:54:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:56.764 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: pgmap v248: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 4.8 KiB/s wr, 71 op/s
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3685988020' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3ad5ffb2-84e1-4f79-a933-80d3c5a63164", "format": "json"}]: dispatch
Feb 20 09:54:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3ad5ffb2-84e1-4f79-a933-80d3c5a63164", "force": true, "format": "json"}]: dispatch
Feb 20 09:54:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:54:57 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1996257721' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:57.272 264355 INFO neutron.agent.dhcp.agent [None req-fa7c5aed-dcab-4d30-acac-b168cc8df1ee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:57.272 264355 INFO neutron.agent.dhcp.agent [None req-fa7c5aed-dcab-4d30-acac-b168cc8df1ee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:57.273 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:54:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:57.278 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:54:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:57.292 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:54:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:57.293 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:54:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:57.293 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.690s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:54:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-49439000619e910f9d57f4aa05bb363f13b2b24373c80bde35cfff75a4113f51-merged.mount: Deactivated successfully.
Feb 20 09:54:57 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dc4f5fcc5\x2deb5a\x2d46dc\x2db828\x2d35cec3b2f1e1.mount: Deactivated successfully.
Feb 20 09:54:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e141 e141: 6 total, 6 up, 6 in
Feb 20 09:54:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:57.772 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:54:57 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:57.781 2 INFO neutron.agent.securitygroups_rpc [None req-a1f738f8-2f94-4ebf-b02a-59bcf9971aeb 51a4789e7d0b404b9882e0c26f7229be 1c44e13adebb4610b7c0cd2fdc62a5b7 - - default default] Security group member updated ['000c42d1-648a-4f56-b7e6-024a1e270fb9']
Feb 20 09:54:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1996257721' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:54:57 np0005625204.localdomain ceph-mon[301857]: osdmap e141: 6 total, 6 up, 6 in
Feb 20 09:54:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:58.002 264355 INFO neutron.agent.linux.ip_lib [None req-a0184e0e-748b-48ec-b6a6-b11d207ba4c4 - - - - - -] Device tapea1cc9c8-ce cannot be used as it has no MAC address
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain kernel: device tapea1cc9c8-ce entered promiscuous mode
Feb 20 09:54:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:58Z|00216|binding|INFO|Claiming lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe for this chassis.
Feb 20 09:54:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:58Z|00217|binding|INFO|ea1cc9c8-cec9-46f8-a9aa-5542762442fe: Claiming unknown
Feb 20 09:54:58 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581298.0484] manager: (tapea1cc9c8-ce): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain systemd-udevd[313548]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:54:58 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:58.064 2 INFO neutron.agent.securitygroups_rpc [None req-865b7338-5884-4631-b540-e349cbd12cfd f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:54:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:58.069 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa788937-4ffe-4167-9bda-a66bb7ab07d7, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ea1cc9c8-cec9-46f8-a9aa-5542762442fe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:54:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:58.071 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ea1cc9c8-cec9-46f8-a9aa-5542762442fe in datapath 2c7c971d-607d-4f86-ac60-49a788864bee bound to our chassis
Feb 20 09:54:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:58.075 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2c7c971d-607d-4f86-ac60-49a788864bee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:54:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:54:58.076 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[86931df8-1654-4cda-8801-e336e8c6c53b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.100 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:58Z|00218|binding|INFO|Setting lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe ovn-installed in OVS
Feb 20 09:54:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:58Z|00219|binding|INFO|Setting lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe up in Southbound
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.106 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.136 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.166 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.293 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.294 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:54:58Z|00220|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.454 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:58.724 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:58 np0005625204.localdomain ceph-mon[301857]: pgmap v249: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 4.7 KiB/s wr, 70 op/s
Feb 20 09:54:58 np0005625204.localdomain ceph-mon[301857]: mgrmap e48: np0005625202.arwxwo(active, since 6m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:54:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:54:59 np0005625204.localdomain podman[313603]: 
Feb 20 09:54:59 np0005625204.localdomain podman[313609]: 2026-02-20 09:54:59.163743615 +0000 UTC m=+0.096283523 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, release=1770267347)
Feb 20 09:54:59 np0005625204.localdomain podman[313609]: 2026-02-20 09:54:59.17612637 +0000 UTC m=+0.108666248 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-type=git, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.expose-services=)
Feb 20 09:54:59 np0005625204.localdomain podman[313603]: 2026-02-20 09:54:59.087904223 +0000 UTC m=+0.043787018 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:54:59 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:54:59 np0005625204.localdomain podman[313603]: 2026-02-20 09:54:59.198376746 +0000 UTC m=+0.154259471 container create b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:54:59 np0005625204.localdomain systemd[1]: Started libpod-conmon-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a.scope.
Feb 20 09:54:59 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:54:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:59.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:54:59 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1375106b10e9c61e0995150f814346f15b740b6f2eaa5b60963053b920ba7d52/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:54:59 np0005625204.localdomain podman[313603]: 2026-02-20 09:54:59.28919181 +0000 UTC m=+0.245074565 container init b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 09:54:59 np0005625204.localdomain podman[313603]: 2026-02-20 09:54:59.299085101 +0000 UTC m=+0.254967866 container start b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 20 09:54:59 np0005625204.localdomain dnsmasq[313641]: started, version 2.85 cachesize 150
Feb 20 09:54:59 np0005625204.localdomain dnsmasq[313641]: DNS service limited to local subnets
Feb 20 09:54:59 np0005625204.localdomain dnsmasq[313641]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:54:59 np0005625204.localdomain dnsmasq[313641]: warning: no upstream servers configured
Feb 20 09:54:59 np0005625204.localdomain dnsmasq-dhcp[313641]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:54:59 np0005625204.localdomain dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 0 addresses
Feb 20 09:54:59 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host
Feb 20 09:54:59 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts
Feb 20 09:54:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:59.366 264355 INFO neutron.agent.dhcp.agent [None req-a0184e0e-748b-48ec-b6a6-b11d207ba4c4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:57Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59cb4c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a6fc40>], id=c116158e-f1e4-44c0-bd21-ec8f4d275597, ip_allocation=immediate, mac_address=fa:16:3e:7c:7f:62, name=tempest-RoutersIpV6Test-1424937382, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:54:53Z, description=, dns_domain=, id=2c7c971d-607d-4f86-ac60-49a788864bee, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1709331935, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=11774, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1901, status=ACTIVE, subnets=['b4df3efd-04f4-4c14-80df-db94b5574306'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:56Z, vlan_transparent=None, network_id=2c7c971d-607d-4f86-ac60-49a788864bee, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['000c42d1-648a-4f56-b7e6-024a1e270fb9'], standard_attr_id=1917, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:57Z on network 2c7c971d-607d-4f86-ac60-49a788864bee
Feb 20 09:54:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:59.489 264355 INFO neutron.agent.dhcp.agent [None req-f112fb60-b282-4bcb-8ccf-e2d6a24a4f7a - - - - - -] DHCP configuration for ports {'586cb9ae-270d-4e56-9729-a053eb7c9410'} is completed
Feb 20 09:54:59 np0005625204.localdomain dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 1 addresses
Feb 20 09:54:59 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host
Feb 20 09:54:59 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts
Feb 20 09:54:59 np0005625204.localdomain podman[313660]: 2026-02-20 09:54:59.548002953 +0000 UTC m=+0.049929856 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:54:59 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:54:59.598 2 INFO neutron.agent.securitygroups_rpc [None req-1d54fa0a-8062-4c9c-94e3-875dfdd78a26 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:54:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:59.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:54:59.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:54:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:54:59.860 264355 INFO neutron.agent.dhcp.agent [None req-6849c88b-986f-4f18-b3a1-9ff7e4403be8 - - - - - -] DHCP configuration for ports {'c116158e-f1e4-44c0-bd21-ec8f4d275597'} is completed
Feb 20 09:55:00 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:00.453 2 INFO neutron.agent.securitygroups_rpc [None req-4a29bc3c-e3e4-4d65-b800-f3b558d9c704 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']
Feb 20 09:55:00 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:00.605 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:55:00 np0005625204.localdomain ceph-mon[301857]: pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 786 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 8.5 KiB/s wr, 36 op/s
Feb 20 09:55:01 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:01.082 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:54:57Z, description=, device_id=4cb71357-16b4-46c1-8592-93fdc1f8bfda, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59b23d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59b2d30>], id=c116158e-f1e4-44c0-bd21-ec8f4d275597, ip_allocation=immediate, mac_address=fa:16:3e:7c:7f:62, name=tempest-RoutersIpV6Test-1424937382, network_id=2c7c971d-607d-4f86-ac60-49a788864bee, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['000c42d1-648a-4f56-b7e6-024a1e270fb9'], standard_attr_id=1917, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:54:58Z on network 2c7c971d-607d-4f86-ac60-49a788864bee
Feb 20 09:55:01 np0005625204.localdomain dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 1 addresses
Feb 20 09:55:01 np0005625204.localdomain podman[313699]: 2026-02-20 09:55:01.276985399 +0000 UTC m=+0.060933610 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:55:01 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host
Feb 20 09:55:01 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts
Feb 20 09:55:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:01 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:01.609 264355 INFO neutron.agent.dhcp.agent [None req-22422d1a-79dd-4cc5-a069-0e9563348e6f - - - - - -] DHCP configuration for ports {'c116158e-f1e4-44c0-bd21-ec8f4d275597'} is completed
Feb 20 09:55:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:01.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:01.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.827 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.828 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.829 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:55:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:02.829 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:55:02 np0005625204.localdomain ceph-mon[301857]: pgmap v252: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 7.9 KiB/s wr, 32 op/s
Feb 20 09:55:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/674329819' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2547929420' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:55:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2547929420' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:55:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2447234794' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:03.810 2 INFO neutron.agent.securitygroups_rpc [None req-06243d38-8e6a-45a5-a465-51e5b9c1322f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:03.922 2 INFO neutron.agent.securitygroups_rpc [None req-befc33c8-098a-487a-a3bd-757b9f93e81d 3ace3fc0d46241ffa2d6d0b16953a588 8aa5b5a34cfe458d96fea87261361db1 - - default default] Security group member updated ['f947e5dd-708d-45fc-8f3d-3e71e4aec5b2']
Feb 20 09:55:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:03.948 2 INFO neutron.agent.securitygroups_rpc [None req-d0194b4e-ea8b-4209-af3b-a875851573ce 51a4789e7d0b404b9882e0c26f7229be 1c44e13adebb4610b7c0cd2fdc62a5b7 - - default default] Security group member updated ['000c42d1-648a-4f56-b7e6-024a1e270fb9']
Feb 20 09:55:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:04.139 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:55:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:04.159 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:55:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:04.160 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:55:04 np0005625204.localdomain dnsmasq[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/addn_hosts - 0 addresses
Feb 20 09:55:04 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/host
Feb 20 09:55:04 np0005625204.localdomain podman[313737]: 2026-02-20 09:55:04.166274117 +0000 UTC m=+0.044724068 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:55:04 np0005625204.localdomain dnsmasq-dhcp[313641]: read /var/lib/neutron/dhcp/2c7c971d-607d-4f86-ac60-49a788864bee/opts
Feb 20 09:55:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:04.274 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:04Z|00221|binding|INFO|Releasing lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe from this chassis (sb_readonly=0)
Feb 20 09:55:04 np0005625204.localdomain kernel: device tapea1cc9c8-ce left promiscuous mode
Feb 20 09:55:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:04Z|00222|binding|INFO|Setting lport ea1cc9c8-cec9-46f8-a9aa-5542762442fe down in Southbound
Feb 20 09:55:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:04.344 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:04.358 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2c7c971d-607d-4f86-ac60-49a788864bee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fa788937-4ffe-4167-9bda-a66bb7ab07d7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ea1cc9c8-cec9-46f8-a9aa-5542762442fe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:04.360 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ea1cc9c8-cec9-46f8-a9aa-5542762442fe in datapath 2c7c971d-607d-4f86-ac60-49a788864bee unbound from our chassis
Feb 20 09:55:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:04.361 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:04.363 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 2c7c971d-607d-4f86-ac60-49a788864bee or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:04.365 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4bffdb3f-bb7e-452f-b656-61e6d36b032c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:04 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:04.956 2 INFO neutron.agent.securitygroups_rpc [None req-a41b9e5e-24c7-4e83-9a8a-33aa24dfcce3 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:05 np0005625204.localdomain ceph-mon[301857]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 7.7 KiB/s wr, 31 op/s
Feb 20 09:55:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e142 e142: 6 total, 6 up, 6 in
Feb 20 09:55:05 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:05.224 264355 INFO neutron.agent.linux.ip_lib [None req-b1a077cd-edae-41e2-98f3-4f0daeac1dbc - - - - - -] Device tap482793f1-f2 cannot be used as it has no MAC address
Feb 20 09:55:05 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:05.257 2 INFO neutron.agent.securitygroups_rpc [None req-563cc02b-9571-4400-9563-57af3068006b f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:05.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:05 np0005625204.localdomain kernel: device tap482793f1-f2 entered promiscuous mode
Feb 20 09:55:05 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581305.2977] manager: (tap482793f1-f2): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Feb 20 09:55:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:05.299 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:05 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:05Z|00223|binding|INFO|Claiming lport 482793f1-f265-4388-a4f1-7b2c24064b54 for this chassis.
Feb 20 09:55:05 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:05Z|00224|binding|INFO|482793f1-f265-4388-a4f1-7b2c24064b54: Claiming unknown
Feb 20 09:55:05 np0005625204.localdomain systemd-udevd[313767]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:05.318 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe23:ae62/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e02367d-9b8e-4ff3-90ec-1850f1ce8ba8, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=482793f1-f265-4388-a4f1-7b2c24064b54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:05.322 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 482793f1-f265-4388-a4f1-7b2c24064b54 in datapath eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 bound to our chassis
Feb 20 09:55:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:05.323 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:05.324 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[11e7c59c-d180-4775-8b8a-f796b2268191]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:55:05 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:05Z|00225|binding|INFO|Setting lport 482793f1-f265-4388-a4f1-7b2c24064b54 ovn-installed in OVS
Feb 20 09:55:05 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:05Z|00226|binding|INFO|Setting lport 482793f1-f265-4388-a4f1-7b2c24064b54 up in Southbound
Feb 20 09:55:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:05.339 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap482793f1-f2: No such device
Feb 20 09:55:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:05.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:05.432 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:55:05 np0005625204.localdomain systemd[1]: tmp-crun.saTJD0.mount: Deactivated successfully.
Feb 20 09:55:05 np0005625204.localdomain podman[313773]: 2026-02-20 09:55:05.462735821 +0000 UTC m=+0.111686259 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:55:05 np0005625204.localdomain podman[313773]: 2026-02-20 09:55:05.495273719 +0000 UTC m=+0.144224157 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:55:05 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:55:05 np0005625204.localdomain podman[313810]: 2026-02-20 09:55:05.563245771 +0000 UTC m=+0.097997484 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible)
Feb 20 09:55:05 np0005625204.localdomain podman[313810]: 2026-02-20 09:55:05.596217751 +0000 UTC m=+0.130969444 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:55:05 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.017 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.018 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.018 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:55:06 np0005625204.localdomain ceph-mon[301857]: osdmap e142: 6 total, 6 up, 6 in
Feb 20 09:55:06 np0005625204.localdomain podman[313879]: 
Feb 20 09:55:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e143 e143: 6 total, 6 up, 6 in
Feb 20 09:55:06 np0005625204.localdomain podman[313879]: 2026-02-20 09:55:06.192068228 +0000 UTC m=+0.089751574 container create 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:55:06 np0005625204.localdomain systemd[1]: Started libpod-conmon-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd.scope.
Feb 20 09:55:06 np0005625204.localdomain podman[313879]: 2026-02-20 09:55:06.14759874 +0000 UTC m=+0.045282106 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:06 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:06 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c74a41ff9564be3fb088581551a25ec5006a1f6afdd9eb855006f6c53fa651d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:06 np0005625204.localdomain podman[313879]: 2026-02-20 09:55:06.279852092 +0000 UTC m=+0.177535438 container init 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:06 np0005625204.localdomain podman[313879]: 2026-02-20 09:55:06.288715051 +0000 UTC m=+0.186398387 container start 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 20 09:55:06 np0005625204.localdomain dnsmasq[313897]: started, version 2.85 cachesize 150
Feb 20 09:55:06 np0005625204.localdomain dnsmasq[313897]: DNS service limited to local subnets
Feb 20 09:55:06 np0005625204.localdomain dnsmasq[313897]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:06 np0005625204.localdomain dnsmasq[313897]: warning: no upstream servers configured
Feb 20 09:55:06 np0005625204.localdomain dnsmasq[313897]: read /var/lib/neutron/dhcp/eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8/addn_hosts - 0 addresses
Feb 20 09:55:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:06 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:06.487 264355 INFO neutron.agent.dhcp.agent [None req-4133f662-87a7-42b5-b620-db3d2d9639fd - - - - - -] DHCP configuration for ports {'424d9e79-b09e-4419-bfce-356fcae07ea1'} is completed
Feb 20 09:55:06 np0005625204.localdomain dnsmasq[313897]: exiting on receipt of SIGTERM
Feb 20 09:55:06 np0005625204.localdomain podman[313916]: 2026-02-20 09:55:06.705722153 +0000 UTC m=+0.048789092 container kill 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:06 np0005625204.localdomain systemd[1]: libpod-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd.scope: Deactivated successfully.
Feb 20 09:55:06 np0005625204.localdomain podman[313928]: 2026-02-20 09:55:06.77617383 +0000 UTC m=+0.055592108 container died 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:55:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:06 np0005625204.localdomain podman[313928]: 2026-02-20 09:55:06.807666985 +0000 UTC m=+0.087085223 container cleanup 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:55:06 np0005625204.localdomain systemd[1]: libpod-conmon-5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd.scope: Deactivated successfully.
Feb 20 09:55:06 np0005625204.localdomain podman[313930]: 2026-02-20 09:55:06.84408791 +0000 UTC m=+0.118142775 container remove 5efdc811ba5983a7fd2c70a5982fbcaa524c7ab89bc2c9287e6e29398f82c8dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:55:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:06Z|00227|binding|INFO|Releasing lport 482793f1-f265-4388-a4f1-7b2c24064b54 from this chassis (sb_readonly=0)
Feb 20 09:55:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:06.854 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:06Z|00228|binding|INFO|Setting lport 482793f1-f265-4388-a4f1-7b2c24064b54 down in Southbound
Feb 20 09:55:06 np0005625204.localdomain kernel: device tap482793f1-f2 left promiscuous mode
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.871 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe23:ae62/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2e02367d-9b8e-4ff3-90ec-1850f1ce8ba8, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=482793f1-f265-4388-a4f1-7b2c24064b54) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.873 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 482793f1-f265-4388-a4f1-7b2c24064b54 in datapath eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 unbound from our chassis
Feb 20 09:55:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:06.873 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.876 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network eddc0ec2-ac95-44b0-8ba7-88ad136c7ae8 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:06.877 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[671b8dd9-c476-494d-8bf4-ad289185954a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:06 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:06.910 264355 INFO neutron.agent.dhcp.agent [None req-d302ebcb-4aeb-4719-9eb5-b9b9cc423d70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:06 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:06.911 264355 INFO neutron.agent.dhcp.agent [None req-d302ebcb-4aeb-4719-9eb5-b9b9cc423d70 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:07 np0005625204.localdomain dnsmasq[313641]: exiting on receipt of SIGTERM
Feb 20 09:55:07 np0005625204.localdomain podman[313972]: 2026-02-20 09:55:07.099100167 +0000 UTC m=+0.041330385 container kill b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: libpod-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a.scope: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain podman[313984]: 2026-02-20 09:55:07.15821185 +0000 UTC m=+0.050647477 container died b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:55:07 np0005625204.localdomain podman[313984]: 2026-02-20 09:55:07.180506907 +0000 UTC m=+0.072942494 container cleanup b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: libpod-conmon-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a.scope: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain ceph-mon[301857]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 4.5 KiB/s rd, 7.5 KiB/s wr, 10 op/s
Feb 20 09:55:07 np0005625204.localdomain ceph-mon[301857]: osdmap e143: 6 total, 6 up, 6 in
Feb 20 09:55:07 np0005625204.localdomain podman[313991]: 2026-02-20 09:55:07.23368435 +0000 UTC m=+0.110935466 container remove b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2c7c971d-607d-4f86-ac60-49a788864bee, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:55:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:07Z|00229|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:55:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:07.422 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-c74a41ff9564be3fb088581551a25ec5006a1f6afdd9eb855006f6c53fa651d7-merged.mount: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2deddc0ec2\x2dac95\x2d44b0\x2d8ba7\x2d88ad136c7ae8.mount: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-1375106b10e9c61e0995150f814346f15b740b6f2eaa5b60963053b920ba7d52-merged.mount: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0d36b16c2b2cd5e3d7da662ca4e1c33060c8e54ef2962d44a0bd01132adf53a-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:07.462 2 INFO neutron.agent.securitygroups_rpc [None req-63a5189c-53e8-40d9-9654-04d02c3d7a9f 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:07.565 264355 INFO neutron.agent.dhcp.agent [None req-d60a952b-8db8-40d6-beec-996cc9523d86 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:07 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d2c7c971d\x2d607d\x2d4f86\x2dac60\x2d49a788864bee.mount: Deactivated successfully.
Feb 20 09:55:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:07.650 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:08.153 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e144 e144: 6 total, 6 up, 6 in
Feb 20 09:55:08 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:08.623 2 INFO neutron.agent.securitygroups_rpc [None req-08efe737-22a8-48a9-b7b8-e1929d8369c8 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:08 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:08.950 2 INFO neutron.agent.securitygroups_rpc [None req-3b1327f4-a56e-425d-b764-df7dfb03463f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:09 np0005625204.localdomain ceph-mon[301857]: pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 4.2 KiB/s rd, 3.2 KiB/s wr, 8 op/s
Feb 20 09:55:09 np0005625204.localdomain ceph-mon[301857]: osdmap e144: 6 total, 6 up, 6 in
Feb 20 09:55:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e145 e145: 6 total, 6 up, 6 in
Feb 20 09:55:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:09.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:10 np0005625204.localdomain ceph-mon[301857]: osdmap e145: 6 total, 6 up, 6 in
Feb 20 09:55:10 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:10.372 2 INFO neutron.agent.securitygroups_rpc [None req-9a8b2a58-7fed-4400-9808-58a86984ca8a 3ace3fc0d46241ffa2d6d0b16953a588 8aa5b5a34cfe458d96fea87261361db1 - - default default] Security group member updated ['f947e5dd-708d-45fc-8f3d-3e71e4aec5b2']
Feb 20 09:55:10 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:10.910 264355 INFO neutron.agent.linux.ip_lib [None req-449e4550-801e-4cb5-8a33-c704d023f847 - - - - - -] Device tap43ee6e73-b5 cannot be used as it has no MAC address
Feb 20 09:55:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:10.981 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:10 np0005625204.localdomain kernel: device tap43ee6e73-b5 entered promiscuous mode
Feb 20 09:55:10 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581310.9897] manager: (tap43ee6e73-b5): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Feb 20 09:55:10 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:10Z|00230|binding|INFO|Claiming lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 for this chassis.
Feb 20 09:55:10 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:10Z|00231|binding|INFO|43ee6e73-b59d-4f8c-bc0d-e76444d22449: Claiming unknown
Feb 20 09:55:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:10.991 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:10 np0005625204.localdomain systemd-udevd[314024]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.000 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2478f085-2c0d-4bdc-b22a-1c88d748beb7, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=43ee6e73-b59d-4f8c-bc0d-e76444d22449) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.003 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 43ee6e73-b59d-4f8c-bc0d-e76444d22449 in datapath 46a8490e-a535-4bce-9ac1-74e63b0f238d bound to our chassis
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.005 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46a8490e-a535-4bce-9ac1-74e63b0f238d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.008 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[909dd26e-35e6-4786-8971-33699d73ebb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00232|binding|INFO|Setting lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 ovn-installed in OVS
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00233|binding|INFO|Setting lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 up in Southbound
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.044 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap43ee6e73-b5: No such device
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.090 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.129 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain ceph-mon[301857]: pgmap v260: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 3.2 KiB/s rd, 236 B/s wr, 7 op/s
Feb 20 09:55:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:11.783 2 INFO neutron.agent.securitygroups_rpc [None req-9a050612-ac54-4498-bab7-ac1559fb18bf 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.824 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c3441dcb-93b0-4351-aba2-94a1914f5ed5 with type ""
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00234|binding|INFO|Removing iface tap43ee6e73-b5 ovn-installed in OVS
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.826 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-46a8490e-a535-4bce-9ac1-74e63b0f238d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '54500b20d6a643669fbf357242dde27f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2478f085-2c0d-4bdc-b22a-1c88d748beb7, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=43ee6e73-b59d-4f8c-bc0d-e76444d22449) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.828 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 43ee6e73-b59d-4f8c-bc0d-e76444d22449 in datapath 46a8490e-a535-4bce-9ac1-74e63b0f238d unbound from our chassis
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.830 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 46a8490e-a535-4bce-9ac1-74e63b0f238d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.833 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7641f8-7de4-4731-bcba-bca580f620db]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00235|binding|INFO|Removing lport 43ee6e73-b59d-4f8c-bc0d-e76444d22449 ovn-installed in OVS
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.835 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:11.904 2 INFO neutron.agent.securitygroups_rpc [None req-36106926-50cd-429d-aa41-30ad5718ad39 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:11 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:11.928 264355 INFO neutron.agent.linux.ip_lib [None req-29401c17-9f88-48b5-9e1b-c3ed33377fb4 - - - - - -] Device tap4771034d-1a cannot be used as it has no MAC address
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.943 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain kernel: device tap4771034d-1a entered promiscuous mode
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.949 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581311.9493] manager: (tap4771034d-1a): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00236|binding|INFO|Claiming lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 for this chassis.
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.959 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00237|binding|INFO|4771034d-1a14-45f4-a39e-d4b7e4f8e0b4: Claiming unknown
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.969 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ccf68c5-a7cb-4cfb-a4c3-a082862eb086, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=4771034d-1a14-45f4-a39e-d4b7e4f8e0b4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.970 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 in datapath db986821-358f-44d6-9e8f-8928c31d10ae bound to our chassis
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.971 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network db986821-358f-44d6-9e8f-8928c31d10ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:11.973 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[2df336b2-e543-46e7-a989-16889df2f413]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00238|binding|INFO|Setting lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 ovn-installed in OVS
Feb 20 09:55:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:11Z|00239|binding|INFO|Setting lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 up in Southbound
Feb 20 09:55:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:11.995 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:12 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4771034d-1a: No such device
Feb 20 09:55:12 np0005625204.localdomain podman[314102]: 
Feb 20 09:55:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:12.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:12 np0005625204.localdomain podman[314102]: 2026-02-20 09:55:12.022417847 +0000 UTC m=+0.086912658 container create 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 09:55:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:12.033 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: Started libpod-conmon-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1.scope.
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:12 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0338ae8987ed2c464584a36ca2ce03dbc2fc06cedc3471135948acdfc2656ace/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:12 np0005625204.localdomain podman[314102]: 2026-02-20 09:55:12.083003684 +0000 UTC m=+0.147498455 container init 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:12 np0005625204.localdomain podman[314102]: 2026-02-20 09:55:11.985107784 +0000 UTC m=+0.049602575 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:12 np0005625204.localdomain podman[314102]: 2026-02-20 09:55:12.088484811 +0000 UTC m=+0.152979622 container start 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314148]: started, version 2.85 cachesize 150
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314148]: DNS service limited to local subnets
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314148]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314148]: warning: no upstream servers configured
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314148]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/addn_hosts - 0 addresses
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/host
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/opts
Feb 20 09:55:12 np0005625204.localdomain sshd[314149]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:55:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:12.148 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:12 np0005625204.localdomain kernel: device tap43ee6e73-b5 left promiscuous mode
Feb 20 09:55:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:12.164 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.265 264355 INFO neutron.agent.dhcp.agent [None req-7f2d0925-b7d2-4002-908e-9cc18f9ae448 - - - - - -] DHCP configuration for ports {'bdecb4bc-5e0d-4a76-954d-639d3a7f9a03'} is completed
Feb 20 09:55:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:12.340 2 INFO neutron.agent.securitygroups_rpc [None req-9a050612-ac54-4498-bab7-ac1559fb18bf 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e146 e146: 6 total, 6 up, 6 in
Feb 20 09:55:12 np0005625204.localdomain sshd[314149]: Invalid user jane from 57.128.218.144 port 40516
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: tmp-crun.6JUUyE.mount: Deactivated successfully.
Feb 20 09:55:12 np0005625204.localdomain podman[314211]: 
Feb 20 09:55:12 np0005625204.localdomain podman[314211]: 2026-02-20 09:55:12.857859083 +0000 UTC m=+0.111007609 container create c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: Started libpod-conmon-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a.scope.
Feb 20 09:55:12 np0005625204.localdomain sshd[314149]: Received disconnect from 57.128.218.144 port 40516:11: Bye Bye [preauth]
Feb 20 09:55:12 np0005625204.localdomain podman[314211]: 2026-02-20 09:55:12.809733463 +0000 UTC m=+0.062882049 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:12 np0005625204.localdomain podman[314222]: 2026-02-20 09:55:12.911201752 +0000 UTC m=+0.091988232 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:55:12 np0005625204.localdomain sshd[314149]: Disconnected from invalid user jane 57.128.218.144 port 40516 [preauth]
Feb 20 09:55:12 np0005625204.localdomain podman[314226]: 2026-02-20 09:55:12.933034604 +0000 UTC m=+0.107657638 container kill 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/addn_hosts - 0 addresses
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/host
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314148]: read /var/lib/neutron/dhcp/46a8490e-a535-4bce-9ac1-74e63b0f238d/opts
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent [-] Unable to reload_allocations dhcp for 46a8490e-a535-4bce-9ac1-74e63b0f238d.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap43ee6e73-b5 not found in namespace qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d.
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 20 09:55:12 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24261156bad49a8305fefc4579d7635ae2afa5b6e97772dd5140da521b7fab16/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap43ee6e73-b5 not found in namespace qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d.
Feb 20 09:55:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:12.956 264355 ERROR neutron.agent.dhcp.agent 
Feb 20 09:55:12 np0005625204.localdomain podman[314222]: 2026-02-20 09:55:12.96027029 +0000 UTC m=+0.141056710 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:12 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:55:12 np0005625204.localdomain podman[314211]: 2026-02-20 09:55:12.976833362 +0000 UTC m=+0.229981938 container init c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:55:12 np0005625204.localdomain podman[314211]: 2026-02-20 09:55:12.982661679 +0000 UTC m=+0.235810235 container start c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314263]: started, version 2.85 cachesize 150
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314263]: DNS service limited to local subnets
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314263]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314263]: warning: no upstream servers configured
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314263]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:55:12 np0005625204.localdomain dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 0 addresses
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host
Feb 20 09:55:12 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.027 264355 INFO neutron.agent.dhcp.agent [None req-1a2e3763-4058-4706-bf05-99d8b392f202 - - - - - -] Synchronizing state
Feb 20 09:55:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:13.065 2 INFO neutron.agent.securitygroups_rpc [None req-5b112f63-69ca-4cb8-af5b-6c4c1770d4b6 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.121 264355 INFO neutron.agent.dhcp.agent [None req-68834c23-3df4-4658-b0ad-65ce3e176213 - - - - - -] DHCP configuration for ports {'cd05ab2e-f393-45d1-8131-90841abef68e'} is completed
Feb 20 09:55:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:13.280 2 INFO neutron.agent.securitygroups_rpc [None req-624534fd-50bc-4fd0-a351-6a4578734382 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:13 np0005625204.localdomain ceph-mon[301857]: pgmap v261: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Feb 20 09:55:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3509655015' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:55:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3509655015' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:55:13 np0005625204.localdomain ceph-mon[301857]: osdmap e146: 6 total, 6 up, 6 in
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.315 264355 INFO neutron.agent.dhcp.agent [None req-c2bf4659-23a7-4d94-a36c-8a2e456b8aa4 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.316 264355 INFO neutron.agent.dhcp.agent [-] Starting network 46a8490e-a535-4bce-9ac1-74e63b0f238d dhcp configuration
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.316 264355 INFO neutron.agent.dhcp.agent [-] Finished network 46a8490e-a535-4bce-9ac1-74e63b0f238d dhcp configuration
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.317 264355 INFO neutron.agent.dhcp.agent [None req-c2bf4659-23a7-4d94-a36c-8a2e456b8aa4 - - - - - -] Synchronizing state complete
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.317 264355 INFO neutron.agent.dhcp.agent [None req-29401c17-9f88-48b5-9e1b-c3ed33377fb4 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:12Z, description=, device_id=f26d36e8-7d9e-47ed-9784-1de0d2ce9ae4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a85100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a852e0>], id=dac9892a-d1b0-4086-9174-145617b0e7c9, ip_allocation=immediate, mac_address=fa:16:3e:71:14:9a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:08Z, description=, dns_domain=, id=db986821-358f-44d6-9e8f-8928c31d10ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-904494756, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12149, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1973, status=ACTIVE, subnets=['92327ff9-f3f4-4fb2-9234-a9395798504e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:10Z, vlan_transparent=None, network_id=db986821-358f-44d6-9e8f-8928c31d10ae, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1988, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:12Z on network db986821-358f-44d6-9e8f-8928c31d10ae
Feb 20 09:55:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:13Z|00240|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:55:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:13.528 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:13 np0005625204.localdomain dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 1 addresses
Feb 20 09:55:13 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host
Feb 20 09:55:13 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts
Feb 20 09:55:13 np0005625204.localdomain podman[314290]: 2026-02-20 09:55:13.565274035 +0000 UTC m=+0.103667326 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:55:13 np0005625204.localdomain dnsmasq[314148]: exiting on receipt of SIGTERM
Feb 20 09:55:13 np0005625204.localdomain podman[314312]: 2026-02-20 09:55:13.692848736 +0000 UTC m=+0.059671761 container kill 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:55:13 np0005625204.localdomain systemd[1]: libpod-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1.scope: Deactivated successfully.
Feb 20 09:55:13 np0005625204.localdomain podman[314330]: 2026-02-20 09:55:13.768411528 +0000 UTC m=+0.059644420 container died 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 20 09:55:13 np0005625204.localdomain podman[314330]: 2026-02-20 09:55:13.799298186 +0000 UTC m=+0.090531038 container cleanup 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:55:13 np0005625204.localdomain systemd[1]: libpod-conmon-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1.scope: Deactivated successfully.
Feb 20 09:55:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-0338ae8987ed2c464584a36ca2ce03dbc2fc06cedc3471135948acdfc2656ace-merged.mount: Deactivated successfully.
Feb 20 09:55:13 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:13 np0005625204.localdomain podman[314333]: 2026-02-20 09:55:13.855973645 +0000 UTC m=+0.140988588 container remove 81aac61f034982a7769d88f5d390f36be570f2120cb27d66cb11249af60659d1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-46a8490e-a535-4bce-9ac1-74e63b0f238d, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.886 264355 INFO neutron.agent.dhcp.agent [None req-e1eb74c7-20a6-4ab1-b283-f34bddc1a110 - - - - - -] DHCP configuration for ports {'dac9892a-d1b0-4086-9174-145617b0e7c9'} is completed
Feb 20 09:55:13 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d46a8490e\x2da535\x2d4bce\x2d9ac1\x2d74e63b0f238d.mount: Deactivated successfully.
Feb 20 09:55:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:13.970 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:12Z, description=, device_id=f26d36e8-7d9e-47ed-9784-1de0d2ce9ae4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a20700>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59aa190>], id=dac9892a-d1b0-4086-9174-145617b0e7c9, ip_allocation=immediate, mac_address=fa:16:3e:71:14:9a, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:08Z, description=, dns_domain=, id=db986821-358f-44d6-9e8f-8928c31d10ae, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-904494756, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=12149, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1973, status=ACTIVE, subnets=['92327ff9-f3f4-4fb2-9234-a9395798504e'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:10Z, vlan_transparent=None, network_id=db986821-358f-44d6-9e8f-8928c31d10ae, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1988, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:12Z on network db986821-358f-44d6-9e8f-8928c31d10ae
Feb 20 09:55:14 np0005625204.localdomain podman[314375]: 2026-02-20 09:55:14.161616338 +0000 UTC m=+0.060011382 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:55:14 np0005625204.localdomain dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 1 addresses
Feb 20 09:55:14 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host
Feb 20 09:55:14 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts
Feb 20 09:55:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:14.280 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:14 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:14.480 264355 INFO neutron.agent.dhcp.agent [None req-92ba39a7-7cc4-4605-856e-f6754aa06c8e - - - - - -] DHCP configuration for ports {'dac9892a-d1b0-4086-9174-145617b0e7c9'} is completed
Feb 20 09:55:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:14.506 2 INFO neutron.agent.securitygroups_rpc [None req-cfd67c24-e1f2-4ca4-a53e-99c158a0738a 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:14 np0005625204.localdomain podman[314414]: 2026-02-20 09:55:14.790458127 +0000 UTC m=+0.062737175 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:14 np0005625204.localdomain dnsmasq[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/addn_hosts - 0 addresses
Feb 20 09:55:14 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/host
Feb 20 09:55:14 np0005625204.localdomain dnsmasq-dhcp[314263]: read /var/lib/neutron/dhcp/db986821-358f-44d6-9e8f-8928c31d10ae/opts
Feb 20 09:55:14 np0005625204.localdomain kernel: device tap4771034d-1a left promiscuous mode
Feb 20 09:55:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:14.996 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:14Z|00241|binding|INFO|Releasing lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 from this chassis (sb_readonly=0)
Feb 20 09:55:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:14Z|00242|binding|INFO|Setting lport 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 down in Southbound
Feb 20 09:55:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:15.005 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db986821-358f-44d6-9e8f-8928c31d10ae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ccf68c5-a7cb-4cfb-a4c3-a082862eb086, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=4771034d-1a14-45f4-a39e-d4b7e4f8e0b4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:15.007 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4771034d-1a14-45f4-a39e-d4b7e4f8e0b4 in datapath db986821-358f-44d6-9e8f-8928c31d10ae unbound from our chassis
Feb 20 09:55:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:15.009 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network db986821-358f-44d6-9e8f-8928c31d10ae or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:15.010 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b04ce51c-392a-4be2-9233-53aeb8eec8a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:15.023 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:15.025 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:15 np0005625204.localdomain ceph-mon[301857]: pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 57 op/s
Feb 20 09:55:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:16.010 2 INFO neutron.agent.securitygroups_rpc [None req-7271509a-2575-4000-bc96-a2fd7601216d f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e147 e147: 6 total, 6 up, 6 in
Feb 20 09:55:17 np0005625204.localdomain podman[314454]: 2026-02-20 09:55:17.082125773 +0000 UTC m=+0.062979422 container kill c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:55:17 np0005625204.localdomain dnsmasq[314263]: exiting on receipt of SIGTERM
Feb 20 09:55:17 np0005625204.localdomain systemd[1]: libpod-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a.scope: Deactivated successfully.
Feb 20 09:55:17 np0005625204.localdomain podman[314469]: 2026-02-20 09:55:17.152681774 +0000 UTC m=+0.053185725 container died c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:55:17 np0005625204.localdomain systemd[1]: tmp-crun.P0oI4q.mount: Deactivated successfully.
Feb 20 09:55:17 np0005625204.localdomain podman[314469]: 2026-02-20 09:55:17.261931998 +0000 UTC m=+0.162435909 container cleanup c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:17 np0005625204.localdomain systemd[1]: libpod-conmon-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a.scope: Deactivated successfully.
Feb 20 09:55:17 np0005625204.localdomain podman[314468]: 2026-02-20 09:55:17.281148381 +0000 UTC m=+0.179266019 container remove c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db986821-358f-44d6-9e8f-8928c31d10ae, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 20 09:55:17 np0005625204.localdomain ceph-mon[301857]: pgmap v264: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 69 KiB/s rd, 4.0 KiB/s wr, 91 op/s
Feb 20 09:55:17 np0005625204.localdomain ceph-mon[301857]: osdmap e147: 6 total, 6 up, 6 in
Feb 20 09:55:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:55:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:55:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:55:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:55:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:55:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18339 "" "Go-http-client/1.1"
Feb 20 09:55:17 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:17.794 264355 INFO neutron.agent.dhcp.agent [None req-9a99823c-4a01-4153-b1a7-1f995f7ae87d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:17 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:17.803 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:17 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:17.977 2 INFO neutron.agent.securitygroups_rpc [None req-99b1b964-36a8-407d-a34f-bd7246c382f8 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:17 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:17.981 2 INFO neutron.agent.securitygroups_rpc [None req-9d512101-812d-472a-bd29-050847053b0a f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-24261156bad49a8305fefc4579d7635ae2afa5b6e97772dd5140da521b7fab16-merged.mount: Deactivated successfully.
Feb 20 09:55:18 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c38ff7c27918fed188f979509640974869c477eb0859709fe294aba56526521a-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:18 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2ddb986821\x2d358f\x2d44d6\x2d9e8f\x2d8928c31d10ae.mount: Deactivated successfully.
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.209 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.216 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0308ddb8-8650-4867-bc8a-8d85a653999e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.211079', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4322aa3e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '63cf382f663fa482fad6af017b88d9ee6459b2237bd74e76afe3216cabd06b68'}]}, 'timestamp': '2026-02-20 09:55:18.217462', '_unique_id': '7937c235e65d4e6a8236af6252fa6e01'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.218 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.220 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.249 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.250 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44163411-ae98-424f-998f-0e527e369af1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.220258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4327b074-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '62b06f7a25b900e95b3a6bdfb5b0558a8f51b45690ac0b0769615b803d7939b6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.220258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4327c21c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'ccbcd25ea41cd0844b3434cf3f236fba7462ed1068a826746d999243eac5defa'}]}, 'timestamp': '2026-02-20 09:55:18.250829', '_unique_id': '4365349e2d124a289570e4163eab5f10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.251 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.253 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.253 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8e93a2-e0b1-454f-9d9c-9a709257b9ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.253255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43283486-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '9413e6ea9816c541620301ece107fc5adb0ee1c89ca4d35b0fb11d3f18b1cb30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.253255', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '432848b8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '3497e4f40c0e2efece87ea6b3d4343e0da1dbf15252203b57fe9a6d5ffde5242'}]}, 'timestamp': '2026-02-20 09:55:18.254204', '_unique_id': 'f35b3f2a900b417ca8dff3ee39eabd44'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.255 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.256 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.267 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb70fa9b-a88a-4853-968f-56087c9c039d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.256392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '432a5f18-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': 'c75c356c2436840165ca0af7d778ba1e00768ae4e16de6899fc8ffc44fd6ee59'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.256392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '432a6fe4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': '9977d24a0f71dedc1f83b5eabc49a8c809323fcaeac35c4efa43ec55456fefe8'}]}, 'timestamp': '2026-02-20 09:55:18.268311', '_unique_id': 'a4d7fddbe39d498995026a3a78be989e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain systemd-journald[48359]: Data hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 20 09:55:18 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.269 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.270 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.270 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98406af6-c826-40eb-b148-76340e527aa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.270509', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432ad7f4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '411becbbefcbc37233a0ae4decd56bc696779d886367eca4bc6f6efa0b72fb70'}]}, 'timestamp': '2026-02-20 09:55:18.271005', '_unique_id': 'b93e308a056a4d95b07b03e92b101d06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.271 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.272 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.273 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.273 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd111b78d-348b-4a7b-9f19-1a3c09e9ede3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.273221', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432b402c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '2b4adb65eb3df8022083599668183c3d1ea908bba1deb464abba07bdcf2f82b2'}]}, 'timestamp': '2026-02-20 09:55:18.273702', '_unique_id': '0ffeeffc188f4f43bcd255800f7bdfdf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.274 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.275 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.275 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.276 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef335e9a-cfda-48f7-b014-ddd0a841cd00', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.276011', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432bad46-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'd8375db9499f9b5254a5f8432da70f0f9abe72c3c824139b79b7a881030230ab'}]}, 'timestamp': '2026-02-20 09:55:18.276463', '_unique_id': 'a65f60537c4b42789e5bac8bf44789cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.277 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.278 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.278 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.279 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04967746-a53c-4fe3-8cad-6e59a9f4f6e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.278997', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432c2208-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '97e86b9220f5da1c3f5d018fe7ef98efce66ec86b833e89df17b4e419a417002'}]}, 'timestamp': '2026-02-20 09:55:18.279453', '_unique_id': 'dde045211b804f33a2aab4f7e523a295'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.280 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.281 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.281 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4b3f8c6-eb09-44e5-beea-840b5dbc672a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.281522', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432c86ee-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'd2d03f4bb5dad94a0aff4a6f5763a032190ebdf571302846a2a57d561bc05d34'}]}, 'timestamp': '2026-02-20 09:55:18.282034', '_unique_id': '60f2e83d851f4fe6b1b8a7d0f128677b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.282 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.283 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9af111ee-ec6e-4cc4-879f-f2d5ab0148c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:55:18.284072', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '432f9276-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.540533254, 'message_signature': '0062a99a0209afba08732627cd7cb65f8fcd15add4a890c90e0fc4bdaf4a40b7'}]}, 'timestamp': '2026-02-20 09:55:18.301986', '_unique_id': '061b1602d7944239acc3d315f3cb2ecc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.302 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cbb53467-3376-4df7-ac26-e158451d8975', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.304246', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '432ffc98-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'ae85aacafc0663775f1edcd41e01318ffca6c598536af7d30295a24451e20ea1'}]}, 'timestamp': '2026-02-20 09:55:18.304747', '_unique_id': '51da9bf793634c3c8f2b7f79f198695e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.305 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.306 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1afdc59a-ea37-499c-95a3-0e8121a0ec40', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.306963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '433066d8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': '4629f42c1ad43a1dc0cdf3a7bbbc95c77378a444ed7356e340204f7a2022bb51'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.306963', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43307768-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': 'bf4232903a3d996515942c143259a53122e7caf998ff0b48f263733955ffbf50'}]}, 'timestamp': '2026-02-20 09:55:18.307872', '_unique_id': 'eff9f0b618664d44bc3d3e9a789d5f14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.308 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.309 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.310 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.310 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '736651c6-a8ad-4877-bff7-ac58406d5a72', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.310121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4330e20c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '4122abc75fb9d2bf08e26f33af00d9bf6a0b0020e28eaba6200165e561559e77'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.310121', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4330f4ae-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '72ba8afe4fb3e68949cb506385511bebbf099593903c4df189618a6ab62eac6a'}]}, 'timestamp': '2026-02-20 09:55:18.311056', '_unique_id': '0f82dd8f55f74422aeba655f169d4be9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'caf433e0-32e5-4bab-909e-4bfc15d44c88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.313215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43315a5c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'c1600fc6d8abcb3061beb07e4ddb111ea428fe6c71654936c64b2b5e63f2aefc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.313215', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '43316c4a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '8e29aa5499498053c608e5b3492fbf3b18bdc39cb30fe5ce8df15b66cffeb971'}]}, 'timestamp': '2026-02-20 09:55:18.314091', '_unique_id': '2acbee8351bd4eb3a2af110c97f1c2d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.314 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.316 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b46f38c2-e419-4619-b2cd-10c2f5dde1cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.316180', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4331ce56-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '54bd21f34d2766aa42ebe533602971bfe04c96f10e98a3e8b401075a4650df4b'}]}, 'timestamp': '2026-02-20 09:55:18.316629', '_unique_id': 'e409c98ab6a84ebdb694b181e8056b74'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.317 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.318 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.318 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '009aeb26-c6cf-4f0f-95e4-29bf67b46048', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.318740', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '4332326a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': '5ffc244c730b7033abdba2b8406703f1aa608c65ec5269b5f2af1c03bd53bd40'}]}, 'timestamp': '2026-02-20 09:55:18.319247', '_unique_id': '117fa2c6214f49749396800a58e87c07'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '693ff355-b5c5-435c-aa89-512c3e16b5e3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.321335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '43329782-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'da614cccb741f8e53f2c3ec5ca24a7a1c1c6211f7084557534ab155edceb4b29'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.321335', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4332ab00-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': 'cc6876b3bd6d2295d84d0b40c04a7e5501383d17185b3bd1a4d579a56cfa83ed'}]}, 'timestamp': '2026-02-20 09:55:18.322252', '_unique_id': '150e5deea8224e0bb80ca24d3327d1b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.323 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.324 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69e7eab3-8d1a-4c17-af5e-aaeea9e3e0c0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:55:18.324321', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '43330c62-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.450278676, 'message_signature': 'e5b25c0abba0cea62f63610e1c1b7d4090ad30f9e2662102fc964bbf2d23b0e9'}]}, 'timestamp': '2026-02-20 09:55:18.324878', '_unique_id': '0bb8a038482341918b9e6b40dff29427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '04732ff0-c629-4fc2-9c0c-fa1138e6e4cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.326797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '433369b4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '335db5c59eb2fbbdba444b71465c673085f44b3d2b1861fa69b5034b809a58e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.326797', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '433373d2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.459457824, 'message_signature': '5d07d7a4fda264a1159146392832a34356c4216864c19f52cb21c22e711e0508'}]}, 'timestamp': '2026-02-20 09:55:18.327315', '_unique_id': '09ceea6dcc0a4ca2ae4d96e08ee497aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.327 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.328 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.328 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 17620000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecfde883-9c84-4eff-a169-1f355ae8f96e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17620000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:55:18.328599', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '4333b194-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.540533254, 'message_signature': 'b9978cf158c01db6a69960fffb4bd53a511a36252447e19747e0037f55df9c7b'}]}, 'timestamp': '2026-02-20 09:55:18.328904', '_unique_id': '5d51c6af30c74ad5a142596c02282e34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.329 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb76359a-0057-4f43-b30e-f35af43e7983', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:55:18.330209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '4333ef06-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': 'af8167d715ad692102645db062ec8b445d65ebe912d60a13d15d3686a9866ee6'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:55:18.330209', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '4333f898-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11877.49559636, 'message_signature': '99fb62d7a2fc51c92ee5dfba3c70eaf9d91c706eea52bcd4e14fead690219d9d'}]}, 'timestamp': '2026-02-20 09:55:18.330735', '_unique_id': '1331dc289bed47e49872781d181b2646'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:55:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:55:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:55:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e148 e148: 6 total, 6 up, 6 in
Feb 20 09:55:18 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:18.371 2 INFO neutron.agent.securitygroups_rpc [None req-6afebda2-88bb-41f1-8c70-a0608f1757d1 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:18 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:18.396 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:18 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 09:55:18 np0005625204.localdomain ceph-mon[301857]: pgmap v266: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 3.5 KiB/s wr, 79 op/s
Feb 20 09:55:18 np0005625204.localdomain ceph-mon[301857]: osdmap e148: 6 total, 6 up, 6 in
Feb 20 09:55:19 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:19.207 264355 INFO neutron.agent.linux.ip_lib [None req-3e6675b4-0a31-4041-8d9b-35d52c04f656 - - - - - -] Device tapad165bef-73 cannot be used as it has no MAC address
Feb 20 09:55:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:19Z|00243|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.276 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.285 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain kernel: device tapad165bef-73 entered promiscuous mode
Feb 20 09:55:19 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581319.2935] manager: (tapad165bef-73): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Feb 20 09:55:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:19Z|00244|binding|INFO|Claiming lport ad165bef-73b3-4f1e-864a-c401cd5b89ed for this chassis.
Feb 20 09:55:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:19Z|00245|binding|INFO|ad165bef-73b3-4f1e-864a-c401cd5b89ed: Claiming unknown
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain systemd-udevd[314507]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:19.310 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081736e4-c4cd-4a80-b501-1dcc1e64a740, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ad165bef-73b3-4f1e-864a-c401cd5b89ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:19.312 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ad165bef-73b3-4f1e-864a-c401cd5b89ed in datapath 1569f18d-1cf2-4113-a1bd-2d35906eb20f bound to our chassis
Feb 20 09:55:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:19.314 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 1569f18d-1cf2-4113-a1bd-2d35906eb20f or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:19.315 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[2d7c09c8-edeb-4c8c-b079-d9ce8a0f0a88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.326 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:19Z|00246|binding|INFO|Setting lport ad165bef-73b3-4f1e-864a-c401cd5b89ed ovn-installed in OVS
Feb 20 09:55:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:19Z|00247|binding|INFO|Setting lport ad165bef-73b3-4f1e-864a-c401cd5b89ed up in Southbound
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.331 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.332 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.370 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:19.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:55:19 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1519354984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:55:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:55:19 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1519354984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:55:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1519354984' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:55:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1519354984' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:55:20 np0005625204.localdomain podman[314560]: 
Feb 20 09:55:20 np0005625204.localdomain podman[314560]: 2026-02-20 09:55:20.274800575 +0000 UTC m=+0.090602300 container create 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:55:20 np0005625204.localdomain systemd[1]: Started libpod-conmon-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d.scope.
Feb 20 09:55:20 np0005625204.localdomain podman[314560]: 2026-02-20 09:55:20.231867472 +0000 UTC m=+0.047669217 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:20 np0005625204.localdomain systemd[1]: tmp-crun.d7Xkyv.mount: Deactivated successfully.
Feb 20 09:55:20 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:20 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7724903b31542ba17b88488ef9f53de08d696e33d16bf9339b383229babbfc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:20 np0005625204.localdomain podman[314560]: 2026-02-20 09:55:20.372141468 +0000 UTC m=+0.187943193 container init 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 09:55:20 np0005625204.localdomain podman[314560]: 2026-02-20 09:55:20.388147154 +0000 UTC m=+0.203948879 container start 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:55:20 np0005625204.localdomain dnsmasq[314579]: started, version 2.85 cachesize 150
Feb 20 09:55:20 np0005625204.localdomain dnsmasq[314579]: DNS service limited to local subnets
Feb 20 09:55:20 np0005625204.localdomain dnsmasq[314579]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:20 np0005625204.localdomain dnsmasq[314579]: warning: no upstream servers configured
Feb 20 09:55:20 np0005625204.localdomain dnsmasq-dhcp[314579]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:55:20 np0005625204.localdomain dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 0 addresses
Feb 20 09:55:20 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host
Feb 20 09:55:20 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts
Feb 20 09:55:20 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:20.590 264355 INFO neutron.agent.dhcp.agent [None req-0c4b6dd7-1e41-4948-91e1-7a8e88bd4fa7 - - - - - -] DHCP configuration for ports {'c8bb9739-380a-474c-a432-b1d24c789ff6'} is completed
Feb 20 09:55:20 np0005625204.localdomain ceph-mon[301857]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 1.9 KiB/s wr, 54 op/s
Feb 20 09:55:20 np0005625204.localdomain sshd[314580]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:55:21 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:21.050 2 INFO neutron.agent.securitygroups_rpc [None req-0c0c6410-fbc5-4b85-ab45-3c003033a966 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:21 np0005625204.localdomain sshd[314580]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:55:21 np0005625204.localdomain systemd[1]: tmp-crun.N3vkPb.mount: Deactivated successfully.
Feb 20 09:55:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:22 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:22.618 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:22Z, description=, device_id=be6b9a78-3f77-4a1d-b448-ffccd7465c5d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5952670>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59524f0>], id=5af3f48a-8899-4613-9124-3d7003577e8a, ip_allocation=immediate, mac_address=fa:16:3e:09:ac:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:16Z, description=, dns_domain=, id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1040364741, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29694, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2005, status=ACTIVE, subnets=['a2044cb0-b4a9-4d72-a4e0-16c7a2c0d8d7'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:17Z, vlan_transparent=None, network_id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2036, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:22Z on network 1569f18d-1cf2-4113-a1bd-2d35906eb20f
Feb 20 09:55:22 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 e149: 6 total, 6 up, 6 in
Feb 20 09:55:22 np0005625204.localdomain dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 1 addresses
Feb 20 09:55:22 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host
Feb 20 09:55:22 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts
Feb 20 09:55:22 np0005625204.localdomain ceph-mon[301857]: pgmap v269: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 3.2 KiB/s wr, 84 op/s
Feb 20 09:55:22 np0005625204.localdomain ceph-mon[301857]: osdmap e149: 6 total, 6 up, 6 in
Feb 20 09:55:22 np0005625204.localdomain podman[314599]: 2026-02-20 09:55:22.844878189 +0000 UTC m=+0.070260313 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:22 np0005625204.localdomain systemd[1]: tmp-crun.BnEo5U.mount: Deactivated successfully.
Feb 20 09:55:22 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:55:22 np0005625204.localdomain podman[314614]: 2026-02-20 09:55:22.960643231 +0000 UTC m=+0.077923045 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:55:22 np0005625204.localdomain podman[314614]: 2026-02-20 09:55:22.969276703 +0000 UTC m=+0.086556517 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:55:22 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:55:23 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:23.197 264355 INFO neutron.agent.dhcp.agent [None req-73843a85-b37f-499a-b518-782de643ab28 - - - - - -] DHCP configuration for ports {'5af3f48a-8899-4613-9124-3d7003577e8a'} is completed
Feb 20 09:55:23 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:23.358 2 INFO neutron.agent.securitygroups_rpc [None req-6678ed04-c4d3-4555-9813-927645c955fd 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']
Feb 20 09:55:23 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:23.540 2 INFO neutron.agent.securitygroups_rpc [None req-08ae8aba-2609-40c5-899b-086a86995061 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:24.319 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:24 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:24.577 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:22Z, description=, device_id=be6b9a78-3f77-4a1d-b448-ffccd7465c5d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5976d00>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df589a4f0>], id=5af3f48a-8899-4613-9124-3d7003577e8a, ip_allocation=immediate, mac_address=fa:16:3e:09:ac:b1, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:16Z, description=, dns_domain=, id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1040364741, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=29694, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2005, status=ACTIVE, subnets=['a2044cb0-b4a9-4d72-a4e0-16c7a2c0d8d7'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:17Z, vlan_transparent=None, network_id=1569f18d-1cf2-4113-a1bd-2d35906eb20f, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2036, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:22Z on network 1569f18d-1cf2-4113-a1bd-2d35906eb20f
Feb 20 09:55:24 np0005625204.localdomain podman[314660]: 2026-02-20 09:55:24.801914462 +0000 UTC m=+0.064808467 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:55:24 np0005625204.localdomain dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 1 addresses
Feb 20 09:55:24 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host
Feb 20 09:55:24 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts
Feb 20 09:55:24 np0005625204.localdomain ceph-mon[301857]: pgmap v271: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 2.4 KiB/s wr, 49 op/s
Feb 20 09:55:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:25.074 264355 INFO neutron.agent.dhcp.agent [None req-42ee7eb4-b053-4e18-907b-4ebfb8bb6b7a - - - - - -] DHCP configuration for ports {'5af3f48a-8899-4613-9124-3d7003577e8a'} is completed
Feb 20 09:55:25 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:25.589 2 INFO neutron.agent.securitygroups_rpc [None req-65f746b2-e25c-42a6-a0af-3bc4d3abfc01 5f5cb06fd4c045b1bfc3f890392c3165 54500b20d6a643669fbf357242dde27f - - default default] Security group member updated ['b8016b60-fce7-435d-b877-28993ceda4d5']
Feb 20 09:55:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:55:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:55:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:55:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:55:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:55:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:55:26 np0005625204.localdomain dnsmasq[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/addn_hosts - 0 addresses
Feb 20 09:55:26 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/host
Feb 20 09:55:26 np0005625204.localdomain podman[314695]: 2026-02-20 09:55:26.822383312 +0000 UTC m=+0.066539100 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:55:26 np0005625204.localdomain dnsmasq-dhcp[314579]: read /var/lib/neutron/dhcp/1569f18d-1cf2-4113-a1bd-2d35906eb20f/opts
Feb 20 09:55:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:55:26 np0005625204.localdomain podman[314708]: 2026-02-20 09:55:26.939249467 +0000 UTC m=+0.090430444 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:55:26 np0005625204.localdomain podman[314708]: 2026-02-20 09:55:26.946759645 +0000 UTC m=+0.097940592 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:55:26 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:55:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:27Z|00248|binding|INFO|Releasing lport ad165bef-73b3-4f1e-864a-c401cd5b89ed from this chassis (sb_readonly=0)
Feb 20 09:55:27 np0005625204.localdomain kernel: device tapad165bef-73 left promiscuous mode
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.210 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:27Z|00249|binding|INFO|Setting lport ad165bef-73b3-4f1e-864a-c401cd5b89ed down in Southbound
Feb 20 09:55:27 np0005625204.localdomain ceph-mon[301857]: pgmap v272: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 2.1 KiB/s wr, 45 op/s
Feb 20 09:55:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:55:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.224 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1569f18d-1cf2-4113-a1bd-2d35906eb20f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=081736e4-c4cd-4a80-b501-1dcc1e64a740, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ad165bef-73b3-4f1e-864a-c401cd5b89ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.226 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ad165bef-73b3-4f1e-864a-c401cd5b89ed in datapath 1569f18d-1cf2-4113-a1bd-2d35906eb20f unbound from our chassis
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.228 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1569f18d-1cf2-4113-a1bd-2d35906eb20f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.229 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9e14a83a-d1e2-441d-8bab-67d905c9846a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.244 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:27.323 2 INFO neutron.agent.securitygroups_rpc [None req-09c13c9e-9ba3-4904-bcac-09b2f5e1651f 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:27 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:27.573 264355 INFO neutron.agent.linux.ip_lib [None req-09c5e945-f56a-45bc-a62e-02d6db468313 - - - - - -] Device tapfa0c1fb0-9b cannot be used as it has no MAC address
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.596 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain kernel: device tapfa0c1fb0-9b entered promiscuous mode
Feb 20 09:55:27 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581327.6062] manager: (tapfa0c1fb0-9b): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Feb 20 09:55:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:27Z|00250|binding|INFO|Claiming lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a for this chassis.
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:27Z|00251|binding|INFO|fa0c1fb0-9b16-4f58-88ba-a30e77afed6a: Claiming unknown
Feb 20 09:55:27 np0005625204.localdomain systemd-udevd[314749]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.619 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb276814-fa37-4413-a869-338c3a114128, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=fa0c1fb0-9b16-4f58-88ba-a30e77afed6a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.621 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fa0c1fb0-9b16-4f58-88ba-a30e77afed6a in datapath 30554b8e-97d6-457c-a559-c8a175beb267 bound to our chassis
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.626 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30554b8e-97d6-457c-a559-c8a175beb267 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:27.627 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8c22cb-949a-4307-a889-62772023e6c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:27Z|00252|binding|INFO|Setting lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a ovn-installed in OVS
Feb 20 09:55:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:27Z|00253|binding|INFO|Setting lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a up in Southbound
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.680 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:27.708 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:28 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:28.214 2 INFO neutron.agent.securitygroups_rpc [None req-df2a63c6-7355-4f99-a5c3-49ea7a77359b 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:28 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "format": "json"}]: dispatch
Feb 20 09:55:28 np0005625204.localdomain podman[314804]: 
Feb 20 09:55:28 np0005625204.localdomain podman[314804]: 2026-02-20 09:55:28.495203744 +0000 UTC m=+0.090049573 container create 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:28 np0005625204.localdomain systemd[1]: Started libpod-conmon-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad.scope.
Feb 20 09:55:28 np0005625204.localdomain podman[314804]: 2026-02-20 09:55:28.4509192 +0000 UTC m=+0.045765069 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/285bed8d22a682481e23070c8097cc1e4b38ac43cbaebd1571bb61275146e81a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:28 np0005625204.localdomain podman[314804]: 2026-02-20 09:55:28.577197571 +0000 UTC m=+0.172043400 container init 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:55:28 np0005625204.localdomain podman[314804]: 2026-02-20 09:55:28.592221817 +0000 UTC m=+0.187067666 container start 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314824]: started, version 2.85 cachesize 150
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314824]: DNS service limited to local subnets
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314824]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314824]: warning: no upstream servers configured
Feb 20 09:55:28 np0005625204.localdomain dnsmasq-dhcp[314824]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 0 addresses
Feb 20 09:55:28 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host
Feb 20 09:55:28 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts
Feb 20 09:55:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:28.659 264355 INFO neutron.agent.dhcp.agent [None req-09c5e945-f56a-45bc-a62e-02d6db468313 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:27Z, description=, device_id=cf3f380b-2884-4a30-946d-3fd2eacae5d3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df589ae20>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59a9ac0>], id=6aed2313-a15f-4b24-8da1-1ee435817e36, ip_allocation=immediate, mac_address=fa:16:3e:24:c2:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:25Z, description=, dns_domain=, id=30554b8e-97d6-457c-a559-c8a175beb267, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1940356692, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31716, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2052, status=ACTIVE, subnets=['7584d118-7e5d-407c-8050-c5c8e10e09f6'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:26Z, vlan_transparent=None, network_id=30554b8e-97d6-457c-a559-c8a175beb267, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2061, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:27Z on network 30554b8e-97d6-457c-a559-c8a175beb267
Feb 20 09:55:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:28.755 264355 INFO neutron.agent.dhcp.agent [None req-a7ceac08-d506-4028-8026-9e8d2146a00e - - - - - -] DHCP configuration for ports {'b59c76b5-90e3-41db-af11-d782540bc114'} is completed
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 1 addresses
Feb 20 09:55:28 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host
Feb 20 09:55:28 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts
Feb 20 09:55:28 np0005625204.localdomain podman[314856]: 2026-02-20 09:55:28.876317466 +0000 UTC m=+0.064032073 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:28 np0005625204.localdomain podman[314868]: 2026-02-20 09:55:28.928867371 +0000 UTC m=+0.064569370 container kill 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:55:28 np0005625204.localdomain dnsmasq[314579]: exiting on receipt of SIGTERM
Feb 20 09:55:28 np0005625204.localdomain systemd[1]: libpod-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d.scope: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain podman[314885]: 2026-02-20 09:55:29.00104072 +0000 UTC m=+0.058903178 container died 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:55:29 np0005625204.localdomain podman[314885]: 2026-02-20 09:55:29.084336237 +0000 UTC m=+0.142198685 container cleanup 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: libpod-conmon-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d.scope: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain podman[314892]: 2026-02-20 09:55:29.107420847 +0000 UTC m=+0.142127102 container remove 8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1569f18d-1cf2-4113-a1bd-2d35906eb20f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:55:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.136 264355 INFO neutron.agent.dhcp.agent [None req-64c46c8e-5526-4f37-aa14-55f8204a4df1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:29 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:29.147 2 INFO neutron.agent.securitygroups_rpc [None req-abf59bbc-b23d-40a5-812c-25dd2e2a84ba 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.219 264355 INFO neutron.agent.dhcp.agent [None req-d74ca894-b05f-4447-b3df-cf0f8ceec184 - - - - - -] DHCP configuration for ports {'6aed2313-a15f-4b24-8da1-1ee435817e36'} is completed
Feb 20 09:55:29 np0005625204.localdomain ceph-mon[301857]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 1.9 KiB/s wr, 39 op/s
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:55:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:29.366 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.375 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:27Z, description=, device_id=cf3f380b-2884-4a30-946d-3fd2eacae5d3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5952100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5952130>], id=6aed2313-a15f-4b24-8da1-1ee435817e36, ip_allocation=immediate, mac_address=fa:16:3e:24:c2:d9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:25Z, description=, dns_domain=, id=30554b8e-97d6-457c-a559-c8a175beb267, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1940356692, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=31716, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2052, status=ACTIVE, subnets=['7584d118-7e5d-407c-8050-c5c8e10e09f6'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:26Z, vlan_transparent=None, network_id=30554b8e-97d6-457c-a559-c8a175beb267, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2061, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:27Z on network 30554b8e-97d6-457c-a559-c8a175beb267
Feb 20 09:55:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.431 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:29 np0005625204.localdomain podman[314920]: 2026-02-20 09:55:29.435249583 +0000 UTC m=+0.121787315 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal)
Feb 20 09:55:29 np0005625204.localdomain podman[314920]: 2026-02-20 09:55:29.450143576 +0000 UTC m=+0.136681308 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., version=9.7, distribution-scope=public, maintainer=Red Hat, Inc., release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: tmp-crun.yQTdTo.mount: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f7724903b31542ba17b88488ef9f53de08d696e33d16bf9339b383229babbfc0-merged.mount: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e259c34dab84630480a722ca57c29990d4a74cf50573732368ee0a7f1be263d-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d1569f18d\x2d1cf2\x2d4113\x2da1bd\x2d2d35906eb20f.mount: Deactivated successfully.
Feb 20 09:55:29 np0005625204.localdomain dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 1 addresses
Feb 20 09:55:29 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host
Feb 20 09:55:29 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts
Feb 20 09:55:29 np0005625204.localdomain podman[314958]: 2026-02-20 09:55:29.582852742 +0000 UTC m=+0.060127895 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:29 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:29Z|00254|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:55:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:29.711 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:29.770 264355 INFO neutron.agent.dhcp.agent [None req-087c180b-137c-44aa-83fc-d3e7ea4c1690 - - - - - -] DHCP configuration for ports {'6aed2313-a15f-4b24-8da1-1ee435817e36'} is completed
Feb 20 09:55:30 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:30.324 2 INFO neutron.agent.securitygroups_rpc [None req-c4977344-b1fb-45a9-a725-767a5df232d2 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:31 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:31.049 2 INFO neutron.agent.securitygroups_rpc [None req-3c26b833-1dd4-4db0-a9c6-673823ae81db 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:31 np0005625204.localdomain ceph-mon[301857]: pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 4.0 KiB/s wr, 33 op/s
Feb 20 09:55:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "snap_name": "76272f77-26a4-41f3-97ff-7d9d42de32a4", "format": "json"}]: dispatch
Feb 20 09:55:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:31 np0005625204.localdomain dnsmasq[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/addn_hosts - 0 addresses
Feb 20 09:55:31 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/host
Feb 20 09:55:31 np0005625204.localdomain dnsmasq-dhcp[314824]: read /var/lib/neutron/dhcp/30554b8e-97d6-457c-a559-c8a175beb267/opts
Feb 20 09:55:31 np0005625204.localdomain podman[314997]: 2026-02-20 09:55:31.704035987 +0000 UTC m=+0.059995761 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:31 np0005625204.localdomain kernel: device tapfa0c1fb0-9b left promiscuous mode
Feb 20 09:55:31 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:31Z|00255|binding|INFO|Releasing lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a from this chassis (sb_readonly=0)
Feb 20 09:55:31 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:31Z|00256|binding|INFO|Setting lport fa0c1fb0-9b16-4f58-88ba-a30e77afed6a down in Southbound
Feb 20 09:55:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:31.924 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:31.934 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30554b8e-97d6-457c-a559-c8a175beb267', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eb276814-fa37-4413-a869-338c3a114128, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=fa0c1fb0-9b16-4f58-88ba-a30e77afed6a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:31.936 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fa0c1fb0-9b16-4f58-88ba-a30e77afed6a in datapath 30554b8e-97d6-457c-a559-c8a175beb267 unbound from our chassis
Feb 20 09:55:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:31.938 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 30554b8e-97d6-457c-a559-c8a175beb267 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:31.939 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[be52e880-932b-4a2f-b8a1-ce327e20c0a6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:31.947 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:32 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:32.585 2 INFO neutron.agent.securitygroups_rpc [None req-2e3d6688-1793-41b9-a2f2-f815ee6132fe 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:32 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:32.964 2 INFO neutron.agent.securitygroups_rpc [None req-d888e083-a1eb-4717-8e36-0999aadb5157 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:33 np0005625204.localdomain ceph-mon[301857]: pgmap v275: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 2.7 KiB/s wr, 1 op/s
Feb 20 09:55:33 np0005625204.localdomain dnsmasq[314824]: exiting on receipt of SIGTERM
Feb 20 09:55:33 np0005625204.localdomain podman[315037]: 2026-02-20 09:55:33.539421051 +0000 UTC m=+0.063316862 container kill 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:55:33 np0005625204.localdomain systemd[1]: libpod-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad.scope: Deactivated successfully.
Feb 20 09:55:33 np0005625204.localdomain podman[315050]: 2026-02-20 09:55:33.61519778 +0000 UTC m=+0.061676162 container died 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:33 np0005625204.localdomain systemd[1]: tmp-crun.JfQgfZ.mount: Deactivated successfully.
Feb 20 09:55:33 np0005625204.localdomain podman[315050]: 2026-02-20 09:55:33.650946865 +0000 UTC m=+0.097425197 container cleanup 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:55:33 np0005625204.localdomain systemd[1]: libpod-conmon-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad.scope: Deactivated successfully.
Feb 20 09:55:33 np0005625204.localdomain podman[315052]: 2026-02-20 09:55:33.70286034 +0000 UTC m=+0.137606956 container remove 4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-30554b8e-97d6-457c-a559-c8a175beb267, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 09:55:33 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:33.905 264355 INFO neutron.agent.dhcp.agent [None req-2cd93e5e-e68a-4b2b-a8e7-7ff60c86ff36 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:34 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:34.063 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "snap_name": "94e02352-6a28-4288-9072-c7133a6151bf", "format": "json"}]: dispatch
Feb 20 09:55:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:34.411 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:34 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:34.482 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-285bed8d22a682481e23070c8097cc1e4b38ac43cbaebd1571bb61275146e81a-merged.mount: Deactivated successfully.
Feb 20 09:55:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a1377db316fba48357fd01a68ce6a8adecf6b3d055aae6d4f634e52880662ad-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:34 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d30554b8e\x2d97d6\x2d457c\x2da559\x2dc8a175beb267.mount: Deactivated successfully.
Feb 20 09:55:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:34Z|00257|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:55:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:34.928 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:34 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:34.953 2 INFO neutron.agent.securitygroups_rpc [None req-3b974f9c-5163-47e8-89d2-00acf380ad82 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:35 np0005625204.localdomain ceph-mon[301857]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 477 B/s rd, 2.5 KiB/s wr, 1 op/s
Feb 20 09:55:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:55:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:55:36 np0005625204.localdomain podman[315079]: 2026-02-20 09:55:36.165666029 +0000 UTC m=+0.097950923 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:55:36 np0005625204.localdomain podman[315080]: 2026-02-20 09:55:36.242169539 +0000 UTC m=+0.170551295 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:36 np0005625204.localdomain podman[315079]: 2026-02-20 09:55:36.266380614 +0000 UTC m=+0.198665468 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:55:36 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:55:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:36 np0005625204.localdomain podman[315080]: 2026-02-20 09:55:36.320926289 +0000 UTC m=+0.249308055 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 20 09:55:36 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:55:37 np0005625204.localdomain ceph-mon[301857]: pgmap v277: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 4.4 KiB/s wr, 2 op/s
Feb 20 09:55:37 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:37.529 2 INFO neutron.agent.securitygroups_rpc [None req-a4ab168f-7329-4461-8236-8f00fb1e3c92 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:37.562 264355 INFO neutron.agent.linux.ip_lib [None req-e46b29f8-41d0-4d97-ab45-2edba2f5ff37 - - - - - -] Device tapaf75ed32-13 cannot be used as it has no MAC address
Feb 20 09:55:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:37.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:37 np0005625204.localdomain kernel: device tapaf75ed32-13 entered promiscuous mode
Feb 20 09:55:37 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581337.6317] manager: (tapaf75ed32-13): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Feb 20 09:55:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:37.632 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:37Z|00258|binding|INFO|Claiming lport af75ed32-130d-4e8e-87f9-48ee296520f0 for this chassis.
Feb 20 09:55:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:37Z|00259|binding|INFO|af75ed32-130d-4e8e-87f9-48ee296520f0: Claiming unknown
Feb 20 09:55:37 np0005625204.localdomain systemd-udevd[315133]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:37.642 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=876849fd-270a-4183-8f36-c602ceb68d2b, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=af75ed32-130d-4e8e-87f9-48ee296520f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:37.645 162652 INFO neutron.agent.ovn.metadata.agent [-] Port af75ed32-130d-4e8e-87f9-48ee296520f0 in datapath 41ccba1b-d4dd-4580-8736-703a9b44e71b bound to our chassis
Feb 20 09:55:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:37.647 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 41ccba1b-d4dd-4580-8736-703a9b44e71b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:37.648 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[4a38f513-f715-4bf8-9c80-524ddb23b111]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:37Z|00260|binding|INFO|Setting lport af75ed32-130d-4e8e-87f9-48ee296520f0 ovn-installed in OVS
Feb 20 09:55:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:37Z|00261|binding|INFO|Setting lport af75ed32-130d-4e8e-87f9-48ee296520f0 up in Southbound
Feb 20 09:55:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:37.676 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapaf75ed32-13: No such device
Feb 20 09:55:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:37.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:37.762 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "snap_name": "94e02352-6a28-4288-9072-c7133a6151bf_2956b5fe-4a3f-4e13-8c50-0bbaa50928f8", "force": true, "format": "json"}]: dispatch
Feb 20 09:55:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "snap_name": "94e02352-6a28-4288-9072-c7133a6151bf", "force": true, "format": "json"}]: dispatch
Feb 20 09:55:38 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/870555994' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:55:38 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/870555994' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:55:38 np0005625204.localdomain ceph-mon[301857]: pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 4.4 KiB/s wr, 1 op/s
Feb 20 09:55:38 np0005625204.localdomain podman[315204]: 
Feb 20 09:55:38 np0005625204.localdomain podman[315204]: 2026-02-20 09:55:38.595960602 +0000 UTC m=+0.072024586 container create 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:55:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c.scope.
Feb 20 09:55:38 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98a11e8164b248f48392f782c6ee5ccf0ba56d6a01fd60072357a42bcea10a93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:38 np0005625204.localdomain podman[315204]: 2026-02-20 09:55:38.557942999 +0000 UTC m=+0.034006983 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:38 np0005625204.localdomain podman[315204]: 2026-02-20 09:55:38.66114217 +0000 UTC m=+0.137206134 container init 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:38 np0005625204.localdomain podman[315204]: 2026-02-20 09:55:38.672162384 +0000 UTC m=+0.148226338 container start 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:38 np0005625204.localdomain dnsmasq[315222]: started, version 2.85 cachesize 150
Feb 20 09:55:38 np0005625204.localdomain dnsmasq[315222]: DNS service limited to local subnets
Feb 20 09:55:38 np0005625204.localdomain dnsmasq[315222]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:38 np0005625204.localdomain dnsmasq[315222]: warning: no upstream servers configured
Feb 20 09:55:38 np0005625204.localdomain dnsmasq-dhcp[315222]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:55:38 np0005625204.localdomain dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 0 addresses
Feb 20 09:55:38 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host
Feb 20 09:55:38 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts
Feb 20 09:55:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:38.750 264355 INFO neutron.agent.dhcp.agent [None req-a9886a1c-647f-4ea8-a821-7fccb7c50cbd - - - - - -] DHCP configuration for ports {'1d5a0285-6e9b-4ceb-bbc6-a537f6efba7b'} is completed
Feb 20 09:55:38 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:38.905 2 INFO neutron.agent.securitygroups_rpc [None req-eb4533df-2883-48ce-b913-efeaaf3f9e10 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:39.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:39 np0005625204.localdomain sudo[315223]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:55:39 np0005625204.localdomain sudo[315223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:55:39 np0005625204.localdomain sudo[315223]: pam_unix(sudo:session): session closed for user root
Feb 20 09:55:39 np0005625204.localdomain sudo[315241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:55:39 np0005625204.localdomain sudo[315241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:55:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:39.880 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:39Z, description=, device_id=a5051175-9093-4a92-8a71-099c941b94b5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58a2f70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58a2fa0>], id=adcb09ab-6dda-431c-9b20-ce19a23f1bc4, ip_allocation=immediate, mac_address=fa:16:3e:40:5f:b9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:35Z, description=, dns_domain=, id=41ccba1b-d4dd-4580-8736-703a9b44e71b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1367145819, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40210, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2091, status=ACTIVE, subnets=['cc518582-23ac-498e-b92e-77fca38ee666'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:36Z, vlan_transparent=None, network_id=41ccba1b-d4dd-4580-8736-703a9b44e71b, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2123, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:39Z on network 41ccba1b-d4dd-4580-8736-703a9b44e71b
Feb 20 09:55:40 np0005625204.localdomain podman[315292]: 2026-02-20 09:55:40.117579547 +0000 UTC m=+0.070489970 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:40 np0005625204.localdomain dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 1 addresses
Feb 20 09:55:40 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host
Feb 20 09:55:40 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts
Feb 20 09:55:40 np0005625204.localdomain sudo[315241]: pam_unix(sudo:session): session closed for user root
Feb 20 09:55:40 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:40.413 264355 INFO neutron.agent.dhcp.agent [None req-efc687d8-befb-434a-9a23-9fa8c0641eb7 - - - - - -] DHCP configuration for ports {'adcb09ab-6dda-431c-9b20-ce19a23f1bc4'} is completed
Feb 20 09:55:40 np0005625204.localdomain sudo[315327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:55:40 np0005625204.localdomain sudo[315327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:55:40 np0005625204.localdomain sudo[315327]: pam_unix(sudo:session): session closed for user root
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: pgmap v279: 177 pgs: 177 active+clean; 146 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 8.7 KiB/s wr, 3 op/s
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "snap_name": "76272f77-26a4-41f3-97ff-7d9d42de32a4_f845adef-9d7e-4723-a4a0-91acd19cabbe", "force": true, "format": "json"}]: dispatch
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "snap_name": "76272f77-26a4-41f3-97ff-7d9d42de32a4", "force": true, "format": "json"}]: dispatch
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:55:40 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:55:40 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:40.623 2 INFO neutron.agent.securitygroups_rpc [None req-7a51814a-d3bb-4d9f-a2cf-a8e1904feac9 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:41.004 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:39Z, description=, device_id=a5051175-9093-4a92-8a71-099c941b94b5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a02340>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a028b0>], id=adcb09ab-6dda-431c-9b20-ce19a23f1bc4, ip_allocation=immediate, mac_address=fa:16:3e:40:5f:b9, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:35Z, description=, dns_domain=, id=41ccba1b-d4dd-4580-8736-703a9b44e71b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-1367145819, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40210, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2091, status=ACTIVE, subnets=['cc518582-23ac-498e-b92e-77fca38ee666'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:36Z, vlan_transparent=None, network_id=41ccba1b-d4dd-4580-8736-703a9b44e71b, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2123, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:39Z on network 41ccba1b-d4dd-4580-8736-703a9b44e71b
Feb 20 09:55:41 np0005625204.localdomain dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 1 addresses
Feb 20 09:55:41 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host
Feb 20 09:55:41 np0005625204.localdomain podman[315362]: 2026-02-20 09:55:41.223199111 +0000 UTC m=+0.058704582 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:41 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts
Feb 20 09:55:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:41.468 264355 INFO neutron.agent.dhcp.agent [None req-b1c870c0-549d-463a-8614-fad00a2b7163 - - - - - -] DHCP configuration for ports {'adcb09ab-6dda-431c-9b20-ce19a23f1bc4'} is completed
Feb 20 09:55:41 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:41.673 2 INFO neutron.agent.securitygroups_rpc [None req-45da2e31-24b0-4a9e-8fbd-b3074d839731 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:41.704 264355 INFO neutron.agent.linux.ip_lib [None req-1485b343-de9e-4d46-8292-a63df142710c - - - - - -] Device tap4a535ed2-00 cannot be used as it has no MAC address
Feb 20 09:55:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:41.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:41 np0005625204.localdomain kernel: device tap4a535ed2-00 entered promiscuous mode
Feb 20 09:55:41 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581341.7371] manager: (tap4a535ed2-00): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Feb 20 09:55:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:41Z|00262|binding|INFO|Claiming lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 for this chassis.
Feb 20 09:55:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:41Z|00263|binding|INFO|4a535ed2-00be-4ec7-8d9e-24afdab13877: Claiming unknown
Feb 20 09:55:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:41.738 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:41 np0005625204.localdomain systemd-udevd[315392]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:41.750 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2f3d8db-7c26-4898-8e61-0b8ba04044df, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=4a535ed2-00be-4ec7-8d9e-24afdab13877) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:41.753 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4a535ed2-00be-4ec7-8d9e-24afdab13877 in datapath cd45d753-df49-44f6-b419-6749d7fe84f3 bound to our chassis
Feb 20 09:55:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:41.755 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cd45d753-df49-44f6-b419-6749d7fe84f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:41.756 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1d428b70-87cb-4b9b-9266-304615394891]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:41Z|00264|binding|INFO|Setting lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 ovn-installed in OVS
Feb 20 09:55:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:41Z|00265|binding|INFO|Setting lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 up in Southbound
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:41.780 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap4a535ed2-00: No such device
Feb 20 09:55:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:41.817 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:41.845 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:42 np0005625204.localdomain podman[315463]: 
Feb 20 09:55:42 np0005625204.localdomain podman[315463]: 2026-02-20 09:55:42.59008359 +0000 UTC m=+0.074162030 container create a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 20 09:55:42 np0005625204.localdomain ceph-mon[301857]: pgmap v280: 177 pgs: 177 active+clean; 146 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 7.1 KiB/s wr, 16 op/s
Feb 20 09:55:42 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e150 e150: 6 total, 6 up, 6 in
Feb 20 09:55:42 np0005625204.localdomain systemd[1]: Started libpod-conmon-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d.scope.
Feb 20 09:55:42 np0005625204.localdomain systemd[1]: tmp-crun.3FE9Zn.mount: Deactivated successfully.
Feb 20 09:55:42 np0005625204.localdomain podman[315463]: 2026-02-20 09:55:42.557210883 +0000 UTC m=+0.041289333 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:42 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:42 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8a8113fcfd1358b3db8a5596a585afb795b665e94caae01a8c63b1b24d0a203b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:42 np0005625204.localdomain podman[315463]: 2026-02-20 09:55:42.676800881 +0000 UTC m=+0.160879291 container init a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:42 np0005625204.localdomain podman[315463]: 2026-02-20 09:55:42.683205605 +0000 UTC m=+0.167284015 container start a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:42 np0005625204.localdomain dnsmasq[315482]: started, version 2.85 cachesize 150
Feb 20 09:55:42 np0005625204.localdomain dnsmasq[315482]: DNS service limited to local subnets
Feb 20 09:55:42 np0005625204.localdomain dnsmasq[315482]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:42 np0005625204.localdomain dnsmasq[315482]: warning: no upstream servers configured
Feb 20 09:55:42 np0005625204.localdomain dnsmasq-dhcp[315482]: DHCPv6, static leases only on 2001:db8:1::, lease time 1d
Feb 20 09:55:42 np0005625204.localdomain dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 0 addresses
Feb 20 09:55:42 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host
Feb 20 09:55:42 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts
Feb 20 09:55:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:42.750 264355 INFO neutron.agent.dhcp.agent [None req-1485b343-de9e-4d46-8292-a63df142710c - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:41Z, description=, device_id=7f90544d-5ad8-4f1b-8f43-7851902677f5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a02a60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d2fd0>], id=b3a465d7-0fe1-4076-a907-134fc38d292d, ip_allocation=immediate, mac_address=fa:16:3e:54:61:36, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:39Z, description=, dns_domain=, id=cd45d753-df49-44f6-b419-6749d7fe84f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-181663761, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61227, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2126, status=ACTIVE, subnets=['8ee7ce1e-019f-46a3-86af-2fedd3583e02'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:40Z, vlan_transparent=None, network_id=cd45d753-df49-44f6-b419-6749d7fe84f3, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2136, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:41Z on network cd45d753-df49-44f6-b419-6749d7fe84f3
Feb 20 09:55:42 np0005625204.localdomain sshd[315502]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:55:42 np0005625204.localdomain dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 1 addresses
Feb 20 09:55:42 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host
Feb 20 09:55:42 np0005625204.localdomain podman[315501]: 2026-02-20 09:55:42.888394721 +0000 UTC m=+0.041446978 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:55:42 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts
Feb 20 09:55:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:42.911 264355 INFO neutron.agent.dhcp.agent [None req-7d4a75c5-73c2-4ee7-b46f-db483350b090 - - - - - -] DHCP configuration for ports {'74f77219-8820-4956-b74f-503a881f77fb'} is completed
Feb 20 09:55:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:55:43 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:43.047 264355 INFO neutron.agent.dhcp.agent [None req-b5a2ed51-0a74-4077-8f10-9b935b39d47b - - - - - -] DHCP configuration for ports {'b3a465d7-0fe1-4076-a907-134fc38d292d'} is completed
Feb 20 09:55:43 np0005625204.localdomain podman[315525]: 2026-02-20 09:55:43.14547189 +0000 UTC m=+0.086634069 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 20 09:55:43 np0005625204.localdomain podman[315525]: 2026-02-20 09:55:43.162086454 +0000 UTC m=+0.103248633 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:43 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:55:43 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:43.313 2 INFO neutron.agent.securitygroups_rpc [None req-b9b9f9d2-32dc-4ef0-9ebd-12b2d2ae4ee8 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:43 np0005625204.localdomain ceph-mon[301857]: osdmap e150: 6 total, 6 up, 6 in
Feb 20 09:55:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "format": "json"}]: dispatch
Feb 20 09:55:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1ece01ec-80d8-4513-b3ec-6bcff358bcd5", "force": true, "format": "json"}]: dispatch
Feb 20 09:55:43 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:55:43 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e151 e151: 6 total, 6 up, 6 in
Feb 20 09:55:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:43.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:43.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 09:55:43 np0005625204.localdomain sshd[315502]: Invalid user sol from 45.148.10.240 port 57768
Feb 20 09:55:43 np0005625204.localdomain sshd[315502]: Connection closed by invalid user sol 45.148.10.240 port 57768 [preauth]
Feb 20 09:55:44 np0005625204.localdomain dnsmasq[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/addn_hosts - 0 addresses
Feb 20 09:55:44 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/host
Feb 20 09:55:44 np0005625204.localdomain podman[315560]: 2026-02-20 09:55:44.158851365 +0000 UTC m=+0.066380655 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:55:44 np0005625204.localdomain dnsmasq-dhcp[315222]: read /var/lib/neutron/dhcp/41ccba1b-d4dd-4580-8736-703a9b44e71b/opts
Feb 20 09:55:44 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:44.159 2 INFO neutron.agent.securitygroups_rpc [None req-6065317f-9707-4b49-a2f9-16f062a24577 061949b19d2146debcdb4e85c8db9eec b9f8945a2560410b988e395a1db7710f - - default default] Security group member updated ['9c30e397-a710-4013-bf42-b0dd9762b00a']
Feb 20 09:55:44 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:44Z|00266|binding|INFO|Releasing lport af75ed32-130d-4e8e-87f9-48ee296520f0 from this chassis (sb_readonly=0)
Feb 20 09:55:44 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:44Z|00267|binding|INFO|Setting lport af75ed32-130d-4e8e-87f9-48ee296520f0 down in Southbound
Feb 20 09:55:44 np0005625204.localdomain kernel: device tapaf75ed32-13 left promiscuous mode
Feb 20 09:55:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:44.383 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:44.386 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:41Z, description=, device_id=7f90544d-5ad8-4f1b-8f43-7851902677f5, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62af430>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5952a00>], id=b3a465d7-0fe1-4076-a907-134fc38d292d, ip_allocation=immediate, mac_address=fa:16:3e:54:61:36, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:39Z, description=, dns_domain=, id=cd45d753-df49-44f6-b419-6749d7fe84f3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-181663761, port_security_enabled=True, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=61227, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2126, status=ACTIVE, subnets=['8ee7ce1e-019f-46a3-86af-2fedd3583e02'], tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:40Z, vlan_transparent=None, network_id=cd45d753-df49-44f6-b419-6749d7fe84f3, port_security_enabled=False, project_id=1c44e13adebb4610b7c0cd2fdc62a5b7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2136, status=DOWN, tags=[], tenant_id=1c44e13adebb4610b7c0cd2fdc62a5b7, updated_at=2026-02-20T09:55:41Z on network cd45d753-df49-44f6-b419-6749d7fe84f3
Feb 20 09:55:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:44.405 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:44.403 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-41ccba1b-d4dd-4580-8736-703a9b44e71b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=876849fd-270a-4183-8f36-c602ceb68d2b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=af75ed32-130d-4e8e-87f9-48ee296520f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:44.406 162652 INFO neutron.agent.ovn.metadata.agent [-] Port af75ed32-130d-4e8e-87f9-48ee296520f0 in datapath 41ccba1b-d4dd-4580-8736-703a9b44e71b unbound from our chassis
Feb 20 09:55:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:44.409 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 41ccba1b-d4dd-4580-8736-703a9b44e71b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:55:44 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:44.410 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c2510a50-4f86-4615-9571-5c25b9ddb593]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:44.460 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:44 np0005625204.localdomain dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 1 addresses
Feb 20 09:55:44 np0005625204.localdomain podman[315601]: 2026-02-20 09:55:44.594696518 +0000 UTC m=+0.060556399 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:44 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host
Feb 20 09:55:44 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts
Feb 20 09:55:44 np0005625204.localdomain ceph-mon[301857]: pgmap v282: 177 pgs: 177 active+clean; 146 MiB data, 791 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 8.5 KiB/s wr, 20 op/s
Feb 20 09:55:44 np0005625204.localdomain ceph-mon[301857]: osdmap e151: 6 total, 6 up, 6 in
Feb 20 09:55:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:44.826 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:44 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:44.895 264355 INFO neutron.agent.dhcp.agent [None req-446cb246-573f-40f5-9b5c-a84bb73bc1bc - - - - - -] DHCP configuration for ports {'b3a465d7-0fe1-4076-a907-134fc38d292d'} is completed
Feb 20 09:55:45 np0005625204.localdomain dnsmasq[315222]: exiting on receipt of SIGTERM
Feb 20 09:55:45 np0005625204.localdomain systemd[1]: libpod-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c.scope: Deactivated successfully.
Feb 20 09:55:45 np0005625204.localdomain podman[315639]: 2026-02-20 09:55:45.195714112 +0000 UTC m=+0.062569019 container kill 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:45 np0005625204.localdomain podman[315651]: 2026-02-20 09:55:45.266769008 +0000 UTC m=+0.059162006 container died 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:55:45 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c-userdata-shm.mount: Deactivated successfully.
Feb 20 09:55:45 np0005625204.localdomain podman[315651]: 2026-02-20 09:55:45.303162092 +0000 UTC m=+0.095555040 container cleanup 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:45 np0005625204.localdomain systemd[1]: libpod-conmon-427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c.scope: Deactivated successfully.
Feb 20 09:55:45 np0005625204.localdomain podman[315653]: 2026-02-20 09:55:45.355567631 +0000 UTC m=+0.137386129 container remove 427ccf657c31cd2bec09ec10ff8e3baf74ca1fb1f855e99cf8a9f70de9be153c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-41ccba1b-d4dd-4580-8736-703a9b44e71b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 20 09:55:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:45.387 264355 INFO neutron.agent.dhcp.agent [None req-61f6c2fa-305a-4b01-bf8f-36b14b468a72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:45 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:45.388 264355 INFO neutron.agent.dhcp.agent [None req-61f6c2fa-305a-4b01-bf8f-36b14b468a72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:55:45 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:45Z|00268|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:55:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:45.624 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:46 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-98a11e8164b248f48392f782c6ee5ccf0ba56d6a01fd60072357a42bcea10a93-merged.mount: Deactivated successfully.
Feb 20 09:55:46 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d41ccba1b\x2dd4dd\x2d4580\x2d8736\x2d703a9b44e71b.mount: Deactivated successfully.
Feb 20 09:55:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:46 np0005625204.localdomain ceph-mon[301857]: pgmap v284: 177 pgs: 177 active+clean; 146 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 23 KiB/s wr, 27 op/s
Feb 20 09:55:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:47.155 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:47.157 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:55:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:47.160 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:55:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:47.161 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[e92f375c-9ce6-47af-bc98-b68cdb489ca1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:47 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:47.487 2 INFO neutron.agent.securitygroups_rpc [None req-4aebfeae-a91f-4757-b68b-fe43601c173b 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:55:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:55:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:55:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157076 "" "Go-http-client/1.1"
Feb 20 09:55:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:55:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18807 "" "Go-http-client/1.1"
Feb 20 09:55:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 e152: 6 total, 6 up, 6 in
Feb 20 09:55:48 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:48.616 2 INFO neutron.agent.securitygroups_rpc [None req-f40f35c0-4148-4e05-a7c2-3455638d7684 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:48 np0005625204.localdomain ceph-mon[301857]: pgmap v285: 177 pgs: 177 active+clean; 146 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 16 KiB/s wr, 24 op/s
Feb 20 09:55:48 np0005625204.localdomain ceph-mon[301857]: osdmap e152: 6 total, 6 up, 6 in
Feb 20 09:55:48 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:48.992 2 INFO neutron.agent.securitygroups_rpc [None req-b8e91bcb-8145-4128-b349-c2aa5e79d87e 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:49.497 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:49.515 264355 INFO neutron.agent.linux.ip_lib [None req-5508e5c9-f288-401d-b9b9-5d3584cd4c63 - - - - - -] Device tap36eed41e-8c cannot be used as it has no MAC address
Feb 20 09:55:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:49.530 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:49 np0005625204.localdomain kernel: device tap36eed41e-8c entered promiscuous mode
Feb 20 09:55:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:49.536 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:49 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581349.5394] manager: (tap36eed41e-8c): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Feb 20 09:55:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:49Z|00269|binding|INFO|Claiming lport 36eed41e-8c87-4e39-8638-1f088c4d480e for this chassis.
Feb 20 09:55:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:49Z|00270|binding|INFO|36eed41e-8c87-4e39-8638-1f088c4d480e: Claiming unknown
Feb 20 09:55:49 np0005625204.localdomain systemd-udevd[315691]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:49.547 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24080264-67e8-4d16-abbb-0767714bc8ff, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=36eed41e-8c87-4e39-8638-1f088c4d480e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:49.549 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 36eed41e-8c87-4e39-8638-1f088c4d480e in datapath 7d64da0e-050b-4b53-8861-874f3c3ef083 bound to our chassis
Feb 20 09:55:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:49.551 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port c9940588-ae7f-4658-b111-18cf96086819 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:55:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:49.551 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d64da0e-050b-4b53-8861-874f3c3ef083, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:55:49 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:49.552 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2b89a3-6560-4fe0-a8cb-9adb7fa4c5f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:49Z|00271|binding|INFO|Setting lport 36eed41e-8c87-4e39-8638-1f088c4d480e ovn-installed in OVS
Feb 20 09:55:49 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:49Z|00272|binding|INFO|Setting lport 36eed41e-8c87-4e39-8638-1f088c4d480e up in Southbound
Feb 20 09:55:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:49.570 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap36eed41e-8c: No such device
Feb 20 09:55:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:49.599 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:49.625 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:50 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:50.051 2 INFO neutron.agent.securitygroups_rpc [None req-005508de-f69c-4da2-9936-4351a4d76fde 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:50 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:50.175 2 INFO neutron.agent.securitygroups_rpc [None req-8c2a2fb8-3a21-4d90-b744-6c75dba74fae f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:50 np0005625204.localdomain podman[315762]: 
Feb 20 09:55:50 np0005625204.localdomain podman[315762]: 2026-02-20 09:55:50.468980469 +0000 UTC m=+0.092251720 container create 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:55:50 np0005625204.localdomain systemd[1]: Started libpod-conmon-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8.scope.
Feb 20 09:55:50 np0005625204.localdomain podman[315762]: 2026-02-20 09:55:50.422022364 +0000 UTC m=+0.045293645 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:50 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:50 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e2bd54cd7424e1b7846b66bd6e6847e221583891328681be5cb3316924e217c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:50 np0005625204.localdomain podman[315762]: 2026-02-20 09:55:50.553218734 +0000 UTC m=+0.176490045 container init 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 09:55:50 np0005625204.localdomain podman[315762]: 2026-02-20 09:55:50.564505176 +0000 UTC m=+0.187776437 container start 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:55:50 np0005625204.localdomain dnsmasq[315780]: started, version 2.85 cachesize 150
Feb 20 09:55:50 np0005625204.localdomain dnsmasq[315780]: DNS service limited to local subnets
Feb 20 09:55:50 np0005625204.localdomain dnsmasq[315780]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:50 np0005625204.localdomain dnsmasq[315780]: warning: no upstream servers configured
Feb 20 09:55:50 np0005625204.localdomain dnsmasq-dhcp[315780]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:55:50 np0005625204.localdomain dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 0 addresses
Feb 20 09:55:50 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 09:55:50 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 09:55:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:50.623 264355 INFO neutron.agent.dhcp.agent [None req-20df167f-7f80-49d5-8aed-c087e176ca79 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:49Z, description=, device_id=40707009-5dc5-44c2-8d25-acba20c2e4ac, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58c25b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58c24c0>], id=3ba2d20e-026f-45bd-b5aa-4b36de93e613, ip_allocation=immediate, mac_address=fa:16:3e:9a:3d:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:46Z, description=, dns_domain=, id=7d64da0e-050b-4b53-8861-874f3c3ef083, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-38954291, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37042, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2154, status=ACTIVE, subnets=['0c6212d7-4953-4ba5-8041-3a0436e8149b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:47Z, vlan_transparent=None, network_id=7d64da0e-050b-4b53-8861-874f3c3ef083, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2176, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:49Z on network 7d64da0e-050b-4b53-8861-874f3c3ef083
Feb 20 09:55:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:50.782 264355 INFO neutron.agent.dhcp.agent [None req-4295fb46-a102-499c-826c-250329e900f8 - - - - - -] DHCP configuration for ports {'b804fc28-6e7e-40dc-ac07-67ec6e857bfe'} is completed
Feb 20 09:55:50 np0005625204.localdomain ceph-mon[301857]: pgmap v287: 177 pgs: 177 active+clean; 146 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 1.2 MiB/s rd, 19 KiB/s wr, 6 op/s
Feb 20 09:55:50 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:50.912 2 INFO neutron.agent.securitygroups_rpc [None req-8cf476ac-f2bb-4715-a401-741101924898 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:50 np0005625204.localdomain dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 1 addresses
Feb 20 09:55:50 np0005625204.localdomain podman[315798]: 2026-02-20 09:55:50.948969751 +0000 UTC m=+0.060597120 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:55:50 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 09:55:50 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 09:55:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:51.235 264355 INFO neutron.agent.dhcp.agent [None req-d41ba51e-a2b4-40d5-8fd2-3dd95a0ca548 - - - - - -] DHCP configuration for ports {'3ba2d20e-026f-45bd-b5aa-4b36de93e613'} is completed
Feb 20 09:55:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:51 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:55:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:52.136 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:52 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:52.162 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:49Z, description=, device_id=40707009-5dc5-44c2-8d25-acba20c2e4ac, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df625ebb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5861430>], id=3ba2d20e-026f-45bd-b5aa-4b36de93e613, ip_allocation=immediate, mac_address=fa:16:3e:9a:3d:89, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:46Z, description=, dns_domain=, id=7d64da0e-050b-4b53-8861-874f3c3ef083, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-38954291, port_security_enabled=True, project_id=8aa5b5a34cfe458d96fea87261361db1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=37042, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2154, status=ACTIVE, subnets=['0c6212d7-4953-4ba5-8041-3a0436e8149b'], tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:47Z, vlan_transparent=None, network_id=7d64da0e-050b-4b53-8861-874f3c3ef083, port_security_enabled=False, project_id=8aa5b5a34cfe458d96fea87261361db1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2176, status=DOWN, tags=[], tenant_id=8aa5b5a34cfe458d96fea87261361db1, updated_at=2026-02-20T09:55:49Z on network 7d64da0e-050b-4b53-8861-874f3c3ef083
Feb 20 09:55:52 np0005625204.localdomain dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 1 addresses
Feb 20 09:55:52 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 09:55:52 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 09:55:52 np0005625204.localdomain podman[315837]: 2026-02-20 09:55:52.370754357 +0000 UTC m=+0.056160504 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:52.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:52 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:52.575 264355 INFO neutron.agent.dhcp.agent [None req-a0869f93-dc3b-4277-9bd9-4d40b4467b17 - - - - - -] DHCP configuration for ports {'3ba2d20e-026f-45bd-b5aa-4b36de93e613'} is completed
Feb 20 09:55:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:52.617 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:52.619 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:55:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:52.622 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:55:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:52.623 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[bfc348c9-3d8e-4abd-94cb-6509f2c6e23f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:52 np0005625204.localdomain ceph-mon[301857]: pgmap v288: 177 pgs: 177 active+clean; 146 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 16 KiB/s wr, 14 op/s
Feb 20 09:55:52 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:55:52 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "format": "json"}]: dispatch
Feb 20 09:55:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:55:53 np0005625204.localdomain podman[315859]: 2026-02-20 09:55:53.143138959 +0000 UTC m=+0.084152614 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:55:53 np0005625204.localdomain podman[315859]: 2026-02-20 09:55:53.15997719 +0000 UTC m=+0.100990855 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:55:53 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:55:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:54.526 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:54.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:54 np0005625204.localdomain ceph-mon[301857]: pgmap v289: 177 pgs: 177 active+clean; 146 MiB data, 792 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 13 KiB/s wr, 11 op/s
Feb 20 09:55:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/223948785' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:55.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:55.770 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:55:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:55.771 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:55:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:55.771 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:55:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:55.772 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:55:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:55.772 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:55:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "snap_name": "da20d4b5-2abc-49bc-a78c-39ce3cdadf16", "format": "json"}]: dispatch
Feb 20 09:55:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/469548279' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:55:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/469548279' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:55:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2497222268' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:56.004 2 INFO neutron.agent.securitygroups_rpc [None req-29e4bc85-2be1-46ee-a8e4-a169ea695f47 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.026 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.028 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.033 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.034 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[80525b26-302b-4de2-8a53-15c0da1d6682]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:55:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1770512086' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.260 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:55:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.342 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.343 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:55:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:56.459 264355 INFO neutron.agent.linux.ip_lib [None req-f02ad7d0-6a8a-4737-b433-d3516658bae2 - - - - - -] Device tap7fb0a486-a6 cannot be used as it has no MAC address
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.481 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:56 np0005625204.localdomain kernel: device tap7fb0a486-a6 entered promiscuous mode
Feb 20 09:55:56 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581356.4864] manager: (tap7fb0a486-a6): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.489 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:56Z|00273|binding|INFO|Claiming lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 for this chassis.
Feb 20 09:55:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:56Z|00274|binding|INFO|7fb0a486-a66c-4d37-81fc-183b9c067f22: Claiming unknown
Feb 20 09:55:56 np0005625204.localdomain systemd-udevd[315914]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.500 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a09b3407c3143c8ae0948ccb18c1e61', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9d9638a-132c-434c-893e-fbb0a0a85486, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=7fb0a486-a66c-4d37-81fc-183b9c067f22) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.502 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fb0a486-a66c-4d37-81fc-183b9c067f22 in datapath 477c60d5-1ced-40c8-b389-807eea4d8a62 bound to our chassis
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.504 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 477c60d5-1ced-40c8-b389-807eea4d8a62 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.505 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[feff1754-0592-4af4-885c-9bfc920581fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:56Z|00275|binding|INFO|Setting lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 ovn-installed in OVS
Feb 20 09:55:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:55:56Z|00276|binding|INFO|Setting lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 up in Southbound
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.522 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:56.532 2 INFO neutron.agent.securitygroups_rpc [None req-35ffc450-9844-472b-bd23-e1de49029696 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap7fb0a486-a6: No such device
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.550 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.551 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11323MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.551 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.552 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.554 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:55:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:55:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:55:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:55:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:55:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.586 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.658 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.727 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.744 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:55:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:56.745 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.746 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:55:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:55:56.748 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:55:56 np0005625204.localdomain ceph-mon[301857]: pgmap v290: 177 pgs: 177 active+clean; 192 MiB data, 856 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 52 op/s
Feb 20 09:55:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1770512086' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:56 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:56.986 2 INFO neutron.agent.securitygroups_rpc [None req-add9b10e-24c8-47e6-9727-38256205ffd5 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:55:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:55:57 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/438322960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:57 np0005625204.localdomain systemd[1]: tmp-crun.M2yvpP.mount: Deactivated successfully.
Feb 20 09:55:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:57.165 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:55:57 np0005625204.localdomain podman[315983]: 2026-02-20 09:55:57.164991219 +0000 UTC m=+0.102188512 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:55:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:57.172 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:55:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:57.193 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:55:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:57.194 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:55:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:57.195 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:55:57 np0005625204.localdomain podman[315983]: 2026-02-20 09:55:57.201988301 +0000 UTC m=+0.139185604 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:55:57 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:55:57 np0005625204.localdomain podman[316030]: 
Feb 20 09:55:57 np0005625204.localdomain podman[316030]: 2026-02-20 09:55:57.319246739 +0000 UTC m=+0.069723837 container create 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:55:57 np0005625204.localdomain systemd[1]: Started libpod-conmon-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01.scope.
Feb 20 09:55:57 np0005625204.localdomain podman[316030]: 2026-02-20 09:55:57.278209973 +0000 UTC m=+0.028687111 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:55:57 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:55:57 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f8187cf9303c1f059ae001862a85201eb317f04d51767ad21701d4f209390a5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:55:57 np0005625204.localdomain podman[316030]: 2026-02-20 09:55:57.394664967 +0000 UTC m=+0.145142065 container init 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:55:57 np0005625204.localdomain podman[316030]: 2026-02-20 09:55:57.403757583 +0000 UTC m=+0.154234681 container start 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:55:57 np0005625204.localdomain dnsmasq[316049]: started, version 2.85 cachesize 150
Feb 20 09:55:57 np0005625204.localdomain dnsmasq[316049]: DNS service limited to local subnets
Feb 20 09:55:57 np0005625204.localdomain dnsmasq[316049]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:55:57 np0005625204.localdomain dnsmasq[316049]: warning: no upstream servers configured
Feb 20 09:55:57 np0005625204.localdomain dnsmasq-dhcp[316049]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:55:57 np0005625204.localdomain dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 0 addresses
Feb 20 09:55:57 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host
Feb 20 09:55:57 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts
Feb 20 09:55:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:57.425 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:57.467 264355 INFO neutron.agent.dhcp.agent [None req-f02ad7d0-6a8a-4737-b433-d3516658bae2 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:56Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62ca370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62cae20>], id=24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb, ip_allocation=immediate, mac_address=fa:16:3e:2d:41:23, name=tempest-PortsIpV6TestJSON-708453411, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:55:52Z, description=, dns_domain=, id=477c60d5-1ced-40c8-b389-807eea4d8a62, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1206391875, port_security_enabled=True, project_id=3a09b3407c3143c8ae0948ccb18c1e61, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=27615, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2181, status=ACTIVE, subnets=['005984a6-885d-4a90-94c5-fdf65af6044a'], tags=[], tenant_id=3a09b3407c3143c8ae0948ccb18c1e61, updated_at=2026-02-20T09:55:54Z, vlan_transparent=None, network_id=477c60d5-1ced-40c8-b389-807eea4d8a62, port_security_enabled=True, project_id=3a09b3407c3143c8ae0948ccb18c1e61, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['b2e5856c-f1df-4bbc-8f9c-41698aa249c6'], standard_attr_id=2213, status=DOWN, tags=[], tenant_id=3a09b3407c3143c8ae0948ccb18c1e61, updated_at=2026-02-20T09:55:56Z on network 477c60d5-1ced-40c8-b389-807eea4d8a62
Feb 20 09:55:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:57.542 264355 INFO neutron.agent.dhcp.agent [None req-f2ffb705-493e-40d9-a18a-d200026d8a4c - - - - - -] DHCP configuration for ports {'0e3d4067-022e-417e-90d3-44d8efd32e90'} is completed
Feb 20 09:55:57 np0005625204.localdomain dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 1 addresses
Feb 20 09:55:57 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host
Feb 20 09:55:57 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts
Feb 20 09:55:57 np0005625204.localdomain podman[316066]: 2026-02-20 09:55:57.65583663 +0000 UTC m=+0.056983009 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:55:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:55:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "format": "json"}]: dispatch
Feb 20 09:55:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:55:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/438322960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:55:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:55:57.928 264355 INFO neutron.agent.dhcp.agent [None req-44f0482b-cbae-46f2-8704-a715503832e3 - - - - - -] DHCP configuration for ports {'24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb'} is completed
Feb 20 09:55:58 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:58.482 2 INFO neutron.agent.securitygroups_rpc [None req-a5703a13-6375-4e7f-aba2-f531a9b12f0a f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:55:58 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:58.863 2 INFO neutron.agent.securitygroups_rpc [None req-ca73f77f-8256-4f4e-b317-bc2e72fd527f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:55:58 np0005625204.localdomain ceph-mon[301857]: pgmap v291: 177 pgs: 177 active+clean; 192 MiB data, 856 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 52 op/s
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.177 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.177 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:59 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:55:59.670 2 INFO neutron.agent.securitygroups_rpc [None req-ca73f77f-8256-4f4e-b317-bc2e72fd527f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.719 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.740 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.741 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:55:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:55:59.818 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:55:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "snap_name": "da20d4b5-2abc-49bc-a78c-39ce3cdadf16", "target_sub_name": "f9ac42b7-680c-41fc-8784-6176baa738f7", "format": "json"}]: dispatch
Feb 20 09:55:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f9ac42b7-680c-41fc-8784-6176baa738f7", "format": "json"}]: dispatch
Feb 20 09:56:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:56:00 np0005625204.localdomain podman[316086]: 2026-02-20 09:56:00.141753761 +0000 UTC m=+0.080279187 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.7, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, build-date=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc.)
Feb 20 09:56:00 np0005625204.localdomain podman[316086]: 2026-02-20 09:56:00.154666642 +0000 UTC m=+0.093192098 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, managed_by=edpm_ansible, release=1770267347, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, distribution-scope=public, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7)
Feb 20 09:56:00 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:56:00 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:00.493 2 INFO neutron.agent.securitygroups_rpc [None req-78688fc9-1f65-4ea8-8870-27e8d247cb32 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:00 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:00.586 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:55:56Z, description=, device_id=e10ccfe8-c211-4168-9be0-1240dc9757c2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df596f0a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df596f130>], id=24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb, ip_allocation=immediate, mac_address=fa:16:3e:2d:41:23, name=tempest-PortsIpV6TestJSON-708453411, network_id=477c60d5-1ced-40c8-b389-807eea4d8a62, port_security_enabled=True, project_id=3a09b3407c3143c8ae0948ccb18c1e61, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['b2e5856c-f1df-4bbc-8f9c-41698aa249c6'], standard_attr_id=2213, status=ACTIVE, tags=[], tenant_id=3a09b3407c3143c8ae0948ccb18c1e61, updated_at=2026-02-20T09:55:58Z on network 477c60d5-1ced-40c8-b389-807eea4d8a62
Feb 20 09:56:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:00 np0005625204.localdomain dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 1 addresses
Feb 20 09:56:00 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host
Feb 20 09:56:00 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts
Feb 20 09:56:00 np0005625204.localdomain podman[316124]: 2026-02-20 09:56:00.786541443 +0000 UTC m=+0.072091229 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:00 np0005625204.localdomain ceph-mon[301857]: pgmap v292: 177 pgs: 177 active+clean; 177 MiB data, 855 MiB used, 41 GiB / 42 GiB avail; 1.8 MiB/s rd, 1.8 MiB/s wr, 58 op/s
Feb 20 09:56:00 np0005625204.localdomain ceph-mon[301857]: mgrmap e49: np0005625202.arwxwo(active, since 7m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:56:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e1c014f1-0af0-4079-b9b5-c123fb6102a7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:56:01 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:01.172 264355 INFO neutron.agent.dhcp.agent [None req-09304e15-4b73-489c-886b-10c76f865860 - - - - - -] DHCP configuration for ports {'24ebb8b1-2361-48b3-b58d-97f9d2e4f4eb'} is completed
Feb 20 09:56:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:01 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:01.576 2 INFO neutron.agent.securitygroups_rpc [None req-a4e037d0-314c-4de3-aad9-537a96cc703d 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1014543227' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1014543227' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:02.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: pgmap v293: 177 pgs: 177 active+clean; 146 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 1.1 MiB/s rd, 1.8 MiB/s wr, 60 op/s
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e1c014f1-0af0-4079-b9b5-c123fb6102a7", "format": "json"}]: dispatch
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1014543227' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1014543227' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:56:03 np0005625204.localdomain sshd[316146]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:56:03 np0005625204.localdomain sshd[316146]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.823 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.824 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.824 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:56:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:03.825 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:56:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:03.894 2 INFO neutron.agent.securitygroups_rpc [None req-7d0bf5e0-9e1d-414c-8190-249e450828ca 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:56:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "e1c014f1-0af0-4079-b9b5-c123fb6102a7", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 20 09:56:04 np0005625204.localdomain dnsmasq[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/addn_hosts - 0 addresses
Feb 20 09:56:04 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/host
Feb 20 09:56:04 np0005625204.localdomain dnsmasq-dhcp[316049]: read /var/lib/neutron/dhcp/477c60d5-1ced-40c8-b389-807eea4d8a62/opts
Feb 20 09:56:04 np0005625204.localdomain podman[316165]: 2026-02-20 09:56:04.15538023 +0000 UTC m=+0.080017158 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.193 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.195 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.199 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.200 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[5d56de93-b2bf-4df5-a223-4db45aaeeec4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:04 np0005625204.localdomain kernel: device tap7fb0a486-a6 left promiscuous mode
Feb 20 09:56:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:04Z|00277|binding|INFO|Releasing lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 from this chassis (sb_readonly=0)
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.373 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:04Z|00278|binding|INFO|Setting lport 7fb0a486-a66c-4d37-81fc-183b9c067f22 down in Southbound
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.384 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-477c60d5-1ced-40c8-b389-807eea4d8a62', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a09b3407c3143c8ae0948ccb18c1e61', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d9d9638a-132c-434c-893e-fbb0a0a85486, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=7fb0a486-a66c-4d37-81fc-183b9c067f22) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.386 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 7fb0a486-a66c-4d37-81fc-183b9c067f22 in datapath 477c60d5-1ced-40c8-b389-807eea4d8a62 unbound from our chassis
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.388 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 477c60d5-1ced-40c8-b389-807eea4d8a62 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:04.390 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[26979788-d58f-4efa-bfd3-98dde28de985]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.399 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.574 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.577 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.647 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.663 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.664 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.665 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.665 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 09:56:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:04.682 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 09:56:05 np0005625204.localdomain ceph-mon[301857]: pgmap v294: 177 pgs: 177 active+clean; 146 MiB data, 806 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 1.8 MiB/s wr, 54 op/s
Feb 20 09:56:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1963234622' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/462116900' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/462116900' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/542996593' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:06.019 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:56:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:06.019 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:56:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:56:06 np0005625204.localdomain podman[316205]: 2026-02-20 09:56:06.142029532 +0000 UTC m=+0.058415033 container kill 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: tmp-crun.Oprtiq.mount: Deactivated successfully.
Feb 20 09:56:06 np0005625204.localdomain dnsmasq[316049]: exiting on receipt of SIGTERM
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: libpod-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01.scope: Deactivated successfully.
Feb 20 09:56:06 np0005625204.localdomain podman[316217]: 2026-02-20 09:56:06.200824156 +0000 UTC m=+0.044619285 container died 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 20 09:56:06 np0005625204.localdomain podman[316217]: 2026-02-20 09:56:06.251910966 +0000 UTC m=+0.095706075 container cleanup 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:06.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: libpod-conmon-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01.scope: Deactivated successfully.
Feb 20 09:56:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:06 np0005625204.localdomain podman[316219]: 2026-02-20 09:56:06.305478581 +0000 UTC m=+0.140599926 container remove 59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-477c60d5-1ced-40c8-b389-807eea4d8a62, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:56:06 np0005625204.localdomain podman[316246]: 2026-02-20 09:56:06.406595129 +0000 UTC m=+0.086306730 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:56:06 np0005625204.localdomain podman[316246]: 2026-02-20 09:56:06.474220721 +0000 UTC m=+0.153932302 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:56:06 np0005625204.localdomain podman[316274]: 2026-02-20 09:56:06.557077325 +0000 UTC m=+0.085040562 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 20 09:56:06 np0005625204.localdomain podman[316274]: 2026-02-20 09:56:06.561480918 +0000 UTC m=+0.089444005 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:56:06 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:56:06 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:06.578 264355 INFO neutron.agent.dhcp.agent [None req-14e7e21d-85db-4fcd-b44f-93fe00750199 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:06 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:06.578 264355 INFO neutron.agent.dhcp.agent [None req-14e7e21d-85db-4fcd-b44f-93fe00750199 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:07 np0005625204.localdomain ceph-mon[301857]: pgmap v295: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 1.8 MiB/s wr, 70 op/s
Feb 20 09:56:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:07.114 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:07 np0005625204.localdomain systemd[1]: tmp-crun.k88TM2.mount: Deactivated successfully.
Feb 20 09:56:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-2f8187cf9303c1f059ae001862a85201eb317f04d51767ad21701d4f209390a5-merged.mount: Deactivated successfully.
Feb 20 09:56:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59582cd1fd7b2aa6e69e2e9c0504ce1cbf0dadd137a364545dacd6bf6c1fbb01-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:07 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d477c60d5\x2d1ced\x2d40c8\x2db389\x2d807eea4d8a62.mount: Deactivated successfully.
Feb 20 09:56:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:07 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/766486190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:07 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/766486190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:07.576 264355 INFO neutron.agent.linux.ip_lib [None req-18a669aa-7587-498f-825f-6b145f8a4193 - - - - - -] Device tapab7745cf-b0 cannot be used as it has no MAC address
Feb 20 09:56:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:07Z|00279|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.639 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain kernel: device tapab7745cf-b0 entered promiscuous mode
Feb 20 09:56:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:07Z|00280|binding|INFO|Claiming lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 for this chassis.
Feb 20 09:56:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:07Z|00281|binding|INFO|ab7745cf-b091-4696-a4c2-1807d5c5fc66: Claiming unknown
Feb 20 09:56:07 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581367.6578] manager: (tapab7745cf-b0): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain systemd-udevd[316302]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:07.671 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b30598c7-87a8-481e-9e41-a7365e7f8781, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ab7745cf-b091-4696-a4c2-1807d5c5fc66) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:07.673 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ab7745cf-b091-4696-a4c2-1807d5c5fc66 in datapath a882f5ec-b867-48ca-837a-0ff3e12032b5 bound to our chassis
Feb 20 09:56:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:07.676 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 66774e6a-d590-4cbf-92dc-54d8452fe968 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:56:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:07.676 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a882f5ec-b867-48ca-837a-0ff3e12032b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:07.677 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[437fc9ac-194d-4337-9a93-3a4d5c38b098]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.699 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:07Z|00282|binding|INFO|Setting lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 ovn-installed in OVS
Feb 20 09:56:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:07Z|00283|binding|INFO|Setting lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 up in Southbound
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.702 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapab7745cf-b0: No such device
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:07.775 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:07 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:07.888 2 INFO neutron.agent.securitygroups_rpc [None req-8965acb2-2b16-4d89-a227-154eee5fe38f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e1c014f1-0af0-4079-b9b5-c123fb6102a7", "format": "json"}]: dispatch
Feb 20 09:56:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e1c014f1-0af0-4079-b9b5-c123fb6102a7", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/766486190' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/766486190' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:08.373 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:08.376 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:56:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:08.380 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:08.381 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3002a9e2-03c4-447f-b62c-7d0e1ae9cc6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:08 np0005625204.localdomain podman[316373]: 
Feb 20 09:56:08 np0005625204.localdomain podman[316373]: 2026-02-20 09:56:08.669111011 +0000 UTC m=+0.077698518 container create 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:56:08 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:08.679 2 INFO neutron.agent.securitygroups_rpc [None req-d51c801f-c66b-4697-8723-78081587d201 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:08 np0005625204.localdomain systemd[1]: Started libpod-conmon-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2.scope.
Feb 20 09:56:08 np0005625204.localdomain podman[316373]: 2026-02-20 09:56:08.632271403 +0000 UTC m=+0.040858980 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:56:08 np0005625204.localdomain systemd[1]: tmp-crun.g53Swh.mount: Deactivated successfully.
Feb 20 09:56:08 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:56:08 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/72d8d3ddae4f030671431c2b14aac6be84d7b25fbbaca5b27db499d762216d2d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:56:08 np0005625204.localdomain podman[316373]: 2026-02-20 09:56:08.763081052 +0000 UTC m=+0.171668589 container init 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 20 09:56:08 np0005625204.localdomain podman[316373]: 2026-02-20 09:56:08.774116957 +0000 UTC m=+0.182704494 container start 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:08 np0005625204.localdomain dnsmasq[316391]: started, version 2.85 cachesize 150
Feb 20 09:56:08 np0005625204.localdomain dnsmasq[316391]: DNS service limited to local subnets
Feb 20 09:56:08 np0005625204.localdomain dnsmasq[316391]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:56:08 np0005625204.localdomain dnsmasq[316391]: warning: no upstream servers configured
Feb 20 09:56:08 np0005625204.localdomain dnsmasq-dhcp[316391]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:56:08 np0005625204.localdomain dnsmasq[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/addn_hosts - 0 addresses
Feb 20 09:56:08 np0005625204.localdomain dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/host
Feb 20 09:56:08 np0005625204.localdomain dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/opts
Feb 20 09:56:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:08.832 264355 INFO neutron.agent.dhcp.agent [None req-954de5ce-1da6-4437-9c3c-d61063b0cd37 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:07Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62ca370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5bb21f0>], id=672227c5-4720-4d41-a0d6-6e9f5b528e92, ip_allocation=immediate, mac_address=fa:16:3e:0c:3e:77, name=tempest-PortsTestJSON-495831353, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:03Z, description=, dns_domain=, id=a882f5ec-b867-48ca-837a-0ff3e12032b5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-643978840, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=57544, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2234, status=ACTIVE, subnets=['58e95fae-bb88-4e7c-92ac-e1d173306dd0'], tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:06Z, vlan_transparent=None, network_id=a882f5ec-b867-48ca-837a-0ff3e12032b5, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['72ed92b6-af24-4274-854b-a52220405faf'], standard_attr_id=2243, status=DOWN, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:07Z on network a882f5ec-b867-48ca-837a-0ff3e12032b5
Feb 20 09:56:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:08.984 264355 INFO neutron.agent.dhcp.agent [None req-9f19bcf9-e212-43ab-bd66-0cf57b476a5f - - - - - -] DHCP configuration for ports {'7c76b554-ab71-45df-b525-0c267fc92bd2'} is completed
Feb 20 09:56:09 np0005625204.localdomain dnsmasq[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/addn_hosts - 1 addresses
Feb 20 09:56:09 np0005625204.localdomain dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/host
Feb 20 09:56:09 np0005625204.localdomain podman[316407]: 2026-02-20 09:56:09.04516752 +0000 UTC m=+0.047728059 container kill 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:56:09 np0005625204.localdomain dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/opts
Feb 20 09:56:09 np0005625204.localdomain ceph-mon[301857]: pgmap v296: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 25 KiB/s wr, 34 op/s
Feb 20 09:56:09 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:09.298 264355 INFO neutron.agent.dhcp.agent [None req-31b7dcf9-ba3d-48ac-871d-490caaa16b0f - - - - - -] DHCP configuration for ports {'672227c5-4720-4d41-a0d6-6e9f5b528e92'} is completed
Feb 20 09:56:09 np0005625204.localdomain dnsmasq[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/addn_hosts - 0 addresses
Feb 20 09:56:09 np0005625204.localdomain dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/host
Feb 20 09:56:09 np0005625204.localdomain podman[316445]: 2026-02-20 09:56:09.389473976 +0000 UTC m=+0.050081561 container kill 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:09 np0005625204.localdomain dnsmasq-dhcp[316391]: read /var/lib/neutron/dhcp/a882f5ec-b867-48ca-837a-0ff3e12032b5/opts
Feb 20 09:56:09 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:09.493 2 INFO neutron.agent.securitygroups_rpc [None req-166aeacb-5366-40db-a13d-35c7cc5a7a14 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:09.576 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:09.802 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 66774e6a-d590-4cbf-92dc-54d8452fe968 with type ""
Feb 20 09:56:09 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:09Z|00284|binding|INFO|Removing iface tapab7745cf-b0 ovn-installed in OVS
Feb 20 09:56:09 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:09Z|00285|binding|INFO|Removing lport ab7745cf-b091-4696-a4c2-1807d5c5fc66 ovn-installed in OVS
Feb 20 09:56:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:09.803 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a882f5ec-b867-48ca-837a-0ff3e12032b5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b30598c7-87a8-481e-9e41-a7365e7f8781, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ab7745cf-b091-4696-a4c2-1807d5c5fc66) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:09.803 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:09.806 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ab7745cf-b091-4696-a4c2-1807d5c5fc66 in datapath a882f5ec-b867-48ca-837a-0ff3e12032b5 unbound from our chassis
Feb 20 09:56:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:09.810 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a882f5ec-b867-48ca-837a-0ff3e12032b5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:09.811 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:09 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:09.811 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1d75db-2c04-4db3-b628-cec5d28f49ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:09 np0005625204.localdomain dnsmasq[316391]: exiting on receipt of SIGTERM
Feb 20 09:56:09 np0005625204.localdomain podman[316484]: 2026-02-20 09:56:09.820440931 +0000 UTC m=+0.064051264 container kill 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:56:09 np0005625204.localdomain systemd[1]: libpod-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2.scope: Deactivated successfully.
Feb 20 09:56:09 np0005625204.localdomain podman[316498]: 2026-02-20 09:56:09.894413685 +0000 UTC m=+0.057248168 container died 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:56:09 np0005625204.localdomain podman[316498]: 2026-02-20 09:56:09.930660625 +0000 UTC m=+0.093495068 container cleanup 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:56:09 np0005625204.localdomain systemd[1]: libpod-conmon-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2.scope: Deactivated successfully.
Feb 20 09:56:09 np0005625204.localdomain podman[316499]: 2026-02-20 09:56:09.972381741 +0000 UTC m=+0.130222462 container remove 3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-a882f5ec-b867-48ca-837a-0ff3e12032b5, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:56:09 np0005625204.localdomain kernel: device tapab7745cf-b0 left promiscuous mode
Feb 20 09:56:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:09.985 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:09.998 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:56:10 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:10.145 264355 INFO neutron.agent.dhcp.agent [None req-63c1a141-18cb-45d0-a643-47cade11de72 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:10 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:10.151 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:10 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:10Z|00286|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:10.446 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-72d8d3ddae4f030671431c2b14aac6be84d7b25fbbaca5b27db499d762216d2d-merged.mount: Deactivated successfully.
Feb 20 09:56:10 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a8d95f3a056455c6e326809b3e554b649655d1b2b1d9b7ddf8ae65ed60ab9a2-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:10 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2da882f5ec\x2db867\x2d48ca\x2d837a\x2d0ff3e12032b5.mount: Deactivated successfully.
Feb 20 09:56:10 np0005625204.localdomain sshd[316526]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:56:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:11.006 2 INFO neutron.agent.securitygroups_rpc [None req-7e54c9a7-f5f5-46c1-ae1b-688f8acab697 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['46f15231-c0dd-46d4-9abc-adba5985e75b']
Feb 20 09:56:11 np0005625204.localdomain ceph-mon[301857]: pgmap v297: 177 pgs: 177 active+clean; 146 MiB data, 797 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 31 KiB/s wr, 47 op/s
Feb 20 09:56:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ec3b06e0-45bf-4e34-99f4-1c4bd5601e66", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:56:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ec3b06e0-45bf-4e34-99f4-1c4bd5601e66", "format": "json"}]: dispatch
Feb 20 09:56:11 np0005625204.localdomain sshd[316526]: Invalid user claude from 154.91.170.41 port 47934
Feb 20 09:56:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:11 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:11.322 2 INFO neutron.agent.securitygroups_rpc [None req-4b1a20ca-0949-416f-91ae-525739a1e77a f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:11 np0005625204.localdomain sshd[316526]: Received disconnect from 154.91.170.41 port 47934:11: Bye Bye [preauth]
Feb 20 09:56:11 np0005625204.localdomain sshd[316526]: Disconnected from invalid user claude 154.91.170.41 port 47934 [preauth]
Feb 20 09:56:13 np0005625204.localdomain ceph-mon[301857]: pgmap v298: 177 pgs: 177 active+clean; 146 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 20 KiB/s wr, 38 op/s
Feb 20 09:56:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:13.737 2 INFO neutron.agent.securitygroups_rpc [None req-93b1773d-c2eb-4652-8e8d-0c460cd5364e 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['46f15231-c0dd-46d4-9abc-adba5985e75b', '446482cb-8c18-450e-acf7-2fbe583929b8']
Feb 20 09:56:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:13.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:56:14 np0005625204.localdomain podman[316528]: 2026-02-20 09:56:14.148098877 +0000 UTC m=+0.086117493 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "ec3b06e0-45bf-4e34-99f4-1c4bd5601e66", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 20 09:56:14 np0005625204.localdomain podman[316528]: 2026-02-20 09:56:14.162107792 +0000 UTC m=+0.100126378 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 09:56:14 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:56:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:14.358 2 INFO neutron.agent.securitygroups_rpc [None req-6fbfa532-f4c6-42e9-b707-63e0a42ce0d3 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['446482cb-8c18-450e-acf7-2fbe583929b8']
Feb 20 09:56:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:14.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:14 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:14.681 264355 INFO neutron.agent.linux.ip_lib [None req-a9d0852b-246f-45da-9c08-e40cbf2b895a - - - - - -] Device tap50f94ccb-90 cannot be used as it has no MAC address
Feb 20 09:56:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:14.705 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:14 np0005625204.localdomain kernel: device tap50f94ccb-90 entered promiscuous mode
Feb 20 09:56:14 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581374.7146] manager: (tap50f94ccb-90): new Generic device (/org/freedesktop/NetworkManager/Devices/47)
Feb 20 09:56:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:14Z|00287|binding|INFO|Claiming lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f for this chassis.
Feb 20 09:56:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:14.715 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:14Z|00288|binding|INFO|50f94ccb-90f6-40a9-9b2a-8774575ebe1f: Claiming unknown
Feb 20 09:56:14 np0005625204.localdomain systemd-udevd[316558]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:56:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:14.729 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac9b59eb-ef08-4f48-908a-f0e44b4f5714, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=50f94ccb-90f6-40a9-9b2a-8774575ebe1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:14.731 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50f94ccb-90f6-40a9-9b2a-8774575ebe1f in datapath 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543 bound to our chassis
Feb 20 09:56:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:14.734 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:14.735 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[6f1165e7-7bea-47e9-a471-043b8ba8051d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:14Z|00289|binding|INFO|Setting lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f ovn-installed in OVS
Feb 20 09:56:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:14Z|00290|binding|INFO|Setting lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f up in Southbound
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:14.749 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap50f94ccb-90: No such device
Feb 20 09:56:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:14.784 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:14.809 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:15 np0005625204.localdomain dnsmasq[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/addn_hosts - 0 addresses
Feb 20 09:56:15 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/host
Feb 20 09:56:15 np0005625204.localdomain podman[316604]: 2026-02-20 09:56:15.019620998 +0000 UTC m=+0.053907986 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:56:15 np0005625204.localdomain dnsmasq-dhcp[315482]: read /var/lib/neutron/dhcp/cd45d753-df49-44f6-b419-6749d7fe84f3/opts
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.058 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.3 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.060 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.064 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.065 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[82d8b059-092b-43da-986b-58155effc230]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:15 np0005625204.localdomain sshd[316633]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:56:15 np0005625204.localdomain ceph-mon[301857]: pgmap v299: 177 pgs: 177 active+clean; 146 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 19 KiB/s wr, 33 op/s
Feb 20 09:56:15 np0005625204.localdomain kernel: device tap4a535ed2-00 left promiscuous mode
Feb 20 09:56:15 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:15Z|00291|binding|INFO|Releasing lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 from this chassis (sb_readonly=0)
Feb 20 09:56:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:15.294 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:15 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:15Z|00292|binding|INFO|Setting lport 4a535ed2-00be-4ec7-8d9e-24afdab13877 down in Southbound
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.304 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:1::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cd45d753-df49-44f6-b419-6749d7fe84f3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1c44e13adebb4610b7c0cd2fdc62a5b7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d2f3d8db-7c26-4898-8e61-0b8ba04044df, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=4a535ed2-00be-4ec7-8d9e-24afdab13877) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.306 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 4a535ed2-00be-4ec7-8d9e-24afdab13877 in datapath cd45d753-df49-44f6-b419-6749d7fe84f3 unbound from our chassis
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.308 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network cd45d753-df49-44f6-b419-6749d7fe84f3 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:15.308 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:15.310 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[299afe80-b78f-4512-819a-d21e52b6e073]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:15 np0005625204.localdomain podman[316667]: 
Feb 20 09:56:15 np0005625204.localdomain podman[316667]: 2026-02-20 09:56:15.743328365 +0000 UTC m=+0.094330873 container create 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:15 np0005625204.localdomain systemd[1]: Started libpod-conmon-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49.scope.
Feb 20 09:56:15 np0005625204.localdomain podman[316667]: 2026-02-20 09:56:15.699605348 +0000 UTC m=+0.050607866 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:56:15 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:56:15 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/efdc0a52815bc810e6855830fc6f3a2cfc45ff834f8d7edab4a6f93251dafdd5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:56:15 np0005625204.localdomain podman[316667]: 2026-02-20 09:56:15.818777924 +0000 UTC m=+0.169780432 container init 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 09:56:15 np0005625204.localdomain podman[316667]: 2026-02-20 09:56:15.827853079 +0000 UTC m=+0.178855587 container start 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:15 np0005625204.localdomain dnsmasq[316685]: started, version 2.85 cachesize 150
Feb 20 09:56:15 np0005625204.localdomain dnsmasq[316685]: DNS service limited to local subnets
Feb 20 09:56:15 np0005625204.localdomain dnsmasq[316685]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:56:15 np0005625204.localdomain dnsmasq[316685]: warning: no upstream servers configured
Feb 20 09:56:15 np0005625204.localdomain dnsmasq-dhcp[316685]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:56:15 np0005625204.localdomain dnsmasq[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/addn_hosts - 0 addresses
Feb 20 09:56:15 np0005625204.localdomain dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/host
Feb 20 09:56:15 np0005625204.localdomain dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/opts
Feb 20 09:56:16 np0005625204.localdomain sshd[316633]: Received disconnect from 196.189.116.182 port 44696:11: Bye Bye [preauth]
Feb 20 09:56:16 np0005625204.localdomain sshd[316633]: Disconnected from authenticating user root 196.189.116.182 port 44696 [preauth]
Feb 20 09:56:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:16.251 264355 INFO neutron.agent.dhcp.agent [None req-1df8cec4-0e90-4b2f-835e-a8212a043feb - - - - - -] DHCP configuration for ports {'544ef45a-4e36-47a7-a9cf-a372fb626d48'} is completed
Feb 20 09:56:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:16 np0005625204.localdomain dnsmasq[315482]: exiting on receipt of SIGTERM
Feb 20 09:56:16 np0005625204.localdomain podman[316703]: 2026-02-20 09:56:16.577977658 +0000 UTC m=+0.058184397 container kill a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 20 09:56:16 np0005625204.localdomain systemd[1]: libpod-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d.scope: Deactivated successfully.
Feb 20 09:56:16 np0005625204.localdomain podman[316715]: 2026-02-20 09:56:16.653292693 +0000 UTC m=+0.063003393 container died a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:56:16 np0005625204.localdomain podman[316715]: 2026-02-20 09:56:16.689469661 +0000 UTC m=+0.099180311 container cleanup a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:56:16 np0005625204.localdomain systemd[1]: libpod-conmon-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d.scope: Deactivated successfully.
Feb 20 09:56:16 np0005625204.localdomain podman[316717]: 2026-02-20 09:56:16.725479792 +0000 UTC m=+0.124305592 container remove a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-cd45d753-df49-44f6-b419-6749d7fe84f3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 20 09:56:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8a8113fcfd1358b3db8a5596a585afb795b665e94caae01a8c63b1b24d0a203b-merged.mount: Deactivated successfully.
Feb 20 09:56:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4acff58cb70072a0dce839fd0b47ae3b9c815ae278ef5d9dd6674618ac4d20d-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:16.864 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:16Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5875550>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58a2cd0>], id=34719fc8-3f38-4502-8ff2-ba5516ba7226, ip_allocation=immediate, mac_address=fa:16:3e:0a:ae:8f, name=tempest-PortsTestJSON-1975505507, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:10Z, description=, dns_domain=, id=6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1562581203, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=45010, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2264, status=ACTIVE, subnets=['ca6109c7-e6d1-47cb-90b4-650cf67ce94e'], tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:13Z, vlan_transparent=None, network_id=6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2283, status=DOWN, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:16Z on network 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543
Feb 20 09:56:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:16.940 264355 INFO neutron.agent.dhcp.agent [None req-39e942ed-42d1-4cb9-be31-c6659c14a791 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:16 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dcd45d753\x2ddf49\x2d44f6\x2db419\x2d6749d7fe84f3.mount: Deactivated successfully.
Feb 20 09:56:17 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:17.067 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:17 np0005625204.localdomain dnsmasq[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/addn_hosts - 1 addresses
Feb 20 09:56:17 np0005625204.localdomain dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/host
Feb 20 09:56:17 np0005625204.localdomain dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/opts
Feb 20 09:56:17 np0005625204.localdomain podman[316762]: 2026-02-20 09:56:17.084627589 +0000 UTC m=+0.055210176 container kill 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:56:17 np0005625204.localdomain ceph-mon[301857]: pgmap v300: 177 pgs: 177 active+clean; 146 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 26 KiB/s wr, 35 op/s
Feb 20 09:56:17 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:17.344 264355 INFO neutron.agent.dhcp.agent [None req-ca896c1b-0a7e-4005-af16-f5d61cdd11f5 - - - - - -] DHCP configuration for ports {'34719fc8-3f38-4502-8ff2-ba5516ba7226'} is completed
Feb 20 09:56:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:56:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:56:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:56:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158905 "" "Go-http-client/1.1"
Feb 20 09:56:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:56:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19280 "" "Go-http-client/1.1"
Feb 20 09:56:17 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:17.986 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ec3b06e0-45bf-4e34-99f4-1c4bd5601e66", "format": "json"}]: dispatch
Feb 20 09:56:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ec3b06e0-45bf-4e34-99f4-1c4bd5601e66", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:18 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2119990363' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:18 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2119990363' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:18 np0005625204.localdomain dnsmasq[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/addn_hosts - 0 addresses
Feb 20 09:56:18 np0005625204.localdomain dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/host
Feb 20 09:56:18 np0005625204.localdomain dnsmasq-dhcp[316685]: read /var/lib/neutron/dhcp/6a1f8436-ee67-4aeb-90fc-cbe6c39f2543/opts
Feb 20 09:56:18 np0005625204.localdomain podman[316800]: 2026-02-20 09:56:18.372083279 +0000 UTC m=+0.056992250 container kill 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:56:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:18Z|00293|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:18.558 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:18 np0005625204.localdomain dnsmasq[316685]: exiting on receipt of SIGTERM
Feb 20 09:56:18 np0005625204.localdomain podman[316837]: 2026-02-20 09:56:18.800895448 +0000 UTC m=+0.063623540 container kill 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:56:18 np0005625204.localdomain systemd[1]: libpod-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49.scope: Deactivated successfully.
Feb 20 09:56:18 np0005625204.localdomain podman[316849]: 2026-02-20 09:56:18.870793599 +0000 UTC m=+0.057619548 container died 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:56:18 np0005625204.localdomain podman[316849]: 2026-02-20 09:56:18.901166001 +0000 UTC m=+0.087991880 container cleanup 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:18 np0005625204.localdomain systemd[1]: libpod-conmon-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49.scope: Deactivated successfully.
Feb 20 09:56:18 np0005625204.localdomain podman[316851]: 2026-02-20 09:56:18.942499655 +0000 UTC m=+0.121179317 container remove 3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:56:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:18Z|00294|binding|INFO|Releasing lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f from this chassis (sb_readonly=0)
Feb 20 09:56:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:18Z|00295|binding|INFO|Setting lport 50f94ccb-90f6-40a9-9b2a-8774575ebe1f down in Southbound
Feb 20 09:56:18 np0005625204.localdomain kernel: device tap50f94ccb-90 left promiscuous mode
Feb 20 09:56:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:18.994 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.005 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a1f8436-ee67-4aeb-90fc-cbe6c39f2543', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ac9b59eb-ef08-4f48-908a-f0e44b4f5714, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=50f94ccb-90f6-40a9-9b2a-8774575ebe1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.007 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50f94ccb-90f6-40a9-9b2a-8774575ebe1f in datapath 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543 unbound from our chassis
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.011 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a1f8436-ee67-4aeb-90fc-cbe6c39f2543, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.012 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[815c0f17-2257-4968-beb9-7f2b6f683c80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:19.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:19 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:19.154 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:19 np0005625204.localdomain ceph-mon[301857]: pgmap v301: 177 pgs: 177 active+clean; 146 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 13 KiB/s wr, 18 op/s
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.241 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.242 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.246 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:19.247 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab88c3d-4696-4b0c-8c28-ce22290974ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-efdc0a52815bc810e6855830fc6f3a2cfc45ff834f8d7edab4a6f93251dafdd5-merged.mount: Deactivated successfully.
Feb 20 09:56:19 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3aa79602020178d5b8d789e6df510cf25fc687e08824598a3ccfb7339c544b49-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:19 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d6a1f8436\x2dee67\x2d4aeb\x2d90fc\x2dcbe6c39f2543.mount: Deactivated successfully.
Feb 20 09:56:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:19.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:19.585 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:20 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:20.084 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:56:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:20.346 2 INFO neutron.agent.securitygroups_rpc [None req-27e863e6-abb7-4d79-8929-35ee419d3ab5 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:20 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/772936112' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:20 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/772936112' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:20Z|00296|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:20.623 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:20.840 2 INFO neutron.agent.securitygroups_rpc [None req-5bc16860-c455-4be4-9017-f7ba050a5b1d f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:20.942 2 INFO neutron.agent.securitygroups_rpc [None req-3d50957a-c50d-404e-a697-bd588426aa5b 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['9c894fef-e625-4d2d-ad79-9f0215b19661']
Feb 20 09:56:21 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:21.128 2 INFO neutron.agent.securitygroups_rpc [None req-ab2c767f-db90-4059-9416-3c9c50626a18 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:21 np0005625204.localdomain ceph-mon[301857]: pgmap v302: 177 pgs: 177 active+clean; 146 MiB data, 801 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 18 KiB/s wr, 20 op/s
Feb 20 09:56:21 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "15ed10cc-a478-4127-a35f-dcce0e8ec529", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:56:21 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "15ed10cc-a478-4127-a35f-dcce0e8ec529", "format": "json"}]: dispatch
Feb 20 09:56:21 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/772936112' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:21 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/772936112' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:21 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:21.818 2 INFO neutron.agent.securitygroups_rpc [None req-325d197d-f2bb-472d-a6df-be02729b4a1c 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:22 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:22.603 2 INFO neutron.agent.securitygroups_rpc [None req-521cfad5-05c2-4b59-9313-296ec36811c0 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:23 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:23.111 2 INFO neutron.agent.securitygroups_rpc [None req-264095f7-8549-4a1d-9c14-cf140323ad0c 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:23 np0005625204.localdomain ceph-mon[301857]: pgmap v303: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 12 KiB/s wr, 21 op/s
Feb 20 09:56:23 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:23.452 2 INFO neutron.agent.securitygroups_rpc [None req-8e5e38ae-f36c-4a7a-929b-4d665cde8908 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:56:24 np0005625204.localdomain systemd[1]: tmp-crun.pm9isZ.mount: Deactivated successfully.
Feb 20 09:56:24 np0005625204.localdomain podman[316881]: 2026-02-20 09:56:24.153963345 +0000 UTC m=+0.091159966 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:56:24 np0005625204.localdomain podman[316881]: 2026-02-20 09:56:24.161658529 +0000 UTC m=+0.098855110 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:56:24 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:56:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "15ed10cc-a478-4127-a35f-dcce0e8ec529", "format": "json"}]: dispatch
Feb 20 09:56:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "15ed10cc-a478-4127-a35f-dcce0e8ec529", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:24.585 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:24 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:24.666 2 INFO neutron.agent.securitygroups_rpc [None req-7c70538f-1d84-485c-beb6-53999b2ce1d2 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['9c894fef-e625-4d2d-ad79-9f0215b19661', '6e36724b-9ab8-4bfe-9f74-069d82055697', '5fe0aa03-55bd-43ef-a38b-499c4a5e8b30']
Feb 20 09:56:25 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:25.110 2 INFO neutron.agent.securitygroups_rpc [None req-22f50294-5f51-4ab3-8b7c-31c2f02c0d3d 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:25 np0005625204.localdomain ceph-mon[301857]: pgmap v304: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 12 KiB/s wr, 17 op/s
Feb 20 09:56:25 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:25.783 2 INFO neutron.agent.securitygroups_rpc [None req-f3d891d9-b12f-41a6-9c43-2a59a14444d4 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['6e36724b-9ab8-4bfe-9f74-069d82055697', '5fe0aa03-55bd-43ef-a38b-499c4a5e8b30']
Feb 20 09:56:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:56:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:56:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:56:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:56:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:26.619 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 10.100.0.2 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:26.621 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:56:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:26.624 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:26 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:26.625 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[341212f9-c3b3-43e6-bae3-e5d39182f7bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:27 np0005625204.localdomain ceph-mon[301857]: pgmap v305: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 21 KiB/s wr, 32 op/s
Feb 20 09:56:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "da4b4a91-1d53-4396-80dd-4a8521885337", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:56:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "da4b4a91-1d53-4396-80dd-4a8521885337", "format": "json"}]: dispatch
Feb 20 09:56:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:56:27 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:27.553 2 INFO neutron.agent.securitygroups_rpc [None req-a6f56626-9080-4b48-8909-d5cbdaffd977 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:27Z|00297|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:27.954 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:56:28 np0005625204.localdomain podman[316904]: 2026-02-20 09:56:28.144809333 +0000 UTC m=+0.084493375 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:56:28 np0005625204.localdomain podman[316904]: 2026-02-20 09:56:28.156034353 +0000 UTC m=+0.095718405 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:56:28 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:56:28 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:28.580 2 INFO neutron.agent.securitygroups_rpc [None req-6a8272e4-f5a1-42d2-a801-cea63c76a8af f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:28 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:28.890 2 INFO neutron.agent.securitygroups_rpc [None req-222fdb52-3334-45ab-8f45-945b32b8d031 809f6e2027f9442d8bd2b94b11475b17 3a09b3407c3143c8ae0948ccb18c1e61 - - default default] Security group member updated ['b2e5856c-f1df-4bbc-8f9c-41698aa249c6']
Feb 20 09:56:29 np0005625204.localdomain ceph-mon[301857]: pgmap v306: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 15 KiB/s wr, 31 op/s
Feb 20 09:56:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:29.587 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:31 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:31Z|00298|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:56:31 np0005625204.localdomain ceph-mon[301857]: pgmap v307: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 19 KiB/s wr, 32 op/s
Feb 20 09:56:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "da4b4a91-1d53-4396-80dd-4a8521885337", "format": "json"}]: dispatch
Feb 20 09:56:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "da4b4a91-1d53-4396-80dd-4a8521885337", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:31.699 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:31.703 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ce:83:b0 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b4d5592-ecf2-48cc-b3b1-c6ba46f9e5e6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dee4bf28-462f-4e5a-bb37-08fba06228d7) old=Port_Binding(mac=['fa:16:3e:ce:83:b0 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-34dc61c2-2cd5-48a1-a54d-350e15f73770', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:31.704 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dee4bf28-462f-4e5a-bb37-08fba06228d7 in datapath 34dc61c2-2cd5-48a1-a54d-350e15f73770 updated
Feb 20 09:56:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:31.707 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 34dc61c2-2cd5-48a1-a54d-350e15f73770, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:31 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:31.707 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[71260dfc-c0dc-4d76-a89c-4aca0b779f89]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:31 np0005625204.localdomain podman[316928]: 2026-02-20 09:56:31.71633593 +0000 UTC m=+0.449854640 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, maintainer=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9/ubi-minimal, release=1770267347, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 20 09:56:31 np0005625204.localdomain podman[316928]: 2026-02-20 09:56:31.750084414 +0000 UTC m=+0.483603114 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, distribution-scope=public, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 20 09:56:32 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:32.273 2 INFO neutron.agent.securitygroups_rpc [None req-c16a47d9-8c3c-4273-8f35-4d2edcf8a46b f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:32 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:56:32 np0005625204.localdomain dnsmasq[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/addn_hosts - 0 addresses
Feb 20 09:56:32 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/host
Feb 20 09:56:32 np0005625204.localdomain dnsmasq-dhcp[315780]: read /var/lib/neutron/dhcp/7d64da0e-050b-4b53-8861-874f3c3ef083/opts
Feb 20 09:56:32 np0005625204.localdomain podman[316965]: 2026-02-20 09:56:32.366268538 +0000 UTC m=+0.037554731 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 09:56:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:32Z|00299|binding|INFO|Releasing lport 36eed41e-8c87-4e39-8638-1f088c4d480e from this chassis (sb_readonly=0)
Feb 20 09:56:32 np0005625204.localdomain kernel: device tap36eed41e-8c left promiscuous mode
Feb 20 09:56:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:32Z|00300|binding|INFO|Setting lport 36eed41e-8c87-4e39-8638-1f088c4d480e down in Southbound
Feb 20 09:56:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:32.513 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:32.521 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d64da0e-050b-4b53-8861-874f3c3ef083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8aa5b5a34cfe458d96fea87261361db1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=24080264-67e8-4d16-abbb-0767714bc8ff, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=36eed41e-8c87-4e39-8638-1f088c4d480e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:32.522 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 36eed41e-8c87-4e39-8638-1f088c4d480e in datapath 7d64da0e-050b-4b53-8861-874f3c3ef083 unbound from our chassis
Feb 20 09:56:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:32.525 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d64da0e-050b-4b53-8861-874f3c3ef083, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:32.526 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5cf083-242b-4b41-8303-1e074300a68e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:32.535 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:32 np0005625204.localdomain ceph-mon[301857]: pgmap v308: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 14 KiB/s wr, 30 op/s
Feb 20 09:56:32 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3147010759' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:56:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:33.105 2 INFO neutron.agent.securitygroups_rpc [None req-b75820d6-6baf-4494-b7b1-8acd63dcbbd9 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:33.139 2 INFO neutron.agent.securitygroups_rpc [None req-8fddf0ed-4d67-47fd-a98a-ec6a15c12895 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:33 np0005625204.localdomain dnsmasq[315780]: exiting on receipt of SIGTERM
Feb 20 09:56:33 np0005625204.localdomain podman[317006]: 2026-02-20 09:56:33.234800619 +0000 UTC m=+0.057914349 container kill 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:33 np0005625204.localdomain systemd[1]: libpod-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8.scope: Deactivated successfully.
Feb 20 09:56:33 np0005625204.localdomain podman[317018]: 2026-02-20 09:56:33.284087244 +0000 UTC m=+0.039141639 container died 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:33 np0005625204.localdomain podman[317018]: 2026-02-20 09:56:33.314835426 +0000 UTC m=+0.069889791 container cleanup 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:56:33 np0005625204.localdomain systemd[1]: libpod-conmon-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8.scope: Deactivated successfully.
Feb 20 09:56:33 np0005625204.localdomain podman[317020]: 2026-02-20 09:56:33.392679348 +0000 UTC m=+0.138353138 container remove 6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7d64da0e-050b-4b53-8861-874f3c3ef083, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:33 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:33.459 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:33.761 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:33.789 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 20 09:56:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:33.790 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:56:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:33.790 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:56:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:33.814 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.024s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:56:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e153 e153: 6 total, 6 up, 6 in
Feb 20 09:56:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "81bbf3cf-dd05-49ca-a680-c731aa18f72b", "format": "json"}]: dispatch
Feb 20 09:56:33 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:33.858 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:34Z|00301|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:34.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-3e2bd54cd7424e1b7846b66bd6e6847e221583891328681be5cb3316924e217c-merged.mount: Deactivated successfully.
Feb 20 09:56:34 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ba581886ef6e69461e6d1f0e195a9fb2a40218ad194fcea2864e632d46439e8-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:34 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d7d64da0e\x2d050b\x2d4b53\x2d8861\x2d874f3c3ef083.mount: Deactivated successfully.
Feb 20 09:56:34 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:34.439 2 INFO neutron.agent.securitygroups_rpc [None req-ceb6fbf0-e236-46d5-ab31-4b9208acd398 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:34.592 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:34 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e154 e154: 6 total, 6 up, 6 in
Feb 20 09:56:34 np0005625204.localdomain ceph-mon[301857]: pgmap v309: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 14 KiB/s wr, 16 op/s
Feb 20 09:56:34 np0005625204.localdomain ceph-mon[301857]: osdmap e153: 6 total, 6 up, 6 in
Feb 20 09:56:35 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:35.737 2 INFO neutron.agent.securitygroups_rpc [None req-21b5cfcb-ef7f-4dc6-82f5-46fe7ab7fc9a 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:35 np0005625204.localdomain ceph-mon[301857]: osdmap e154: 6 total, 6 up, 6 in
Feb 20 09:56:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e155 e155: 6 total, 6 up, 6 in
Feb 20 09:56:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:36 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:36.670 2 INFO neutron.agent.securitygroups_rpc [None req-5f75f844-b31b-4010-9a93-efcf0b2c4eb8 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:36 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:36.894 2 INFO neutron.agent.securitygroups_rpc [None req-a5e98a26-c124-4fdc-9abc-b12558eae8ef f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:36 np0005625204.localdomain ceph-mon[301857]: pgmap v312: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 4.6 KiB/s rd, 18 KiB/s wr, 11 op/s
Feb 20 09:56:36 np0005625204.localdomain ceph-mon[301857]: osdmap e155: 6 total, 6 up, 6 in
Feb 20 09:56:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e156 e156: 6 total, 6 up, 6 in
Feb 20 09:56:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:56:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:56:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:37.037 264355 INFO neutron.agent.linux.ip_lib [None req-3a002da4-7f7b-4ddd-b6b6-5656271a924c - - - - - -] Device tapc5a7b9ef-7d cannot be used as it has no MAC address
Feb 20 09:56:37 np0005625204.localdomain podman[317049]: 2026-02-20 09:56:37.062873769 +0000 UTC m=+0.099844341 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:56:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:37.066 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:37 np0005625204.localdomain kernel: device tapc5a7b9ef-7d entered promiscuous mode
Feb 20 09:56:37 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581397.0755] manager: (tapc5a7b9ef-7d): new Generic device (/org/freedesktop/NetworkManager/Devices/48)
Feb 20 09:56:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:37.076 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:37Z|00302|binding|INFO|Claiming lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce for this chassis.
Feb 20 09:56:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:37Z|00303|binding|INFO|c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce: Claiming unknown
Feb 20 09:56:37 np0005625204.localdomain systemd-udevd[317082]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:56:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:37.093 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8616b81b-c719-43b5-be2c-dbf68397c33b, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:37.097 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce in datapath 91a3c914-50af-4619-8f46-93ff66e8b045 bound to our chassis
Feb 20 09:56:37 np0005625204.localdomain podman[317049]: 2026-02-20 09:56:37.097889961 +0000 UTC m=+0.134860533 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 20 09:56:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:37.099 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 91a3c914-50af-4619-8f46-93ff66e8b045 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:37.100 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[eba41a9e-94bf-4600-a19f-433c96983724]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:37Z|00304|binding|INFO|Setting lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce ovn-installed in OVS
Feb 20 09:56:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:37Z|00305|binding|INFO|Setting lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce up in Southbound
Feb 20 09:56:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:37.121 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:37 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:56:37 np0005625204.localdomain podman[317048]: 2026-02-20 09:56:37.130221352 +0000 UTC m=+0.174850516 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:56:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:37.153 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:37 np0005625204.localdomain podman[317048]: 2026-02-20 09:56:37.161390757 +0000 UTC m=+0.206019871 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:56:37 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:56:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:37.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:37 np0005625204.localdomain systemd[1]: tmp-crun.4OAisw.mount: Deactivated successfully.
Feb 20 09:56:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "81bbf3cf-dd05-49ca-a680-c731aa18f72b_083f2f39-54db-4760-baba-9aefd6c5b6fc", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "81bbf3cf-dd05-49ca-a680-c731aa18f72b", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:37 np0005625204.localdomain ceph-mon[301857]: osdmap e156: 6 total, 6 up, 6 in
Feb 20 09:56:37 np0005625204.localdomain podman[317153]: 
Feb 20 09:56:37 np0005625204.localdomain podman[317153]: 2026-02-20 09:56:37.989732129 +0000 UTC m=+0.088056612 container create 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 09:56:38 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:38.020 2 INFO neutron.agent.securitygroups_rpc [None req-a827ab5a-214a-4a1d-a84d-cac050b991d6 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6.scope.
Feb 20 09:56:38 np0005625204.localdomain podman[317153]: 2026-02-20 09:56:37.947404845 +0000 UTC m=+0.045729328 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:56:38 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:56:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fd26698bf47eaf748148ca73e0ba4205722c28c51e908d2f97ffc72f55d6d626/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:56:38 np0005625204.localdomain podman[317153]: 2026-02-20 09:56:38.06625417 +0000 UTC m=+0.164578653 container init 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 20 09:56:38 np0005625204.localdomain podman[317153]: 2026-02-20 09:56:38.072133488 +0000 UTC m=+0.170457971 container start 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:56:38 np0005625204.localdomain dnsmasq[317172]: started, version 2.85 cachesize 150
Feb 20 09:56:38 np0005625204.localdomain dnsmasq[317172]: DNS service limited to local subnets
Feb 20 09:56:38 np0005625204.localdomain dnsmasq[317172]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:56:38 np0005625204.localdomain dnsmasq[317172]: warning: no upstream servers configured
Feb 20 09:56:38 np0005625204.localdomain dnsmasq-dhcp[317172]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:56:38 np0005625204.localdomain dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 0 addresses
Feb 20 09:56:38 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host
Feb 20 09:56:38 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts
Feb 20 09:56:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:38.219 264355 INFO neutron.agent.dhcp.agent [None req-7b3d08e0-3d5a-4399-9db6-3150dc7d1478 - - - - - -] DHCP configuration for ports {'fa0abef2-8ae8-40ee-a86b-a1b7596c8d71'} is completed
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2694845083' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2694845083' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: pgmap v315: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 8.7 KiB/s rd, 23 KiB/s wr, 18 op/s
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f9ac42b7-680c-41fc-8784-6176baa738f7", "format": "json"}]: dispatch
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2694845083' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2694845083' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:38 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e157 e157: 6 total, 6 up, 6 in
Feb 20 09:56:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:39.620 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:56:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 2206 writes, 22K keys, 2206 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s
                                                           Cumulative WAL: 2206 writes, 2206 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2206 writes, 22K keys, 2206 commit groups, 1.0 writes per commit group, ingest: 39.05 MB, 0.07 MB/s
                                                           Interval WAL: 2206 writes, 2206 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    183.9      0.13              0.06         7    0.019       0      0       0.0       0.0
                                                             L6      1/0   15.18 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.1    223.9    204.4      0.49              0.28         6    0.082     74K   2880       0.0       0.0
                                                            Sum      1/0   15.18 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.1    175.7    200.0      0.63              0.34        13    0.048     74K   2880       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   5.1    176.3    200.6      0.62              0.34        12    0.052     74K   2880       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    223.9    204.4      0.49              0.28         6    0.082     74K   2880       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    186.7      0.13              0.06         6    0.022       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.024, interval 0.024
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.12 GB write, 0.21 MB/s write, 0.11 GB read, 0.18 MB/s read, 0.6 seconds
                                                           Interval compaction: 0.12 GB write, 0.21 MB/s write, 0.11 GB read, 0.18 MB/s read, 0.6 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x559a1eac51f0#2 capacity: 308.00 MB usage: 11.04 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000111 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(564,10.51 MB,3.41339%) FilterBlock(13,234.23 KB,0.0742677%) IndexBlock(13,300.55 KB,0.0952931%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 20 09:56:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e158 e158: 6 total, 6 up, 6 in
Feb 20 09:56:39 np0005625204.localdomain ceph-mon[301857]: osdmap e157: 6 total, 6 up, 6 in
Feb 20 09:56:40 np0005625204.localdomain sudo[317173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:56:40 np0005625204.localdomain sudo[317173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:56:40 np0005625204.localdomain sudo[317173]: pam_unix(sudo:session): session closed for user root
Feb 20 09:56:40 np0005625204.localdomain sudo[317191]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:56:40 np0005625204.localdomain sudo[317191]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:56:41 np0005625204.localdomain ceph-mon[301857]: pgmap v317: 177 pgs: 177 active+clean; 146 MiB data, 802 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 16 KiB/s wr, 65 op/s
Feb 20 09:56:41 np0005625204.localdomain ceph-mon[301857]: osdmap e158: 6 total, 6 up, 6 in
Feb 20 09:56:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3055324932' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3055324932' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e158 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:41 np0005625204.localdomain sudo[317191]: pam_unix(sudo:session): session closed for user root
Feb 20 09:56:41 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:41.714 2 INFO neutron.agent.securitygroups_rpc [None req-13f83c28-0ec5-483d-8133-f11a853f0aba f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:41 np0005625204.localdomain sudo[317240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:56:41 np0005625204.localdomain sudo[317240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:56:41 np0005625204.localdomain sudo[317240]: pam_unix(sudo:session): session closed for user root
Feb 20 09:56:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:41.983 264355 INFO neutron.agent.linux.ip_lib [None req-23b58b6a-bbd0-4fc6-bbb9-bac7e4963005 - - - - - -] Device tape4daede4-dd cannot be used as it has no MAC address
Feb 20 09:56:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:42.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:42 np0005625204.localdomain kernel: device tape4daede4-dd entered promiscuous mode
Feb 20 09:56:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:56:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:56:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:56:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:56:42 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581402.0248] manager: (tape4daede4-dd): new Generic device (/org/freedesktop/NetworkManager/Devices/49)
Feb 20 09:56:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:42.024 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:42Z|00306|binding|INFO|Claiming lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a for this chassis.
Feb 20 09:56:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:42Z|00307|binding|INFO|e4daede4-dda0-4eeb-801e-ab2b266b4f0a: Claiming unknown
Feb 20 09:56:42 np0005625204.localdomain systemd-udevd[317268]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:56:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:42.039 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aac0e58-10ca-4f2e-89c4-11cc34f042d9, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e4daede4-dda0-4eeb-801e-ab2b266b4f0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:42.041 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e4daede4-dda0-4eeb-801e-ab2b266b4f0a in datapath 183b90c2-0ae0-467a-8a71-cbddda06cd4d bound to our chassis
Feb 20 09:56:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:42.043 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 183b90c2-0ae0-467a-8a71-cbddda06cd4d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:42 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:42.043 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc5127f-a2e8-4b2a-a121-102087a9e801]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:42Z|00308|binding|INFO|Setting lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a ovn-installed in OVS
Feb 20 09:56:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:42Z|00309|binding|INFO|Setting lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a up in Southbound
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:42.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tape4daede4-dd: No such device
Feb 20 09:56:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:42.101 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:42.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:42 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:42.613 2 INFO neutron.agent.securitygroups_rpc [None req-92a7b9d6-6b07-465f-9755-118a416fc381 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:42 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e159 e159: 6 total, 6 up, 6 in
Feb 20 09:56:42 np0005625204.localdomain podman[317339]: 
Feb 20 09:56:42 np0005625204.localdomain podman[317339]: 2026-02-20 09:56:42.956236915 +0000 UTC m=+0.093607232 container create 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:56:42 np0005625204.localdomain systemd[1]: Started libpod-conmon-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358.scope.
Feb 20 09:56:43 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:56:43 np0005625204.localdomain podman[317339]: 2026-02-20 09:56:42.912806327 +0000 UTC m=+0.050176694 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:56:43 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83bff4229e001c3857e1bb62e3030a2db0bdff95dede477c807197129286ef30/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:56:43 np0005625204.localdomain podman[317339]: 2026-02-20 09:56:43.024143485 +0000 UTC m=+0.161513802 container init 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Feb 20 09:56:43 np0005625204.localdomain ceph-mon[301857]: pgmap v319: 177 pgs: 177 active+clean; 146 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 94 KiB/s rd, 16 KiB/s wr, 132 op/s
Feb 20 09:56:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f9ac42b7-680c-41fc-8784-6176baa738f7", "format": "json"}]: dispatch
Feb 20 09:56:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:56:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "8b7ef8b3-43af-41f4-8a24-ef7ee4fe0934", "format": "json"}]: dispatch
Feb 20 09:56:43 np0005625204.localdomain ceph-mon[301857]: osdmap e159: 6 total, 6 up, 6 in
Feb 20 09:56:43 np0005625204.localdomain podman[317339]: 2026-02-20 09:56:43.037471779 +0000 UTC m=+0.174842086 container start 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:43 np0005625204.localdomain dnsmasq[317358]: started, version 2.85 cachesize 150
Feb 20 09:56:43 np0005625204.localdomain dnsmasq[317358]: DNS service limited to local subnets
Feb 20 09:56:43 np0005625204.localdomain dnsmasq[317358]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:56:43 np0005625204.localdomain dnsmasq[317358]: warning: no upstream servers configured
Feb 20 09:56:43 np0005625204.localdomain dnsmasq-dhcp[317358]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 20 09:56:43 np0005625204.localdomain dnsmasq[317358]: read /var/lib/neutron/dhcp/183b90c2-0ae0-467a-8a71-cbddda06cd4d/addn_hosts - 0 addresses
Feb 20 09:56:43 np0005625204.localdomain dnsmasq-dhcp[317358]: read /var/lib/neutron/dhcp/183b90c2-0ae0-467a-8a71-cbddda06cd4d/host
Feb 20 09:56:43 np0005625204.localdomain dnsmasq-dhcp[317358]: read /var/lib/neutron/dhcp/183b90c2-0ae0-467a-8a71-cbddda06cd4d/opts
Feb 20 09:56:43 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:43.185 264355 INFO neutron.agent.dhcp.agent [None req-eae2d8f0-ed48-45d9-8e7b-4c2051db44a7 - - - - - -] DHCP configuration for ports {'cdd8a163-912b-4c7f-8799-245420141a50'} is completed
Feb 20 09:56:44 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:44.137 2 INFO neutron.agent.securitygroups_rpc [None req-ec1ba1f0-724c-41a9-85b6-8188470faaf7 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:44 np0005625204.localdomain ceph-mon[301857]: pgmap v321: 177 pgs: 177 active+clean; 146 MiB data, 820 MiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 15 KiB/s wr, 123 op/s
Feb 20 09:56:44 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:56:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:44.653 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:44 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:44.883 2 INFO neutron.agent.securitygroups_rpc [None req-868e4387-1930-45d7-9199-5bcd1f2558e0 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:56:45 np0005625204.localdomain podman[317359]: 2026-02-20 09:56:45.159412665 +0000 UTC m=+0.090490606 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 09:56:45 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:45.170 2 INFO neutron.agent.securitygroups_rpc [None req-d92a2777-d32e-4211-954b-8d8918f6f596 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:45 np0005625204.localdomain podman[317359]: 2026-02-20 09:56:45.172847783 +0000 UTC m=+0.103925754 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:45 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:56:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:46 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:46.478 2 INFO neutron.agent.securitygroups_rpc [None req-d647a860-3cfb-47b1-bd0e-3817969b125e f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:46 np0005625204.localdomain ceph-mon[301857]: pgmap v322: 177 pgs: 177 active+clean; 146 MiB data, 821 MiB used, 41 GiB / 42 GiB avail; 1.3 MiB/s rd, 26 KiB/s wr, 158 op/s
Feb 20 09:56:46 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "877bc86e-3ae3-4788-a4cd-ecc7ec5a94e2", "format": "json"}]: dispatch
Feb 20 09:56:46 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:46.998 2 INFO neutron.agent.securitygroups_rpc [None req-5e0be3b9-12ef-421f-8325-abea826190b6 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:47 np0005625204.localdomain sshd[317378]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:56:47 np0005625204.localdomain sshd[317378]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4040843744' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4040843744' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:47 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:47.572 2 INFO neutron.agent.securitygroups_rpc [None req-c2e08264-5759-4dbe-9f11-1020e63a5df8 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f9ac42b7-680c-41fc-8784-6176baa738f7", "format": "json"}]: dispatch
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f9ac42b7-680c-41fc-8784-6176baa738f7", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4040843744' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4040843744' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:56:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:56:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:56:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158891 "" "Go-http-client/1.1"
Feb 20 09:56:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:56:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19287 "" "Go-http-client/1.1"
Feb 20 09:56:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e160 e160: 6 total, 6 up, 6 in
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/10291474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/10291474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: pgmap v323: 177 pgs: 177 active+clean; 146 MiB data, 821 MiB used, 41 GiB / 42 GiB avail; 1.1 MiB/s rd, 12 KiB/s wr, 91 op/s
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: osdmap e160: 6 total, 6 up, 6 in
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/10291474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/10291474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:49.655 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:56:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:49.661 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:49.728 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:49Z, description=, device_id=1d3f0830-6050-4ebf-baaf-b9d8b4a1ed67, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58c5ca0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5891370>], id=2ba0f911-9cd1-4881-a954-6bc0829ed2ed, ip_allocation=immediate, mac_address=fa:16:3e:c2:1e:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:34Z, description=, dns_domain=, id=91a3c914-50af-4619-8f46-93ff66e8b045, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1956959081, port_security_enabled=True, project_id=62f842a102bd4d84b1f4d275ec6dbea2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2340, status=ACTIVE, subnets=['eb9e42c2-7344-4772-9e65-da3068c65904'], tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:35Z, vlan_transparent=None, network_id=91a3c914-50af-4619-8f46-93ff66e8b045, port_security_enabled=False, project_id=62f842a102bd4d84b1f4d275ec6dbea2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2392, status=DOWN, tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:49Z on network 91a3c914-50af-4619-8f46-93ff66e8b045
Feb 20 09:56:49 np0005625204.localdomain dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 1 addresses
Feb 20 09:56:49 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host
Feb 20 09:56:49 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts
Feb 20 09:56:49 np0005625204.localdomain podman[317397]: 2026-02-20 09:56:49.944442899 +0000 UTC m=+0.066911751 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:56:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:50.158 264355 INFO neutron.agent.dhcp.agent [None req-f02725fb-8091-4836-abfa-579e6ca5e183 - - - - - -] DHCP configuration for ports {'2ba0f911-9cd1-4881-a954-6bc0829ed2ed'} is completed
Feb 20 09:56:50 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:50.722 2 INFO neutron.agent.securitygroups_rpc [None req-6b7b9439-974f-45e5-a614-5a8be0850c72 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "f96ea30b-5993-4393-8f64-efd08707fd5f", "format": "json"}]: dispatch
Feb 20 09:56:50 np0005625204.localdomain ceph-mon[301857]: pgmap v325: 177 pgs: 177 active+clean; 167 MiB data, 825 MiB used, 41 GiB / 42 GiB avail; 3.1 MiB/s rd, 892 KiB/s wr, 95 op/s
Feb 20 09:56:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "snap_name": "da20d4b5-2abc-49bc-a78c-39ce3cdadf16_977615dd-c728-40c5-bc37-c455d4274398", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "snap_name": "da20d4b5-2abc-49bc-a78c-39ce3cdadf16", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:51.034 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:49Z, description=, device_id=1d3f0830-6050-4ebf-baaf-b9d8b4a1ed67, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a00a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a00730>], id=2ba0f911-9cd1-4881-a954-6bc0829ed2ed, ip_allocation=immediate, mac_address=fa:16:3e:c2:1e:8c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:34Z, description=, dns_domain=, id=91a3c914-50af-4619-8f46-93ff66e8b045, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1956959081, port_security_enabled=True, project_id=62f842a102bd4d84b1f4d275ec6dbea2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=63121, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2340, status=ACTIVE, subnets=['eb9e42c2-7344-4772-9e65-da3068c65904'], tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:35Z, vlan_transparent=None, network_id=91a3c914-50af-4619-8f46-93ff66e8b045, port_security_enabled=False, project_id=62f842a102bd4d84b1f4d275ec6dbea2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2392, status=DOWN, tags=[], tenant_id=62f842a102bd4d84b1f4d275ec6dbea2, updated_at=2026-02-20T09:56:49Z on network 91a3c914-50af-4619-8f46-93ff66e8b045
Feb 20 09:56:51 np0005625204.localdomain dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 1 addresses
Feb 20 09:56:51 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host
Feb 20 09:56:51 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts
Feb 20 09:56:51 np0005625204.localdomain podman[317435]: 2026-02-20 09:56:51.233162098 +0000 UTC m=+0.063684052 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:51 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:51.307 2 INFO neutron.agent.securitygroups_rpc [None req-780282fe-fede-41a8-980f-26511e126244 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:51.500 264355 INFO neutron.agent.dhcp.agent [None req-124ef94c-65bc-4ecf-a0fc-1a6c75e8ae82 - - - - - -] DHCP configuration for ports {'2ba0f911-9cd1-4881-a954-6bc0829ed2ed'} is completed
Feb 20 09:56:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:51.547 264355 INFO neutron.agent.linux.ip_lib [None req-e7815bc1-0006-47f0-bd78-5a13b3bcf1e5 - - - - - -] Device tapc3226333-1e cannot be used as it has no MAC address
Feb 20 09:56:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:51.568 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:51 np0005625204.localdomain kernel: device tapc3226333-1e entered promiscuous mode
Feb 20 09:56:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:51Z|00310|binding|INFO|Claiming lport c3226333-1e26-4492-9c87-0c0e84626249 for this chassis.
Feb 20 09:56:51 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581411.5768] manager: (tapc3226333-1e): new Generic device (/org/freedesktop/NetworkManager/Devices/50)
Feb 20 09:56:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:51Z|00311|binding|INFO|c3226333-1e26-4492-9c87-0c0e84626249: Claiming unknown
Feb 20 09:56:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:51.578 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:51 np0005625204.localdomain systemd-udevd[317466]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:56:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:51.587 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfec81f2-09be-417a-b2dd-d08c91c1c606, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c3226333-1e26-4492-9c87-0c0e84626249) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:51.590 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c3226333-1e26-4492-9c87-0c0e84626249 in datapath 623d6533-b3ff-446c-abe7-d04d15b2cb53 bound to our chassis
Feb 20 09:56:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:51.592 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 623d6533-b3ff-446c-abe7-d04d15b2cb53 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:51.594 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d35a9f0a-a1b4-427c-a133-faca84670ace]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:51.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:51Z|00312|binding|INFO|Setting lport c3226333-1e26-4492-9c87-0c0e84626249 ovn-installed in OVS
Feb 20 09:56:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:51Z|00313|binding|INFO|Setting lport c3226333-1e26-4492-9c87-0c0e84626249 up in Southbound
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:51.615 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc3226333-1e: No such device
Feb 20 09:56:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:51.649 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:51.676 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e161 e161: 6 total, 6 up, 6 in
Feb 20 09:56:52 np0005625204.localdomain dnsmasq[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/addn_hosts - 0 addresses
Feb 20 09:56:52 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/host
Feb 20 09:56:52 np0005625204.localdomain dnsmasq-dhcp[317172]: read /var/lib/neutron/dhcp/91a3c914-50af-4619-8f46-93ff66e8b045/opts
Feb 20 09:56:52 np0005625204.localdomain podman[317528]: 2026-02-20 09:56:52.227549437 +0000 UTC m=+0.063335772 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:56:52 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:52Z|00314|binding|INFO|Releasing lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce from this chassis (sb_readonly=0)
Feb 20 09:56:52 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:52Z|00315|binding|INFO|Setting lport c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce down in Southbound
Feb 20 09:56:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:52.369 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:52 np0005625204.localdomain kernel: device tapc5a7b9ef-7d left promiscuous mode
Feb 20 09:56:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:52.377 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-91a3c914-50af-4619-8f46-93ff66e8b045', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8616b81b-c719-43b5-be2c-dbf68397c33b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:52.379 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c5a7b9ef-7d0c-44fe-bc37-50b465ea21ce in datapath 91a3c914-50af-4619-8f46-93ff66e8b045 unbound from our chassis
Feb 20 09:56:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:52.382 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 91a3c914-50af-4619-8f46-93ff66e8b045 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:52 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:52.382 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[12814e8a-cbce-4efa-9ce4-668fdd8a0ac3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:52.386 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:52 np0005625204.localdomain podman[317575]: 
Feb 20 09:56:52 np0005625204.localdomain podman[317575]: 2026-02-20 09:56:52.619098567 +0000 UTC m=+0.093329823 container create 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:56:52 np0005625204.localdomain systemd[1]: Started libpod-conmon-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196.scope.
Feb 20 09:56:52 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:56:52 np0005625204.localdomain podman[317575]: 2026-02-20 09:56:52.578134054 +0000 UTC m=+0.052365330 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:56:52 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/757a2b6f79ab43144d1e25f65f21cd12cf34ce98c88daf04e540abccae414274/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:56:52 np0005625204.localdomain podman[317575]: 2026-02-20 09:56:52.694114083 +0000 UTC m=+0.168345349 container init 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:56:52 np0005625204.localdomain podman[317575]: 2026-02-20 09:56:52.703812456 +0000 UTC m=+0.178043722 container start 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:56:52 np0005625204.localdomain dnsmasq[317594]: started, version 2.85 cachesize 150
Feb 20 09:56:52 np0005625204.localdomain dnsmasq[317594]: DNS service limited to local subnets
Feb 20 09:56:52 np0005625204.localdomain dnsmasq[317594]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:56:52 np0005625204.localdomain dnsmasq[317594]: warning: no upstream servers configured
Feb 20 09:56:52 np0005625204.localdomain dnsmasq-dhcp[317594]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:56:52 np0005625204.localdomain dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 0 addresses
Feb 20 09:56:52 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host
Feb 20 09:56:52 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts
Feb 20 09:56:52 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:52.873 264355 INFO neutron.agent.dhcp.agent [None req-a2c35da7-7920-406e-afba-a2c8c673ae73 - - - - - -] DHCP configuration for ports {'435cdbe8-09f2-4cba-8ebb-dfddbdfafca4'} is completed
Feb 20 09:56:52 np0005625204.localdomain ceph-mon[301857]: pgmap v326: 177 pgs: 177 active+clean; 203 MiB data, 859 MiB used, 41 GiB / 42 GiB avail; 4.8 MiB/s rd, 2.6 MiB/s wr, 146 op/s
Feb 20 09:56:52 np0005625204.localdomain ceph-mon[301857]: osdmap e161: 6 total, 6 up, 6 in
Feb 20 09:56:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4023810761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4023810761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:53 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:53.862 2 INFO neutron.agent.securitygroups_rpc [None req-db944330-da94-4d69-be05-7c7a8491a44e 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "2b637ee9-db13-447d-b623-b978babd5cfe", "format": "json"}]: dispatch
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "format": "json"}]: dispatch
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5d6cbe8b-e61d-44af-be13-78bc30a91a96", "force": true, "format": "json"}]: dispatch
Feb 20 09:56:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:53.924 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:53Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5868610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58680d0>], id=4fdf0c37-625e-4dd0-9c79-90b44fc71337, ip_allocation=immediate, mac_address=fa:16:3e:40:b5:20, name=tempest-PortsTestJSON-1423588587, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:56:49Z, description=, dns_domain=, id=623d6533-b3ff-446c-abe7-d04d15b2cb53, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-1760858149, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=65333, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2389, status=ACTIVE, subnets=['0edddd91-f1e3-43bf-aadf-6a42a8b0af87'], tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:50Z, vlan_transparent=None, network_id=623d6533-b3ff-446c-abe7-d04d15b2cb53, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['72ed92b6-af24-4274-854b-a52220405faf'], standard_attr_id=2400, status=DOWN, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:53Z on network 623d6533-b3ff-446c-abe7-d04d15b2cb53
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.932144) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581413932251, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2683, "num_deletes": 265, "total_data_size": 5071826, "memory_usage": 5129296, "flush_reason": "Manual Compaction"}
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581413947334, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 3309686, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20100, "largest_seqno": 22778, "table_properties": {"data_size": 3299048, "index_size": 6823, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 23780, "raw_average_key_size": 22, "raw_value_size": 3277368, "raw_average_value_size": 3043, "num_data_blocks": 288, "num_entries": 1077, "num_filter_entries": 1077, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581267, "oldest_key_time": 1771581267, "file_creation_time": 1771581413, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 15234 microseconds, and 8046 cpu microseconds.
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.947388) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 3309686 bytes OK
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.947422) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.949333) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.949358) EVENT_LOG_v1 {"time_micros": 1771581413949351, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.949381) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 5059584, prev total WAL file size 5059584, number of live WAL files 2.
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.950617) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(3232KB)], [30(15MB)]
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581413950693, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 19223658, "oldest_snapshot_seqno": -1}
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2065491436' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2065491436' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12731 keys, 17974446 bytes, temperature: kUnknown
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581414031074, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 17974446, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17900785, "index_size": 40728, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31877, "raw_key_size": 340497, "raw_average_key_size": 26, "raw_value_size": 17683119, "raw_average_value_size": 1388, "num_data_blocks": 1552, "num_entries": 12731, "num_filter_entries": 12731, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581413, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.031738) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 17974446 bytes
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.033543) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.4 rd, 222.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 15.2 +0.0 blob) out(17.1 +0.0 blob), read-write-amplify(11.2) write-amplify(5.4) OK, records in: 13279, records dropped: 548 output_compression: NoCompression
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.033584) EVENT_LOG_v1 {"time_micros": 1771581414033563, "job": 16, "event": "compaction_finished", "compaction_time_micros": 80640, "compaction_time_cpu_micros": 46683, "output_level": 6, "num_output_files": 1, "total_output_size": 17974446, "num_input_records": 13279, "num_output_records": 12731, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581414034413, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581414037855, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:53.950554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037975) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037983) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037986) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:56:54.037992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:56:54 np0005625204.localdomain dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 1 addresses
Feb 20 09:56:54 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host
Feb 20 09:56:54 np0005625204.localdomain podman[317626]: 2026-02-20 09:56:54.216672445 +0000 UTC m=+0.051658068 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:54 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: tmp-crun.GoAI4T.mount: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain sshd[317652]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:56:54 np0005625204.localdomain dnsmasq[317358]: exiting on receipt of SIGTERM
Feb 20 09:56:54 np0005625204.localdomain podman[317640]: 2026-02-20 09:56:54.281821402 +0000 UTC m=+0.082857765 container kill 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: libpod-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358.scope: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain podman[317650]: 2026-02-20 09:56:54.325100435 +0000 UTC m=+0.083020930 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:56:54 np0005625204.localdomain podman[317667]: 2026-02-20 09:56:54.342046929 +0000 UTC m=+0.049842173 container died 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 09:56:54 np0005625204.localdomain podman[317650]: 2026-02-20 09:56:54.364993415 +0000 UTC m=+0.122913910 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.416 264355 INFO neutron.agent.dhcp.agent [None req-720253c6-c001-445f-9718-c34db24c811b - - - - - -] DHCP configuration for ports {'4fdf0c37-625e-4dd0-9c79-90b44fc71337'} is completed
Feb 20 09:56:54 np0005625204.localdomain podman[317667]: 2026-02-20 09:56:54.419549911 +0000 UTC m=+0.127345145 container cleanup 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: libpod-conmon-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358.scope: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain podman[317674]: 2026-02-20 09:56:54.443211868 +0000 UTC m=+0.128653484 container remove 689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-183b90c2-0ae0-467a-8a71-cbddda06cd4d, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:54 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:54Z|00316|binding|INFO|Releasing lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a from this chassis (sb_readonly=0)
Feb 20 09:56:54 np0005625204.localdomain kernel: device tape4daede4-dd left promiscuous mode
Feb 20 09:56:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:54.494 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:54 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:54Z|00317|binding|INFO|Setting lport e4daede4-dda0-4eeb-801e-ab2b266b4f0a down in Southbound
Feb 20 09:56:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:54.505 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-183b90c2-0ae0-467a-8a71-cbddda06cd4d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62f842a102bd4d84b1f4d275ec6dbea2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6aac0e58-10ca-4f2e-89c4-11cc34f042d9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=e4daede4-dda0-4eeb-801e-ab2b266b4f0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:54.507 162652 INFO neutron.agent.ovn.metadata.agent [-] Port e4daede4-dda0-4eeb-801e-ab2b266b4f0a in datapath 183b90c2-0ae0-467a-8a71-cbddda06cd4d unbound from our chassis
Feb 20 09:56:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:54.509 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 183b90c2-0ae0-467a-8a71-cbddda06cd4d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:56:54 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:54.510 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[80d90cd9-8f17-4af2-94b9-c5989b332cdc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:54.517 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-83bff4229e001c3857e1bb62e3030a2db0bdff95dede477c807197129286ef30-merged.mount: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-689a867c542b8c9c11cb75ab1578695d891af76606ca698cd5e0a8a613a54358-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.650 264355 INFO neutron.agent.dhcp.agent [None req-c51a5fd7-7025-4892-82c1-b63078fab0b3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.650 264355 INFO neutron.agent.dhcp.agent [None req-c51a5fd7-7025-4892-82c1-b63078fab0b3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:54 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d183b90c2\x2d0ae0\x2d467a\x2d8a71\x2dcbddda06cd4d.mount: Deactivated successfully.
Feb 20 09:56:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:54.658 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:54.662 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:54 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:54.771 2 INFO neutron.agent.securitygroups_rpc [None req-2eacea7e-ffc4-4411-bf47-38f06768c1ff f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:54 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:54.897 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e162 e162: 6 total, 6 up, 6 in
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: pgmap v328: 177 pgs: 177 active+clean; 203 MiB data, 859 MiB used, 41 GiB / 42 GiB avail; 4.2 MiB/s rd, 2.8 MiB/s wr, 121 op/s
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2065491436' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2065491436' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:55 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:55.097 2 INFO neutron.agent.securitygroups_rpc [None req-fc9a7b92-030e-427f-b2ca-50d327cf6718 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:55 np0005625204.localdomain sshd[317652]: Received disconnect from 86.99.116.54 port 40218:11: Bye Bye [preauth]
Feb 20 09:56:55 np0005625204.localdomain sshd[317652]: Disconnected from authenticating user root 86.99.116.54 port 40218 [preauth]
Feb 20 09:56:55 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:55Z|00318|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:56:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:55.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e163 e163: 6 total, 6 up, 6 in
Feb 20 09:56:55 np0005625204.localdomain ceph-mon[301857]: osdmap e162: 6 total, 6 up, 6 in
Feb 20 09:56:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1035388618' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1035388618' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:55 np0005625204.localdomain ceph-mon[301857]: osdmap e163: 6 total, 6 up, 6 in
Feb 20 09:56:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:56:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:56.345 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:56:53Z, description=, device_id=ac59bc6c-a14a-43db-804a-739accb55b95, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62b46a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62b4b80>], id=4fdf0c37-625e-4dd0-9c79-90b44fc71337, ip_allocation=immediate, mac_address=fa:16:3e:40:b5:20, name=tempest-PortsTestJSON-1423588587, network_id=623d6533-b3ff-446c-abe7-d04d15b2cb53, port_security_enabled=True, project_id=62dd1d7e0cf547678612304aba2895e2, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=3, security_groups=['72ed92b6-af24-4274-854b-a52220405faf'], standard_attr_id=2400, status=ACTIVE, tags=[], tenant_id=62dd1d7e0cf547678612304aba2895e2, updated_at=2026-02-20T09:56:54Z on network 623d6533-b3ff-446c-abe7-d04d15b2cb53
Feb 20 09:56:56 np0005625204.localdomain dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 1 addresses
Feb 20 09:56:56 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host
Feb 20 09:56:56 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts
Feb 20 09:56:56 np0005625204.localdomain podman[317730]: 2026-02-20 09:56:56.551608215 +0000 UTC m=+0.048152322 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:56:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:56:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:56:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:56:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:56:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:56:56 np0005625204.localdomain systemd[1]: tmp-crun.lk9UaJ.mount: Deactivated successfully.
Feb 20 09:56:56 np0005625204.localdomain dnsmasq[317172]: exiting on receipt of SIGTERM
Feb 20 09:56:56 np0005625204.localdomain podman[317766]: 2026-02-20 09:56:56.722405437 +0000 UTC m=+0.052339230 container kill 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:56 np0005625204.localdomain systemd[1]: libpod-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6.scope: Deactivated successfully.
Feb 20 09:56:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:56.779 264355 INFO neutron.agent.dhcp.agent [None req-f70db66b-a68a-4f5e-b535-053047f6e0e8 - - - - - -] DHCP configuration for ports {'4fdf0c37-625e-4dd0-9c79-90b44fc71337'} is completed
Feb 20 09:56:56 np0005625204.localdomain podman[317783]: 2026-02-20 09:56:56.785792939 +0000 UTC m=+0.048666987 container died 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:56 np0005625204.localdomain podman[317783]: 2026-02-20 09:56:56.825530506 +0000 UTC m=+0.088404484 container remove 1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-91a3c914-50af-4619-8f46-93ff66e8b045, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:56:56 np0005625204.localdomain systemd[1]: libpod-conmon-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6.scope: Deactivated successfully.
Feb 20 09:56:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:56.870 264355 INFO neutron.agent.dhcp.agent [None req-dab06233-d07f-4325-9d26-29b01790e6ee - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e164 e164: 6 total, 6 up, 6 in
Feb 20 09:56:57 np0005625204.localdomain ceph-mon[301857]: pgmap v330: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 4.4 MiB/s rd, 5.6 MiB/s wr, 243 op/s
Feb 20 09:56:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/981625636' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:57.204 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.224 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.225 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.226 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.268 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8:0:1:f816:3eff:fef0:4eb0'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f57d4cee-545f-46ed-8eee-c1b976528137, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=cebd560e-7047-4cc1-9642-f5b7ec377d58) old=Port_Binding(mac=['fa:16:3e:f0:4e:b0 2001:db8::f816:3eff:fef0:4eb0'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fef0:4eb0/64', 'neutron:device_id': 'ovnmeta-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-811e2462-6872-485d-9c09-d2dd9cb25273', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cb36e48ce4264babb412d413a8bf7b9f', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.269 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port cebd560e-7047-4cc1-9642-f5b7ec377d58 in datapath 811e2462-6872-485d-9c09-d2dd9cb25273 updated
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.298 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 811e2462-6872-485d-9c09-d2dd9cb25273, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:57.299 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[49153975-1579-4d97-aad6-b9c45ebd4a2f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:57 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:57.478 2 INFO neutron.agent.securitygroups_rpc [None req-8da4750a-6dfd-49c6-9c80-6371938bf016 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:56:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-fd26698bf47eaf748148ca73e0ba4205722c28c51e908d2f97ffc72f55d6d626-merged.mount: Deactivated successfully.
Feb 20 09:56:57 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1984570f0df543d0e189a74c3876c521b3b4d4d71b79d1d32c3bd58cda5a0aa6-userdata-shm.mount: Deactivated successfully.
Feb 20 09:56:57 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d91a3c914\x2d50af\x2d4619\x2d8f46\x2d93ff66e8b045.mount: Deactivated successfully.
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:56:57 np0005625204.localdomain systemd[1]: tmp-crun.jdWdUe.mount: Deactivated successfully.
Feb 20 09:56:57 np0005625204.localdomain dnsmasq[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/addn_hosts - 0 addresses
Feb 20 09:56:57 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/host
Feb 20 09:56:57 np0005625204.localdomain dnsmasq-dhcp[317594]: read /var/lib/neutron/dhcp/623d6533-b3ff-446c-abe7-d04d15b2cb53/opts
Feb 20 09:56:57 np0005625204.localdomain podman[317824]: 2026-02-20 09:56:57.738459623 +0000 UTC m=+0.071288914 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.759 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.760 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:56:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:57.760 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "0b99d555-2c83-4adc-81b2-c57674191232", "format": "json"}]: dispatch
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: osdmap e164: 6 total, 6 up, 6 in
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/939907693' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e165 e165: 6 total, 6 up, 6 in
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2324158721' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:58 np0005625204.localdomain kernel: device tapc3226333-1e left promiscuous mode
Feb 20 09:56:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:58Z|00319|binding|INFO|Releasing lport c3226333-1e26-4492-9c87-0c0e84626249 from this chassis (sb_readonly=0)
Feb 20 09:56:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:56:58Z|00320|binding|INFO|Setting lport c3226333-1e26-4492-9c87-0c0e84626249 down in Southbound
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.176 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:58.182 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-623d6533-b3ff-446c-abe7-d04d15b2cb53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfec81f2-09be-417a-b2dd-d08c91c1c606, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c3226333-1e26-4492-9c87-0c0e84626249) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:56:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:58.183 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c3226333-1e26-4492-9c87-0c0e84626249 in datapath 623d6533-b3ff-446c-abe7-d04d15b2cb53 unbound from our chassis
Feb 20 09:56:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:58.185 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 623d6533-b3ff-446c-abe7-d04d15b2cb53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:56:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:56:58.185 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[46277c38-0280-4e50-bd2c-e6a3b79ff1a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.195 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.199 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:58 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:58.253 2 INFO neutron.agent.securitygroups_rpc [None req-23120728-b18f-4316-9b23-0bff49b361e7 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.267 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.267 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.487 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.488 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11304MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.489 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.489 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.801 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.802 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.802 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2859662458' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:56:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2859662458' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.855 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.924 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.925 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.954 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 09:56:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:58.980 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.019 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:56:59 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:56:59.039 2 INFO neutron.agent.securitygroups_rpc [None req-8dad7750-521e-4bcf-b6a0-3ff2b8fade37 f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: pgmap v333: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 3.6 MiB/s wr, 159 op/s
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: osdmap e165: 6 total, 6 up, 6 in
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2324158721' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2859662458' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2859662458' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:56:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:56:59 np0005625204.localdomain systemd[1]: tmp-crun.MzvD2H.mount: Deactivated successfully.
Feb 20 09:56:59 np0005625204.localdomain podman[317871]: 2026-02-20 09:56:59.154877505 +0000 UTC m=+0.092451196 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:56:59 np0005625204.localdomain podman[317871]: 2026-02-20 09:56:59.166200119 +0000 UTC m=+0.103773860 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:56:59 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:56:59 np0005625204.localdomain dnsmasq[317594]: exiting on receipt of SIGTERM
Feb 20 09:56:59 np0005625204.localdomain podman[317931]: 2026-02-20 09:56:59.32347224 +0000 UTC m=+0.057506165 container kill 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:56:59 np0005625204.localdomain systemd[1]: libpod-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196.scope: Deactivated successfully.
Feb 20 09:56:59 np0005625204.localdomain podman[317951]: 2026-02-20 09:56:59.407943144 +0000 UTC m=+0.057516157 container died 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:56:59 np0005625204.localdomain podman[317951]: 2026-02-20 09:56:59.455190656 +0000 UTC m=+0.104763599 container remove 1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-623d6533-b3ff-446c-abe7-d04d15b2cb53, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:56:59 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1602955077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.482 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.488 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:56:59 np0005625204.localdomain systemd[1]: libpod-conmon-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196.scope: Deactivated successfully.
Feb 20 09:56:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:59.501 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.507 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.510 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.511 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:56:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:56:59.661 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:56:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:56:59.800 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:00 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:00Z|00321|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:57:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3349450434' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3349450434' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1602955077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:57:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2757870375' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2757870375' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:00.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-757a2b6f79ab43144d1e25f65f21cd12cf34ce98c88daf04e540abccae414274-merged.mount: Deactivated successfully.
Feb 20 09:57:00 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ebfded6895cf8e9d34ae275374e87f8f2b198821da8cddb6b45bc5607342196-userdata-shm.mount: Deactivated successfully.
Feb 20 09:57:00 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d623d6533\x2db3ff\x2d446c\x2dabe7\x2dd04d15b2cb53.mount: Deactivated successfully.
Feb 20 09:57:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:00.511 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:00.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:01 np0005625204.localdomain ceph-mon[301857]: pgmap v335: 177 pgs: 177 active+clean; 193 MiB data, 913 MiB used, 41 GiB / 42 GiB avail; 96 KiB/s rd, 13 KiB/s wr, 124 op/s
Feb 20 09:57:01 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "0b99d555-2c83-4adc-81b2-c57674191232_67b12e3d-cabb-4489-8f7a-b787cf53ee59", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:01 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "0b99d555-2c83-4adc-81b2-c57674191232", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2571226302' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2571226302' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:57:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:01.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:02.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:02.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:57:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:57:02 np0005625204.localdomain podman[317973]: 2026-02-20 09:57:02.832410588 +0000 UTC m=+0.084315769 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1770267347, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:57:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e166 e166: 6 total, 6 up, 6 in
Feb 20 09:57:02 np0005625204.localdomain podman[317973]: 2026-02-20 09:57:02.871162474 +0000 UTC m=+0.123067655 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc.)
Feb 20 09:57:02 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:57:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:03.034 2 INFO neutron.agent.securitygroups_rpc [None req-f7f4080a-576f-4ec0-afc4-2b369a4e24bc 90a02ec8973644daaf9f628e26b82aba 68587c4c15964f28ad6d155288e119b0 - - default default] Security group rule updated ['602964d2-c9d4-4795-879d-2f4697b07a9a']
Feb 20 09:57:03 np0005625204.localdomain ceph-mon[301857]: pgmap v336: 177 pgs: 177 active+clean; 193 MiB data, 917 MiB used, 41 GiB / 42 GiB avail; 148 KiB/s rd, 14 KiB/s wr, 199 op/s
Feb 20 09:57:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3998727060' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3998727060' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:03 np0005625204.localdomain ceph-mon[301857]: osdmap e166: 6 total, 6 up, 6 in
Feb 20 09:57:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:03.608 2 INFO neutron.agent.securitygroups_rpc [None req-7415ebd2-08fb-4812-9524-708fe60e5aaa 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['efc53d5c-88f6-4ec9-8815-9d765811b12e']
Feb 20 09:57:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e167 e167: 6 total, 6 up, 6 in
Feb 20 09:57:03 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:03.990 2 INFO neutron.agent.securitygroups_rpc [None req-f1e881f7-c612-45f2-b27b-1f5d8fe2e21f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:57:04 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:04.436 2 INFO neutron.agent.securitygroups_rpc [None req-0e7411d4-9e8e-44df-aee4-9bd8dc94f75f f8f7e83376364aed92a95cbecc4fe358 cb36e48ce4264babb412d413a8bf7b9f - - default default] Security group member updated ['a85f25c3-88e2-4d71-a8d4-72a266c1246c']
Feb 20 09:57:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:04.664 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "2b637ee9-db13-447d-b623-b978babd5cfe_82c45f12-897c-4165-a7b7-4039d7d47e93", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "2b637ee9-db13-447d-b623-b978babd5cfe", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:05 np0005625204.localdomain ceph-mon[301857]: osdmap e167: 6 total, 6 up, 6 in
Feb 20 09:57:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:05.184 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec4f10d3-21c0-4f32-97fb-37b6b004b601, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3568ed7b-9263-43af-b4fd-ae333afc9a3b) old=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:05.186 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3568ed7b-9263-43af-b4fd-ae333afc9a3b in datapath 039b20b8-16a8-495e-968a-63fcd66a566c updated
Feb 20 09:57:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:05.189 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 039b20b8-16a8-495e-968a-63fcd66a566c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:57:05 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:05.191 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[67ad4991-64b9-4f7e-b1b1-880ef2e7b029]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:05.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:05.723 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:57:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:05.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:57:05 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:05.910 2 INFO neutron.agent.securitygroups_rpc [None req-e5b5b6c2-ec9f-4e00-8939-9097e06787f1 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['c1686fb3-a7b5-4191-9c5e-7c249c4e6c3c', 'efc53d5c-88f6-4ec9-8815-9d765811b12e']
Feb 20 09:57:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:57:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:57:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:57:06 np0005625204.localdomain ceph-mon[301857]: pgmap v338: 177 pgs: 177 active+clean; 193 MiB data, 917 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 13 KiB/s wr, 183 op/s
Feb 20 09:57:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2301409396' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:57:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4294664811' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.082 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.082 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.082 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.083 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:57:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:57:06 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:06.382 2 INFO neutron.agent.securitygroups_rpc [None req-15e35e99-a929-4578-aaaf-c6b96452307f 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['c1686fb3-a7b5-4191-9c5e-7c249c4e6c3c']
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.859 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.877 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:57:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:06.878 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:57:07 np0005625204.localdomain ceph-mon[301857]: pgmap v340: 177 pgs: 177 active+clean; 193 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 139 KiB/s rd, 51 KiB/s wr, 190 op/s
Feb 20 09:57:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:57:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "format": "json"}]: dispatch
Feb 20 09:57:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:57:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e168 e168: 6 total, 6 up, 6 in
Feb 20 09:57:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:57:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "f96ea30b-5993-4393-8f64-efd08707fd5f_ff9c641b-ea0b-431f-af91-e17e9c0dd44a", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "f96ea30b-5993-4393-8f64-efd08707fd5f", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2989862322' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2989862322' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: osdmap e168: 6 total, 6 up, 6 in
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e169 e169: 6 total, 6 up, 6 in
Feb 20 09:57:08 np0005625204.localdomain systemd[1]: tmp-crun.kZmkt0.mount: Deactivated successfully.
Feb 20 09:57:08 np0005625204.localdomain podman[317994]: 2026-02-20 09:57:08.157208685 +0000 UTC m=+0.089365444 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 20 09:57:08 np0005625204.localdomain podman[317994]: 2026-02-20 09:57:08.166148537 +0000 UTC m=+0.098305346 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:57:08 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:57:08 np0005625204.localdomain podman[317993]: 2026-02-20 09:57:08.249728414 +0000 UTC m=+0.185834894 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 20 09:57:08 np0005625204.localdomain podman[317993]: 2026-02-20 09:57:08.292024833 +0000 UTC m=+0.228131283 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:57:08 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1936621399' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:57:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1936621399' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:57:09 np0005625204.localdomain ceph-mon[301857]: pgmap v342: 177 pgs: 177 active+clean; 193 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 50 KiB/s wr, 37 op/s
Feb 20 09:57:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "format": "json"}]: dispatch
Feb 20 09:57:09 np0005625204.localdomain ceph-mon[301857]: osdmap e169: 6 total, 6 up, 6 in
Feb 20 09:57:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1936621399' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1936621399' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:09 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:09.630 2 INFO neutron.agent.securitygroups_rpc [None req-d187057a-39e5-4c52-82a0-d1bcafd46a90 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['46d5d21d-63a5-4d3d-a013-7b21b89cdba7']
Feb 20 09:57:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:09.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "snap_name": "faa1b419-44e8-41e3-a207-23a7805bf4c9", "format": "json"}]: dispatch
Feb 20 09:57:10 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3137678042' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:10 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3137678042' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:10.952 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec4f10d3-21c0-4f32-97fb-37b6b004b601, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3568ed7b-9263-43af-b4fd-ae333afc9a3b) old=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:10.954 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3568ed7b-9263-43af-b4fd-ae333afc9a3b in datapath 039b20b8-16a8-495e-968a-63fcd66a566c updated
Feb 20 09:57:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:10.956 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 039b20b8-16a8-495e-968a-63fcd66a566c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:57:10 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:10.957 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[fccf8251-7c70-4a6a-9c24-314ae1f8fca7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "877bc86e-3ae3-4788-a4cd-ecc7ec5a94e2_1280651e-fbca-4fea-b6ea-1b1c294293df", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: pgmap v344: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 88 KiB/s wr, 42 op/s
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "877bc86e-3ae3-4788-a4cd-ecc7ec5a94e2", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:57:11 np0005625204.localdomain sshd[318034]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:57:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:57:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:12.020 2 INFO neutron.agent.securitygroups_rpc [None req-6ce6cc4b-c215-41b7-8af5-e062eb4d8872 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['b7f2b362-1261-45d0-afca-4d7d4dc43da1', '66935af6-6884-4649-9f3d-6c32279f86ee', '46d5d21d-63a5-4d3d-a013-7b21b89cdba7']
Feb 20 09:57:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e170 e170: 6 total, 6 up, 6 in
Feb 20 09:57:12 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve49", "tenant_id": "1401fb23701440858ed7175cc4dba63b", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:57:12 np0005625204.localdomain sshd[318034]: Invalid user claude from 188.166.218.64 port 41150
Feb 20 09:57:12 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:12.591 2 INFO neutron.agent.securitygroups_rpc [None req-9d686e86-35e8-431c-8fc4-b6265d5fa0d0 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['b7f2b362-1261-45d0-afca-4d7d4dc43da1', '66935af6-6884-4649-9f3d-6c32279f86ee']
Feb 20 09:57:12 np0005625204.localdomain sshd[318034]: Received disconnect from 188.166.218.64 port 41150:11: Bye Bye [preauth]
Feb 20 09:57:12 np0005625204.localdomain sshd[318034]: Disconnected from invalid user claude 188.166.218.64 port 41150 [preauth]
Feb 20 09:57:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e171 e171: 6 total, 6 up, 6 in
Feb 20 09:57:13 np0005625204.localdomain ceph-mon[301857]: pgmap v345: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 72 KiB/s wr, 80 op/s
Feb 20 09:57:13 np0005625204.localdomain ceph-mon[301857]: osdmap e170: 6 total, 6 up, 6 in
Feb 20 09:57:13 np0005625204.localdomain ceph-mon[301857]: osdmap e171: 6 total, 6 up, 6 in
Feb 20 09:57:14 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:14.014 264355 INFO neutron.agent.linux.ip_lib [None req-ec7441c0-46d9-4736-8999-f022ef24d103 - - - - - -] Device tapc16513af-fd cannot be used as it has no MAC address
Feb 20 09:57:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:14.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:14 np0005625204.localdomain kernel: device tapc16513af-fd entered promiscuous mode
Feb 20 09:57:14 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581434.0552] manager: (tapc16513af-fd): new Generic device (/org/freedesktop/NetworkManager/Devices/51)
Feb 20 09:57:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:14Z|00322|binding|INFO|Claiming lport c16513af-fdad-437e-89f6-e90f98f0836a for this chassis.
Feb 20 09:57:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:14Z|00323|binding|INFO|c16513af-fdad-437e-89f6-e90f98f0836a: Claiming unknown
Feb 20 09:57:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:14.056 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:14 np0005625204.localdomain systemd-udevd[318046]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:57:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:14.067 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a08202c1391432d972dc0430612e0e0', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=edefd936-7bd3-45c5-ab80-3ad63680dbf7, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c16513af-fdad-437e-89f6-e90f98f0836a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:14.069 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c16513af-fdad-437e-89f6-e90f98f0836a in datapath 4eca95b1-f334-4c45-8797-de13f5964062 bound to our chassis
Feb 20 09:57:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:14.071 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4eca95b1-f334-4c45-8797-de13f5964062 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:57:14 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:14.073 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[08d5c43f-19fa-4206-bda7-b2b2e81587ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:14.099 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:14Z|00324|binding|INFO|Setting lport c16513af-fdad-437e-89f6-e90f98f0836a ovn-installed in OVS
Feb 20 09:57:14 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:14Z|00325|binding|INFO|Setting lport c16513af-fdad-437e-89f6-e90f98f0836a up in Southbound
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapc16513af-fd: No such device
Feb 20 09:57:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:14.140 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:14.172 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "8b7ef8b3-43af-41f4-8a24-ef7ee4fe0934_5c6fcb89-e7b9-4769-a40e-771ef85df0e8", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "snap_name": "8b7ef8b3-43af-41f4-8a24-ef7ee4fe0934", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "snap_name": "faa1b419-44e8-41e3-a207-23a7805bf4c9_428b559d-5ff8-4ee5-af31-0b6d83df877e", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "snap_name": "faa1b419-44e8-41e3-a207-23a7805bf4c9", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:14.697 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:15.039 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:68:6d:09'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '9', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ec4f10d3-21c0-4f32-97fb-37b6b004b601, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3568ed7b-9263-43af-b4fd-ae333afc9a3b) old=Port_Binding(mac=['fa:16:3e:68:6d:09 10.100.0.18 10.100.0.2 10.100.0.34'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-039b20b8-16a8-495e-968a-63fcd66a566c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '62dd1d7e0cf547678612304aba2895e2', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:15.041 162652 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3568ed7b-9263-43af-b4fd-ae333afc9a3b in datapath 039b20b8-16a8-495e-968a-63fcd66a566c updated
Feb 20 09:57:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:15.043 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 039b20b8-16a8-495e-968a-63fcd66a566c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:57:15 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:15.044 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdb8ee4-3f8d-4c1f-8d59-d4df9f2b96f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:15 np0005625204.localdomain podman[318117]: 
Feb 20 09:57:15 np0005625204.localdomain podman[318117]: 2026-02-20 09:57:15.119038039 +0000 UTC m=+0.095462980 container create edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:57:15 np0005625204.localdomain systemd[1]: Started libpod-conmon-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38.scope.
Feb 20 09:57:15 np0005625204.localdomain podman[318117]: 2026-02-20 09:57:15.072510901 +0000 UTC m=+0.048935862 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:57:15 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:57:15 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b81691ba7110d0b98058372d9f0498590f655b1c412c70b155fbfe7659ea93fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:57:15 np0005625204.localdomain podman[318117]: 2026-02-20 09:57:15.20475425 +0000 UTC m=+0.181179181 container init edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 09:57:15 np0005625204.localdomain podman[318117]: 2026-02-20 09:57:15.213728234 +0000 UTC m=+0.190153175 container start edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 20 09:57:15 np0005625204.localdomain dnsmasq[318135]: started, version 2.85 cachesize 150
Feb 20 09:57:15 np0005625204.localdomain dnsmasq[318135]: DNS service limited to local subnets
Feb 20 09:57:15 np0005625204.localdomain dnsmasq[318135]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:57:15 np0005625204.localdomain dnsmasq[318135]: warning: no upstream servers configured
Feb 20 09:57:15 np0005625204.localdomain dnsmasq-dhcp[318135]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:57:15 np0005625204.localdomain dnsmasq[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/addn_hosts - 0 addresses
Feb 20 09:57:15 np0005625204.localdomain dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/host
Feb 20 09:57:15 np0005625204.localdomain dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/opts
Feb 20 09:57:15 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: pgmap v348: 177 pgs: 177 active+clean; 193 MiB data, 903 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 42 KiB/s wr, 64 op/s
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve48", "tenant_id": "1401fb23701440858ed7175cc4dba63b", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1303196629' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1303196629' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:15 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:15.320 264355 INFO neutron.agent.dhcp.agent [None req-ba488952-ff62-4881-bb5d-d7825b9a9db9 - - - - - -] DHCP configuration for ports {'62c9e6ac-55ec-4e9e-80f1-f21e18b1f81c'} is completed
Feb 20 09:57:15 np0005625204.localdomain podman[318136]: 2026-02-20 09:57:15.327324004 +0000 UTC m=+0.080631307 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:57:15 np0005625204.localdomain podman[318136]: 2026-02-20 09:57:15.342157026 +0000 UTC m=+0.095464339 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 20 09:57:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e172 e172: 6 total, 6 up, 6 in
Feb 20 09:57:15 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:57:15 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:15.819 2 INFO neutron.agent.securitygroups_rpc [None req-e7815d36-a39d-42c8-a497-7fe4eae772f9 469c956856df4f7fa0303f662df3cdef 62dd1d7e0cf547678612304aba2895e2 - - default default] Security group member updated ['72ed92b6-af24-4274-854b-a52220405faf']
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2050439858' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2050439858' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:57:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:16.327 2 INFO neutron.agent.securitygroups_rpc [None req-3bc90bc6-4752-435d-941c-f0e75fc5d0a5 e8d99e5aba074cfb8aea01d99045d2af 8a08202c1391432d972dc0430612e0e0 - - default default] Security group member updated ['49b521a4-2cce-4f1a-b690-2fa2cab68db5']
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e173 e173: 6 total, 6 up, 6 in
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: osdmap e172: 6 total, 6 up, 6 in
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2050439858' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:16 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2050439858' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:16.421 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:57:15Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df589a610>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df589ae80>], id=715989dc-23f1-48d7-a326-2c1d3c713a9e, ip_allocation=immediate, mac_address=fa:16:3e:57:89:2e, name=tempest-TagsExtTest-1349879559, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:57:11Z, description=, dns_domain=, id=4eca95b1-f334-4c45-8797-de13f5964062, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TagsExtTest-test-network-293434349, port_security_enabled=True, project_id=8a08202c1391432d972dc0430612e0e0, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=44681, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2521, status=ACTIVE, subnets=['6a34847d-7d7a-4411-adb9-dcdb762b6f01'], tags=[], tenant_id=8a08202c1391432d972dc0430612e0e0, updated_at=2026-02-20T09:57:12Z, vlan_transparent=None, network_id=4eca95b1-f334-4c45-8797-de13f5964062, port_security_enabled=True, project_id=8a08202c1391432d972dc0430612e0e0, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['49b521a4-2cce-4f1a-b690-2fa2cab68db5'], standard_attr_id=2531, status=DOWN, tags=[], tenant_id=8a08202c1391432d972dc0430612e0e0, updated_at=2026-02-20T09:57:16Z on network 4eca95b1-f334-4c45-8797-de13f5964062
Feb 20 09:57:16 np0005625204.localdomain dnsmasq[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/addn_hosts - 1 addresses
Feb 20 09:57:16 np0005625204.localdomain dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/host
Feb 20 09:57:16 np0005625204.localdomain dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/opts
Feb 20 09:57:16 np0005625204.localdomain podman[318173]: 2026-02-20 09:57:16.655941904 +0000 UTC m=+0.063880737 container kill edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:57:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:16.879 264355 INFO neutron.agent.dhcp.agent [None req-2f030488-9419-4142-9d82-aa29e86c9d1f - - - - - -] DHCP configuration for ports {'715989dc-23f1-48d7-a326-2c1d3c713a9e'} is completed
Feb 20 09:57:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:16.927 2 INFO neutron.agent.securitygroups_rpc [None req-4f303a3e-0093-4654-a2f8-5b0853d1acad 3fd5694d6e624148892ddc3041d2f0e1 4bc7f22347de4004b73776eab4064bd0 - - default default] Security group member updated ['c599d16d-0283-4cf2-8a39-4a506ff8f2f0']
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: pgmap v350: 177 pgs: 177 active+clean; 194 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 92 KiB/s rd, 90 KiB/s wr, 140 op/s
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "format": "json"}]: dispatch
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9754040c-efe1-4b31-bb60-2e1a242177cd", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "format": "json"}]: dispatch
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "56e65ef4-1a61-4d5a-9d2e-c59d2323fd6c", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: osdmap e173: 6 total, 6 up, 6 in
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2731859964' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2731859964' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e174 e174: 6 total, 6 up, 6 in
Feb 20 09:57:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:57:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:57:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:57:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157080 "" "Go-http-client/1.1"
Feb 20 09:57:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:57:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18799 "" "Go-http-client/1.1"
Feb 20 09:57:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e175 e175: 6 total, 6 up, 6 in
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.212 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.217 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1e756da8-e38f-41d4-afaf-6555f506717a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.213784', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8aa95d4e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '3198116d2a162b8d7dd27a189c499fb47eb9e6ae2aba2e646575bf2143179831'}]}, 'timestamp': '2026-02-20 09:57:18.218455', '_unique_id': 'a7cf4ec84cf648db81177fc94225ae5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.219 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.220 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.221 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b426184-2a33-45f1-8b73-fcee18e2e14e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.221113', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8aa9da26-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '8d36cf3b1308f127882c99a0733f296c4033d6c6da915a6c7171dedb75a00cba'}]}, 'timestamp': '2026-02-20 09:57:18.221606', '_unique_id': '03af605a448a433386990b3f67d8d984'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.222 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.236 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.237 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '120a3d9f-7682-4e73-bfa7-6bca5f9db7c4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.223739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8aac4946-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '4ae99fb3fc3b199bff3fb41c20b0e9214e9403fa019a75f2e6dcd4eb13b0a35b'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.223739', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8aac5c38-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '6d3f11bf1b2e75adfeb04269de86d84869af846bfb294a0ff30741757bf4075d'}]}, 'timestamp': '2026-02-20 09:57:18.238015', '_unique_id': 'ae0d1377ae2c48cf97a9f4450b68c1f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.239 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.240 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.268 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c69d03f3-89f3-4130-aaf0-7d596339ecdb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.240431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab10f08-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '2e375e242239e23160c258ca57b1c4211ad180ccd7d01d28933cbae61cc8c2d2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.240431', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab12344-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '997906ac3c9ae840d6f7b097ba56d9c46457bbdbe8d9de0e6c2eff9a70ffa12e'}]}, 'timestamp': '2026-02-20 09:57:18.269357', '_unique_id': '775bf6d473fd47eeb78c0c13a90d70d9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.270 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.271 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.272 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70e869ed-943b-4fc8-83c9-97d2615f1c2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.272094', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab1a0a8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': 'ce9e75112d93eeab878ab3d20b7f99418350273bb4660634be488f5322938fab'}]}, 'timestamp': '2026-02-20 09:57:18.272559', '_unique_id': 'e612cd33550a437c98bf6f33cdfe9c04'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.273 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.274 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.274 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2da5cf08-afaf-45f9-a534-c84af187f4bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.274904', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab20e76-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '7208da92c6d716ab8e99d530601dd23713a1049a145d86601cbd0a9532f63460'}]}, 'timestamp': '2026-02-20 09:57:18.275367', '_unique_id': '7336f7fc9cf64e969ebd51b4fb10fb11'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.276 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.277 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.294 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eefd2161-099f-4731-bca9-be7711621fc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:57:18.277760', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '8ab51e4a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.5337437, 'message_signature': '670be395e11a0481ec84551029771770be49dea8fe6bbe044d8f9326c87d7450'}]}, 'timestamp': '2026-02-20 09:57:18.295496', '_unique_id': '12931a25edb54679bbc099565cf36dc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.296 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.298 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.298 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c16d816-82b9-4d03-9fcc-374c0ddef9a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.298585', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab5ac8e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '33555731d3e6f864879e9a9ca00f498949eda5d7abcb517e37ffc6a10bc152b4'}]}, 'timestamp': '2026-02-20 09:57:18.299080', '_unique_id': '0f84d22a45a64319aae948f1e7bd56af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.300 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.301 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.301 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.302 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b462a559-7319-4d33-8fd3-0ac3c5b76c7d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.301310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab61a84-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '914049685523df244357b1d81d1449103771ba8549bc71f225033c5243db9eb7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.301310', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab630dc-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '1c58bcf3299561c960cfc6523aba9d88826d54a7b58fd0ab94e48bef2f1c747a'}]}, 'timestamp': '2026-02-20 09:57:18.302508', '_unique_id': '8c0baed8dcc74b53afc6f29c9381e341'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.303 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.304 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.304 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd1734e8-776f-4936-9ec1-630bdf71b543', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.304895', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab6a18e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': 'c153345ec49f4099d89ea38462f832c8be029ca2e1aba4b662ebb091fa362ea7'}]}, 'timestamp': '2026-02-20 09:57:18.305345', '_unique_id': '5aa7989da31a4a61b0ab45f7490ba0fd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.306 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.307 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.307 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.308 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9211cdbd-268a-4fe4-a674-6aa7dbcf60e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.307593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab70ef8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '12a4077fa20b987e9bc09f608f83aceb499bf07fdfb9ba502791d2284f55427f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.307593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab71eca-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '058bf5b0ad574a23e64caa6aa71b3cf98fe1bfee356ef42455e2492218f9aeea'}]}, 'timestamp': '2026-02-20 09:57:18.308522', '_unique_id': 'c6982f6f308e4f0194758fbe9301fc31'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.309 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.311 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.311 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd753d21-9c91-4011-b65b-61412fd84e97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.311168', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab796d4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '4fd41f4007db6642000431373d65409315fc28e78ed98a4d5ae6dd843c93052b'}]}, 'timestamp': '2026-02-20 09:57:18.311625', '_unique_id': '31dddf4b846b4c28a93104ad19bde4f1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.312 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.313 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.313 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.314 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7971ab64-fe84-4646-98d8-e1846379142a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.313845', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab8020e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '52a08005e3d451e2ab573f356d6947195c55cb83a72afa8056eaa844354c8817'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.313845', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab811c2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': 'ed6e37d3806e040bfe8aaaa2d4adac283bd10fd65607aca3dbb3f03de3fb7982'}]}, 'timestamp': '2026-02-20 09:57:18.314770', '_unique_id': 'f3626e1b56664dffb75540dd3a3eea71'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.315 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.316 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.317 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1899c9a3-6059-4dd8-9911-39d769c85482', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.316973', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab87c2a-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '35a70c6ff7e507410b46c74163d7812b8c9a65152d4dd0456a1f25abb52d37ab'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.316973', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab88f76-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.462937813, 'message_signature': '644e92abb34af1d64ccf73686d52ee83782dc3ee264badb5b0498b7cc01764c5'}]}, 'timestamp': '2026-02-20 09:57:18.317964', '_unique_id': '04cb3f3217d040f78654d91a674311cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.318 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.319 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7c72e93a-5383-4bfe-8be6-6889162b3ffd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.319488', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab8d8a0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '70635494c011ae197f9e311a0db383d2928ed897905c798e29cb87ca25ed8323'}]}, 'timestamp': '2026-02-20 09:57:18.319805', '_unique_id': '11106eaef8f447339cd40377b6e776df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.320 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.321 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.321 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '52490b69-b223-4c6e-b5bc-df12c322686e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.321343', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8ab921b6-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '5a67188ed07cba673f7dc33fe4260f4ff117f11ae887123ef0be03261fa3353f'}]}, 'timestamp': '2026-02-20 09:57:18.321678', '_unique_id': '49cacc562e4e4d8799191c427185d54b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.322 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.323 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8c380bce-398a-4bc4-9d64-51a6cd79e4bc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.323137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab96900-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '7880d2359bdda27ceac9dfd6cc4388e5b2e4c403c4aa11970c735e799f30287c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.323137', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab9742c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '4d02e8e4631ceb72518b2507faa5c3094b4d9a5a4dd837f467d09dfc169784f4'}]}, 'timestamp': '2026-02-20 09:57:18.323769', '_unique_id': '6e9e7fc905bc48a38874f723e0e6c5aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.324 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.325 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f4eac1e-01f1-407c-8915-5f2b00f6cc31', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.325276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8ab9bd9c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '810fee5f6a2d35ac0163726aece82b550188ba6272787c9d3ad506c38f732a7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.325276', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8ab9c936-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '07d1778aea4d39e13c10adc1f01d62b89fde6debb6a226b909e10c754f890da9'}]}, 'timestamp': '2026-02-20 09:57:18.325927', '_unique_id': 'd1db5c6b355246bf8e509b8ab20988d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.326 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.327 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.327 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 18240000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a182bd07-a8e3-46bb-ad81-c6fab47771b4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18240000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:57:18.327340', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '8aba0d9c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.5337437, 'message_signature': '0fe9bd7258dd60ce865c8c80eedf0d103f0110bb401bcbe14b8ee8570beada89'}]}, 'timestamp': '2026-02-20 09:57:18.327705', '_unique_id': '900b74ad02444f96b14d11ed3bd418cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ef04b25-f6f6-44ac-89a6-12e0d7b131db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:57:18.329084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '8aba4f32-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '67c86543522c570a0e70efdfeed399cb9b84f08dcf79bc83ad480d2a812775cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:57:18.329084', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '8aba5c5c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.479665323, 'message_signature': '46cfd91bef4b03702dcaf6f95768c57db7c4e697f89ede9ef3d242dc3e0b2fe2'}]}, 'timestamp': '2026-02-20 09:57:18.329717', '_unique_id': 'd790f24088b94f11aa0f651fa8ca6021'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.330 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.331 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b52dca2-cac5-412b-9f46-b50f2e69ba04', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:57:18.331132', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '8aba9f78-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 11997.45297744, 'message_signature': '93f691f32a3aeed537d22049fe467588ba44c3f655dc01706f9a102141d0b59d'}]}, 'timestamp': '2026-02-20 09:57:18.331428', '_unique_id': '0d160842cc6c4b5b862c1b3577b3bed2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:57:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:57:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: osdmap e174: 6 total, 6 up, 6 in
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: pgmap v353: 177 pgs: 177 active+clean; 194 MiB data, 940 MiB used, 41 GiB / 42 GiB avail; 64 KiB/s rd, 110 KiB/s wr, 102 op/s
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 20 09:57:18 np0005625204.localdomain ceph-mon[301857]: osdmap e175: 6 total, 6 up, 6 in
Feb 20 09:57:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e176 e176: 6 total, 6 up, 6 in
Feb 20 09:57:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:19.700 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:19.705 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:20 np0005625204.localdomain ceph-mon[301857]: osdmap e176: 6 total, 6 up, 6 in
Feb 20 09:57:20 np0005625204.localdomain ceph-mon[301857]: pgmap v356: 177 pgs: 177 active+clean; 194 MiB data, 958 MiB used, 41 GiB / 42 GiB avail; 2.5 KiB/s rd, 78 KiB/s wr, 11 op/s
Feb 20 09:57:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:20.850 2 INFO neutron.agent.securitygroups_rpc [None req-41f9c3ce-b340-465f-aa72-7be8aab7d24c 3fd5694d6e624148892ddc3041d2f0e1 4bc7f22347de4004b73776eab4064bd0 - - default default] Security group member updated ['c599d16d-0283-4cf2-8a39-4a506ff8f2f0']
Feb 20 09:57:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e176 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 20 09:57:21 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve47", "tenant_id": "1401fb23701440858ed7175cc4dba63b", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:57:21 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 20 09:57:21 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:57:21 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/397e3bc9-4e24-4148-9154-b1a0a65a1d98/2bd5db79-8b70-47ad-aae0-ee8531062739", "osd", "allow rw pool=manila_data namespace=fsvolumens_397e3bc9-4e24-4148-9154-b1a0a65a1d98", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:57:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e177 e177: 6 total, 6 up, 6 in
Feb 20 09:57:21 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:21.736 2 INFO neutron.agent.securitygroups_rpc [None req-84019169-f531-4b37-ab25-b8fba57ed27f e8d99e5aba074cfb8aea01d99045d2af 8a08202c1391432d972dc0430612e0e0 - - default default] Security group member updated ['49b521a4-2cce-4f1a-b690-2fa2cab68db5']
Feb 20 09:57:21 np0005625204.localdomain dnsmasq[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/addn_hosts - 0 addresses
Feb 20 09:57:21 np0005625204.localdomain dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/host
Feb 20 09:57:21 np0005625204.localdomain podman[318211]: 2026-02-20 09:57:21.998158391 +0000 UTC m=+0.060285358 container kill edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:57:21 np0005625204.localdomain dnsmasq-dhcp[318135]: read /var/lib/neutron/dhcp/4eca95b1-f334-4c45-8797-de13f5964062/opts
Feb 20 09:57:22 np0005625204.localdomain ceph-mon[301857]: osdmap e177: 6 total, 6 up, 6 in
Feb 20 09:57:22 np0005625204.localdomain ceph-mon[301857]: pgmap v358: 177 pgs: 177 active+clean; 194 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 5.1 MiB/s rd, 83 KiB/s wr, 175 op/s
Feb 20 09:57:22 np0005625204.localdomain dnsmasq[318135]: exiting on receipt of SIGTERM
Feb 20 09:57:22 np0005625204.localdomain podman[318251]: 2026-02-20 09:57:22.718912111 +0000 UTC m=+0.065965281 container kill edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:57:22 np0005625204.localdomain systemd[1]: tmp-crun.iVbAjW.mount: Deactivated successfully.
Feb 20 09:57:22 np0005625204.localdomain systemd[1]: libpod-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38.scope: Deactivated successfully.
Feb 20 09:57:22 np0005625204.localdomain podman[318265]: 2026-02-20 09:57:22.810538742 +0000 UTC m=+0.076907904 container died edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:57:22 np0005625204.localdomain podman[318265]: 2026-02-20 09:57:22.848710225 +0000 UTC m=+0.115079327 container cleanup edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:57:22 np0005625204.localdomain systemd[1]: libpod-conmon-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38.scope: Deactivated successfully.
Feb 20 09:57:22 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e178 e178: 6 total, 6 up, 6 in
Feb 20 09:57:22 np0005625204.localdomain podman[318267]: 2026-02-20 09:57:22.881787593 +0000 UTC m=+0.135699146 container remove edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4eca95b1-f334-4c45-8797-de13f5964062, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:57:22 np0005625204.localdomain kernel: device tapc16513af-fd left promiscuous mode
Feb 20 09:57:22 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:22Z|00326|binding|INFO|Releasing lport c16513af-fdad-437e-89f6-e90f98f0836a from this chassis (sb_readonly=0)
Feb 20 09:57:22 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:22Z|00327|binding|INFO|Setting lport c16513af-fdad-437e-89f6-e90f98f0836a down in Southbound
Feb 20 09:57:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:22.932 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:22 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:22.942 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4eca95b1-f334-4c45-8797-de13f5964062', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a08202c1391432d972dc0430612e0e0', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=edefd936-7bd3-45c5-ab80-3ad63680dbf7, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=c16513af-fdad-437e-89f6-e90f98f0836a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:22 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:22.944 162652 INFO neutron.agent.ovn.metadata.agent [-] Port c16513af-fdad-437e-89f6-e90f98f0836a in datapath 4eca95b1-f334-4c45-8797-de13f5964062 unbound from our chassis
Feb 20 09:57:22 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:22.948 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4eca95b1-f334-4c45-8797-de13f5964062, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:57:22 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:22.951 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[11407500-704d-4ddf-b30e-29d2de4f5778]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:22.953 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-b81691ba7110d0b98058372d9f0498590f655b1c412c70b155fbfe7659ea93fa-merged.mount: Deactivated successfully.
Feb 20 09:57:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-edf8de6d9c62e6bbd13a52dbff5d848268175cc36445561ad4c739e7afcbfe38-userdata-shm.mount: Deactivated successfully.
Feb 20 09:57:23 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d4eca95b1\x2df334\x2d4c45\x2d8797\x2dde13f5964062.mount: Deactivated successfully.
Feb 20 09:57:23 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:23.297 264355 INFO neutron.agent.dhcp.agent [None req-99012659-8adf-42c6-8bf6-c470f47099ca - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:23 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:23.297 264355 INFO neutron.agent.dhcp.agent [None req-99012659-8adf-42c6-8bf6-c470f47099ca - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:23 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:23.428 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:23 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 20 09:57:23 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:23Z|00328|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:57:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:23.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:23 np0005625204.localdomain ceph-mon[301857]: osdmap e178: 6 total, 6 up, 6 in
Feb 20 09:57:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e179 e179: 6 total, 6 up, 6 in
Feb 20 09:57:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:24.738 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:24 np0005625204.localdomain ceph-mon[301857]: pgmap v360: 177 pgs: 177 active+clean; 194 MiB data, 947 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 60 KiB/s wr, 127 op/s
Feb 20 09:57:24 np0005625204.localdomain ceph-mon[301857]: osdmap e179: 6 total, 6 up, 6 in
Feb 20 09:57:24 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 20 09:57:24 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 20 09:57:24 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Feb 20 09:57:25 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:57:25 np0005625204.localdomain podman[318294]: 2026-02-20 09:57:25.15493305 +0000 UTC m=+0.089403965 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:57:25 np0005625204.localdomain podman[318294]: 2026-02-20 09:57:25.179454188 +0000 UTC m=+0.113925153 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:57:25 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:57:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 20 09:57:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 20 09:57:25 np0005625204.localdomain ceph-mon[301857]: mgrmap e50: np0005625202.arwxwo(active, since 9m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 09:57:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e180 e180: 6 total, 6 up, 6 in
Feb 20 09:57:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:57:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:57:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:57:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:57:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:57:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:57:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e181 e181: 6 total, 6 up, 6 in
Feb 20 09:57:27 np0005625204.localdomain ceph-mon[301857]: pgmap v362: 177 pgs: 177 active+clean; 241 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 3.6 MiB/s rd, 3.6 MiB/s wr, 224 op/s
Feb 20 09:57:27 np0005625204.localdomain ceph-mon[301857]: osdmap e180: 6 total, 6 up, 6 in
Feb 20 09:57:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1300248318' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1300248318' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:27 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e182 e182: 6 total, 6 up, 6 in
Feb 20 09:57:28 np0005625204.localdomain ceph-mon[301857]: osdmap e181: 6 total, 6 up, 6 in
Feb 20 09:57:28 np0005625204.localdomain ceph-mon[301857]: osdmap e182: 6 total, 6 up, 6 in
Feb 20 09:57:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3824275205' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3824275205' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:28 np0005625204.localdomain sshd[318317]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:57:29 np0005625204.localdomain ceph-mon[301857]: pgmap v365: 177 pgs: 177 active+clean; 241 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 90 KiB/s rd, 4.6 MiB/s wr, 143 op/s
Feb 20 09:57:29 np0005625204.localdomain sshd[318317]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:57:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:29.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:57:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:29.742 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:57:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:29.742 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:57:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:29.742 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:57:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:29.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:29.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:57:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "format": "json"}]: dispatch
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "397e3bc9-4e24-4148-9154-b1a0a65a1d98", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e183 e183: 6 total, 6 up, 6 in
Feb 20 09:57:30 np0005625204.localdomain podman[318319]: 2026-02-20 09:57:30.150752994 +0000 UTC m=+0.090215870 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:57:30 np0005625204.localdomain podman[318319]: 2026-02-20 09:57:30.197137727 +0000 UTC m=+0.136600613 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 09:57:30 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:57:30 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:30.751 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:31 np0005625204.localdomain ceph-mon[301857]: pgmap v367: 177 pgs: 177 active+clean; 233 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 95 KiB/s rd, 3.9 MiB/s wr, 152 op/s
Feb 20 09:57:31 np0005625204.localdomain ceph-mon[301857]: osdmap e183: 6 total, 6 up, 6 in
Feb 20 09:57:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e183 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:32 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e184 e184: 6 total, 6 up, 6 in
Feb 20 09:57:32 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e185 e185: 6 total, 6 up, 6 in
Feb 20 09:57:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: pgmap v369: 177 pgs: 177 active+clean; 195 MiB data, 976 MiB used, 41 GiB / 42 GiB avail; 103 KiB/s rd, 66 KiB/s wr, 148 op/s
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: osdmap e184: 6 total, 6 up, 6 in
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: osdmap e185: 6 total, 6 up, 6 in
Feb 20 09:57:33 np0005625204.localdomain podman[318342]: 2026-02-20 09:57:33.155735509 +0000 UTC m=+0.085236398 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Feb 20 09:57:33 np0005625204.localdomain podman[318342]: 2026-02-20 09:57:33.203102032 +0000 UTC m=+0.132602931 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1770267347)
Feb 20 09:57:33 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2716620294' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:57:33 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2716620294' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:33.411 2 INFO neutron.agent.securitygroups_rpc [None req-203ffc30-16ac-4832-a92b-6d9503978c8f fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']
Feb 20 09:57:33 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:33.796 2 INFO neutron.agent.securitygroups_rpc [None req-203ffc30-16ac-4832-a92b-6d9503978c8f fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']
Feb 20 09:57:34 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e186 e186: 6 total, 6 up, 6 in
Feb 20 09:57:34 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2716620294' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:34 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2716620294' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:34.770 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:57:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:34.773 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:57:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:34.773 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:57:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:34.773 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:57:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:34.787 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:34.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:57:34 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:34.813 2 INFO neutron.agent.securitygroups_rpc [None req-3f70ffff-dbcf-4468-97f0-19b384f44318 fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']
Feb 20 09:57:34 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:34.854 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:35 np0005625204.localdomain ceph-mon[301857]: pgmap v372: 177 pgs: 177 active+clean; 195 MiB data, 976 MiB used, 41 GiB / 42 GiB avail; 102 KiB/s rd, 65 KiB/s wr, 146 op/s
Feb 20 09:57:35 np0005625204.localdomain ceph-mon[301857]: osdmap e186: 6 total, 6 up, 6 in
Feb 20 09:57:35 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:57:35.473 2 INFO neutron.agent.securitygroups_rpc [None req-16cacc97-a725-4413-b643-7033155fe483 fd5cbccf037047f8a8d2b9488223d9dc 92c7b74ddebf4a4a82ffeb6b9b8a9111 - - default default] Security group member updated ['23f09a95-adc3-4f59-96fa-bbbacd5ff83a']
Feb 20 09:57:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:37 np0005625204.localdomain ceph-mon[301857]: pgmap v374: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 10 KiB/s wr, 91 op/s
Feb 20 09:57:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3566039117' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3566039117' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 e187: 6 total, 6 up, 6 in
Feb 20 09:57:38 np0005625204.localdomain ceph-mon[301857]: pgmap v375: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 9.2 KiB/s wr, 83 op/s
Feb 20 09:57:38 np0005625204.localdomain ceph-mon[301857]: osdmap e187: 6 total, 6 up, 6 in
Feb 20 09:57:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:57:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:57:39 np0005625204.localdomain podman[318362]: 2026-02-20 09:57:39.151231088 +0000 UTC m=+0.083489024 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:57:39 np0005625204.localdomain systemd[1]: tmp-crun.dFdZw3.mount: Deactivated successfully.
Feb 20 09:57:39 np0005625204.localdomain podman[318363]: 2026-02-20 09:57:39.247882623 +0000 UTC m=+0.171305580 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 09:57:39 np0005625204.localdomain podman[318362]: 2026-02-20 09:57:39.255044312 +0000 UTC m=+0.187302208 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 20 09:57:39 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:57:39 np0005625204.localdomain podman[318363]: 2026-02-20 09:57:39.286262353 +0000 UTC m=+0.209685350 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:57:39 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:57:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:39.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:39.790 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:40 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:40.488 264355 INFO neutron.agent.linux.ip_lib [None req-7e3a4af5-26d0-4b97-ad9d-9cb19ae10ef5 - - - - - -] Device tapfc50cf28-29 cannot be used as it has no MAC address
Feb 20 09:57:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:40.550 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:40 np0005625204.localdomain kernel: device tapfc50cf28-29 entered promiscuous mode
Feb 20 09:57:40 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581460.5581] manager: (tapfc50cf28-29): new Generic device (/org/freedesktop/NetworkManager/Devices/52)
Feb 20 09:57:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:40.558 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:40Z|00329|binding|INFO|Claiming lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 for this chassis.
Feb 20 09:57:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:40Z|00330|binding|INFO|fc50cf28-29d3-47fd-a51c-af45ce6bd7a1: Claiming unknown
Feb 20 09:57:40 np0005625204.localdomain systemd-udevd[318416]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:57:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:40.575 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92c7b74ddebf4a4a82ffeb6b9b8a9111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c5062c6-b980-4e27-a02e-d7f042725453, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=fc50cf28-29d3-47fd-a51c-af45ce6bd7a1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:40.577 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 in datapath 530c1fe2-b1ac-42bf-9f43-13da698642f0 bound to our chassis
Feb 20 09:57:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:40.581 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 530c1fe2-b1ac-42bf-9f43-13da698642f0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:57:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:40.583 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[860e2e57-2086-4a8c-88e7-ab9e38c5cd58]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:40Z|00331|binding|INFO|Setting lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 ovn-installed in OVS
Feb 20 09:57:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:40Z|00332|binding|INFO|Setting lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 up in Southbound
Feb 20 09:57:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:40.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapfc50cf28-29: No such device
Feb 20 09:57:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:40.647 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:40.679 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:40 np0005625204.localdomain ceph-mon[301857]: pgmap v377: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 16 KiB/s wr, 76 op/s
Feb 20 09:57:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:41Z|00333|binding|INFO|Removing iface tapfc50cf28-29 ovn-installed in OVS
Feb 20 09:57:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:41.111 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 868e8685-06f4-43ca-b5fe-c6d154a2841f with type ""
Feb 20 09:57:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:41Z|00334|binding|INFO|Removing lport fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 ovn-installed in OVS
Feb 20 09:57:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:41.113 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-530c1fe2-b1ac-42bf-9f43-13da698642f0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '92c7b74ddebf4a4a82ffeb6b9b8a9111', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3c5062c6-b980-4e27-a02e-d7f042725453, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=fc50cf28-29d3-47fd-a51c-af45ce6bd7a1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:41.113 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:41.116 162652 INFO neutron.agent.ovn.metadata.agent [-] Port fc50cf28-29d3-47fd-a51c-af45ce6bd7a1 in datapath 530c1fe2-b1ac-42bf-9f43-13da698642f0 unbound from our chassis
Feb 20 09:57:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:41.118 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 530c1fe2-b1ac-42bf-9f43-13da698642f0 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:57:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:41.118 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:41 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:41.118 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1a291e-092b-44b1-8ce2-4cf585bcb271]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:41 np0005625204.localdomain podman[318487]: 
Feb 20 09:57:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:41Z|00335|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:57:41 np0005625204.localdomain podman[318487]: 2026-02-20 09:57:41.546219609 +0000 UTC m=+0.091463167 container create 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 09:57:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:41.560 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:41 np0005625204.localdomain systemd[1]: Started libpod-conmon-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460.scope.
Feb 20 09:57:41 np0005625204.localdomain podman[318487]: 2026-02-20 09:57:41.501511667 +0000 UTC m=+0.046755255 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:57:41 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:57:41 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a6c57f5625fab7ddc77f2282f6cf9f71a784ec5eef9d14cbb7d029b3c1e11247/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:57:41 np0005625204.localdomain podman[318487]: 2026-02-20 09:57:41.621188503 +0000 UTC m=+0.166432071 container init 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:57:41 np0005625204.localdomain podman[318487]: 2026-02-20 09:57:41.629874448 +0000 UTC m=+0.175118016 container start 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 09:57:41 np0005625204.localdomain dnsmasq[318505]: started, version 2.85 cachesize 150
Feb 20 09:57:41 np0005625204.localdomain dnsmasq[318505]: DNS service limited to local subnets
Feb 20 09:57:41 np0005625204.localdomain dnsmasq[318505]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:57:41 np0005625204.localdomain dnsmasq[318505]: warning: no upstream servers configured
Feb 20 09:57:41 np0005625204.localdomain dnsmasq-dhcp[318505]: DHCPv6, static leases only on 2001:db8:0:1::, lease time 1d
Feb 20 09:57:41 np0005625204.localdomain dnsmasq[318505]: read /var/lib/neutron/dhcp/530c1fe2-b1ac-42bf-9f43-13da698642f0/addn_hosts - 0 addresses
Feb 20 09:57:41 np0005625204.localdomain dnsmasq-dhcp[318505]: read /var/lib/neutron/dhcp/530c1fe2-b1ac-42bf-9f43-13da698642f0/host
Feb 20 09:57:41 np0005625204.localdomain dnsmasq-dhcp[318505]: read /var/lib/neutron/dhcp/530c1fe2-b1ac-42bf-9f43-13da698642f0/opts
Feb 20 09:57:41 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:41.741 264355 INFO neutron.agent.dhcp.agent [None req-35c9fc8a-e339-472e-affe-e03a51503b53 - - - - - -] DHCP configuration for ports {'dfe491bd-7fe5-4be1-94d7-71260e8282f7'} is completed
Feb 20 09:57:41 np0005625204.localdomain sudo[318520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:57:41 np0005625204.localdomain dnsmasq[318505]: exiting on receipt of SIGTERM
Feb 20 09:57:41 np0005625204.localdomain podman[318528]: 2026-02-20 09:57:41.875269625 +0000 UTC m=+0.060277488 container kill 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:57:41 np0005625204.localdomain sudo[318520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:57:41 np0005625204.localdomain systemd[1]: libpod-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460.scope: Deactivated successfully.
Feb 20 09:57:41 np0005625204.localdomain sudo[318520]: pam_unix(sudo:session): session closed for user root
Feb 20 09:57:41 np0005625204.localdomain podman[318554]: 2026-02-20 09:57:41.952143967 +0000 UTC m=+0.059314178 container died 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:57:41 np0005625204.localdomain sudo[318557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:57:41 np0005625204.localdomain sudo[318557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:57:42 np0005625204.localdomain podman[318554]: 2026-02-20 09:57:42.003677228 +0000 UTC m=+0.110847399 container cleanup 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:57:42 np0005625204.localdomain systemd[1]: libpod-conmon-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460.scope: Deactivated successfully.
Feb 20 09:57:42 np0005625204.localdomain podman[318556]: 2026-02-20 09:57:42.082819809 +0000 UTC m=+0.184564955 container remove 2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-530c1fe2-b1ac-42bf-9f43-13da698642f0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:57:42 np0005625204.localdomain kernel: device tapfc50cf28-29 left promiscuous mode
Feb 20 09:57:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:42.098 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:42.116 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:42.145 264355 INFO neutron.agent.dhcp.agent [None req-b5e18cee-fe6c-4955-acf8-2f35280e794b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:42.146 264355 INFO neutron.agent.dhcp.agent [None req-b5e18cee-fe6c-4955-acf8-2f35280e794b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-a6c57f5625fab7ddc77f2282f6cf9f71a784ec5eef9d14cbb7d029b3c1e11247-merged.mount: Deactivated successfully.
Feb 20 09:57:42 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ec3f80d98cef70cedec6f47e99068810d76c9ddef4a8d43783bd562add70460-userdata-shm.mount: Deactivated successfully.
Feb 20 09:57:42 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d530c1fe2\x2db1ac\x2d42bf\x2d9f43\x2d13da698642f0.mount: Deactivated successfully.
Feb 20 09:57:42 np0005625204.localdomain sudo[318557]: pam_unix(sudo:session): session closed for user root
Feb 20 09:57:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:42.786 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:57:42 np0005625204.localdomain ceph-mon[301857]: pgmap v378: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 14 KiB/s wr, 83 op/s
Feb 20 09:57:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:57:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:57:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:57:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:57:43 np0005625204.localdomain sudo[318635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:57:43 np0005625204.localdomain sudo[318635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:57:43 np0005625204.localdomain sudo[318635]: pam_unix(sudo:session): session closed for user root
Feb 20 09:57:44 np0005625204.localdomain ceph-mon[301857]: pgmap v379: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 12 KiB/s wr, 71 op/s
Feb 20 09:57:44 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:57:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:44.829 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:46 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:57:46 np0005625204.localdomain podman[318653]: 2026-02-20 09:57:46.146177409 +0000 UTC m=+0.083691880 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 20 09:57:46 np0005625204.localdomain podman[318653]: 2026-02-20 09:57:46.157974279 +0000 UTC m=+0.095488730 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:57:46 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:57:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:46 np0005625204.localdomain ceph-mon[301857]: pgmap v380: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 5.7 KiB/s wr, 16 op/s
Feb 20 09:57:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:57:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:57:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:57:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:57:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:57:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18342 "" "Go-http-client/1.1"
Feb 20 09:57:48 np0005625204.localdomain ceph-mon[301857]: pgmap v381: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 5.7 KiB/s wr, 16 op/s
Feb 20 09:57:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:49.830 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:49.834 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:50 np0005625204.localdomain ceph-mon[301857]: pgmap v382: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 4.9 KiB/s wr, 14 op/s
Feb 20 09:57:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:51 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "92707d6b-2313-43a9-b4cc-78c6dfb385e9", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:57:51 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "92707d6b-2313-43a9-b4cc-78c6dfb385e9", "format": "json"}]: dispatch
Feb 20 09:57:51 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:57:52 np0005625204.localdomain ceph-mon[301857]: pgmap v383: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 255 B/s wr, 13 op/s
Feb 20 09:57:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e188 e188: 6 total, 6 up, 6 in
Feb 20 09:57:53 np0005625204.localdomain sshd[318673]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:57:53 np0005625204.localdomain ceph-mon[301857]: osdmap e188: 6 total, 6 up, 6 in
Feb 20 09:57:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:54.836 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:57:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:54.839 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:57:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:54.839 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:57:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:54.839 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:57:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:54.872 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:54 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:54.873 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:57:54 np0005625204.localdomain sshd[318673]: Invalid user admin from 103.191.14.210 port 36526
Feb 20 09:57:55 np0005625204.localdomain ceph-mon[301857]: pgmap v385: 177 pgs: 177 active+clean; 195 MiB data, 961 MiB used, 41 GiB / 42 GiB avail
Feb 20 09:57:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ad9afd43-9e16-42b2-8035-3a7a6ffc2a82", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:57:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:57:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e189 e189: 6 total, 6 up, 6 in
Feb 20 09:57:55 np0005625204.localdomain sshd[318673]: Received disconnect from 103.191.14.210 port 36526:11: Bye Bye [preauth]
Feb 20 09:57:55 np0005625204.localdomain sshd[318673]: Disconnected from invalid user admin 103.191.14.210 port 36526 [preauth]
Feb 20 09:57:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:57:55 np0005625204.localdomain podman[318675]: 2026-02-20 09:57:55.337519401 +0000 UTC m=+0.091341763 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:57:55 np0005625204.localdomain podman[318675]: 2026-02-20 09:57:55.352353654 +0000 UTC m=+0.106176056 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:57:55 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ad9afd43-9e16-42b2-8035-3a7a6ffc2a82", "format": "json"}]: dispatch
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: osdmap e189: 6 total, 6 up, 6 in
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:57:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:57:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:57:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:57:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:57:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:57:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3816358087' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:57:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3816358087' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:56.803 264355 INFO neutron.agent.linux.ip_lib [None req-e4a8c918-57fc-4c1c-9320-241fb609b547 - - - - - -] Device tapf3db41a2-50 cannot be used as it has no MAC address
Feb 20 09:57:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:56.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:56 np0005625204.localdomain kernel: device tapf3db41a2-50 entered promiscuous mode
Feb 20 09:57:56 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581476.8388] manager: (tapf3db41a2-50): new Generic device (/org/freedesktop/NetworkManager/Devices/53)
Feb 20 09:57:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:56Z|00336|binding|INFO|Claiming lport f3db41a2-50e3-4148-b9ec-2d158c7c524f for this chassis.
Feb 20 09:57:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:56Z|00337|binding|INFO|f3db41a2-50e3-4148-b9ec-2d158c7c524f: Claiming unknown
Feb 20 09:57:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:56.842 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:56 np0005625204.localdomain systemd-udevd[318709]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:57:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:56.851 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d774fdc5-de30-406f-8a1c-54d7b2f97bb3, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f3db41a2-50e3-4148-b9ec-2d158c7c524f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:56.853 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f3db41a2-50e3-4148-b9ec-2d158c7c524f in datapath 4bb090fa-9f17-4be7-ae34-cbbdab18a773 bound to our chassis
Feb 20 09:57:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:56.855 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4bb090fa-9f17-4be7-ae34-cbbdab18a773 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:57:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:56.856 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c831b79e-9a22-4427-877f-21d9db4c8d80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:56Z|00338|binding|INFO|Setting lport f3db41a2-50e3-4148-b9ec-2d158c7c524f ovn-installed in OVS
Feb 20 09:57:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:56Z|00339|binding|INFO|Setting lport f3db41a2-50e3-4148-b9ec-2d158c7c524f up in Southbound
Feb 20 09:57:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:56.881 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapf3db41a2-50: No such device
Feb 20 09:57:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:56.924 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:56.955 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ef5178d4-d0bc-4d31-ba72-f530bb14424f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:57:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ef5178d4-d0bc-4d31-ba72-f530bb14424f", "format": "json"}]: dispatch
Feb 20 09:57:57 np0005625204.localdomain ceph-mon[301857]: pgmap v387: 177 pgs: 177 active+clean; 195 MiB data, 962 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 17 KiB/s wr, 25 op/s
Feb 20 09:57:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3816358087' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:57:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3816358087' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:57:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1119748421' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:57:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:57.327 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:57.328 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:57.329 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:57:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:57.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:57 np0005625204.localdomain podman[318776]: 
Feb 20 09:57:57 np0005625204.localdomain podman[318776]: 2026-02-20 09:57:57.900290414 +0000 UTC m=+0.097440060 container create 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:57:57 np0005625204.localdomain systemd[1]: Started libpod-conmon-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7.scope.
Feb 20 09:57:57 np0005625204.localdomain podman[318776]: 2026-02-20 09:57:57.855068477 +0000 UTC m=+0.052218183 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:57:57 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:57:57 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8d2a4d44a48183477bfe77e2460f5a0a928460e76df8182294ed0bcb1b4a2e9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:57:57 np0005625204.localdomain podman[318776]: 2026-02-20 09:57:57.982981604 +0000 UTC m=+0.180131260 container init 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:57:57 np0005625204.localdomain podman[318776]: 2026-02-20 09:57:57.993325199 +0000 UTC m=+0.190474865 container start 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:57:58 np0005625204.localdomain dnsmasq[318794]: started, version 2.85 cachesize 150
Feb 20 09:57:58 np0005625204.localdomain dnsmasq[318794]: DNS service limited to local subnets
Feb 20 09:57:58 np0005625204.localdomain dnsmasq[318794]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:57:58 np0005625204.localdomain dnsmasq[318794]: warning: no upstream servers configured
Feb 20 09:57:58 np0005625204.localdomain dnsmasq-dhcp[318794]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:57:58 np0005625204.localdomain dnsmasq[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/addn_hosts - 0 addresses
Feb 20 09:57:58 np0005625204.localdomain dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/host
Feb 20 09:57:58 np0005625204.localdomain dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/opts
Feb 20 09:57:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2600015293' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:57:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:58Z|00340|binding|INFO|Removing iface tapf3db41a2-50 ovn-installed in OVS
Feb 20 09:57:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:58.127 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port bc145312-347e-4752-96cb-e00a86df8e63 with type ""
Feb 20 09:57:58 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:58Z|00341|binding|INFO|Removing lport f3db41a2-50e3-4148-b9ec-2d158c7c524f ovn-installed in OVS
Feb 20 09:57:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:58.129 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4bb090fa-9f17-4be7-ae34-cbbdab18a773', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d774fdc5-de30-406f-8a1c-54d7b2f97bb3, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=f3db41a2-50e3-4148-b9ec-2d158c7c524f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:57:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:58.129 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:58.132 162652 INFO neutron.agent.ovn.metadata.agent [-] Port f3db41a2-50e3-4148-b9ec-2d158c7c524f in datapath 4bb090fa-9f17-4be7-ae34-cbbdab18a773 unbound from our chassis
Feb 20 09:57:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:58.133 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:58.134 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4bb090fa-9f17-4be7-ae34-cbbdab18a773, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:57:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:57:58.135 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4bb646-b22c-41b0-8f29-b3abe5ae2b14]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:57:58 np0005625204.localdomain kernel: device tapf3db41a2-50 left promiscuous mode
Feb 20 09:57:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:58.149 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.198 264355 INFO neutron.agent.dhcp.agent [None req-ce1ef91b-fd12-444e-a3b8-5d109028a4ed - - - - - -] DHCP configuration for ports {'9bf53a91-5c17-4ed0-911d-ea82253cfbbe'} is completed
Feb 20 09:57:58 np0005625204.localdomain dnsmasq[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/addn_hosts - 0 addresses
Feb 20 09:57:58 np0005625204.localdomain dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/host
Feb 20 09:57:58 np0005625204.localdomain podman[318814]: 2026-02-20 09:57:58.721264598 +0000 UTC m=+0.051894992 container kill 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:57:58 np0005625204.localdomain dnsmasq-dhcp[318794]: read /var/lib/neutron/dhcp/4bb090fa-9f17-4be7-ae34-cbbdab18a773/opts
Feb 20 09:57:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent [None req-31c6e7db-28d2-424f-ab15-dd0d4deffca8 - - - - - -] Unable to reload_allocations dhcp for 4bb090fa-9f17-4be7-ae34-cbbdab18a773.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf3db41a2-50 not found in namespace qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773.
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapf3db41a2-50 not found in namespace qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773.
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.745 264355 ERROR neutron.agent.dhcp.agent 
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.748 264355 INFO neutron.agent.dhcp.agent [None req-c2bf4659-23a7-4d94-a36c-8a2e456b8aa4 - - - - - -] Synchronizing state
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.872 264355 INFO neutron.agent.dhcp.agent [None req-4e01f771-80af-4303-ac93-faffa05069c8 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.874 264355 INFO neutron.agent.dhcp.agent [-] Starting network 4bb090fa-9f17-4be7-ae34-cbbdab18a773 dhcp configuration
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.874 264355 INFO neutron.agent.dhcp.agent [-] Finished network 4bb090fa-9f17-4be7-ae34-cbbdab18a773 dhcp configuration
Feb 20 09:57:58 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:57:58.875 264355 INFO neutron.agent.dhcp.agent [None req-4e01f771-80af-4303-ac93-faffa05069c8 - - - - - -] Synchronizing state complete
Feb 20 09:57:59 np0005625204.localdomain ceph-mon[301857]: pgmap v388: 177 pgs: 177 active+clean; 195 MiB data, 962 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 17 KiB/s wr, 25 op/s
Feb 20 09:57:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ad9afd43-9e16-42b2-8035-3a7a6ffc2a82", "format": "json"}]: dispatch
Feb 20 09:57:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ad9afd43-9e16-42b2-8035-3a7a6ffc2a82", "force": true, "format": "json"}]: dispatch
Feb 20 09:57:59 np0005625204.localdomain dnsmasq[318794]: exiting on receipt of SIGTERM
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: tmp-crun.6KX25h.mount: Deactivated successfully.
Feb 20 09:57:59 np0005625204.localdomain podman[318842]: 2026-02-20 09:57:59.174337453 +0000 UTC m=+0.069298633 container kill 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: libpod-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7.scope: Deactivated successfully.
Feb 20 09:57:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:57:59Z|00342|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.252 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:59 np0005625204.localdomain podman[318858]: 2026-02-20 09:57:59.294198875 +0000 UTC m=+0.093366387 container died 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: tmp-crun.Wa8jan.mount: Deactivated successfully.
Feb 20 09:57:59 np0005625204.localdomain podman[318858]: 2026-02-20 09:57:59.391568882 +0000 UTC m=+0.190736374 container remove 36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4bb090fa-9f17-4be7-ae34-cbbdab18a773, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: libpod-conmon-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7.scope: Deactivated successfully.
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.754 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.755 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.756 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.756 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.756 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:57:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:57:59.873 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-8d2a4d44a48183477bfe77e2460f5a0a928460e76df8182294ed0bcb1b4a2e9a-merged.mount: Deactivated successfully.
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36dab77d633499f7259effc4dcbb23beadf1a7469f8a20d43d0557721385a5d7-userdata-shm.mount: Deactivated successfully.
Feb 20 09:57:59 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d4bb090fa\x2d9f17\x2d4be7\x2dae34\x2dcbbdab18a773.mount: Deactivated successfully.
Feb 20 09:58:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:58:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4041107687' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.268 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.364 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.365 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.589 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.590 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11297MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.591 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.591 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.669 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.670 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.670 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:58:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:00.703 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:58:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:58:00 np0005625204.localdomain podman[318906]: 2026-02-20 09:58:00.887708346 +0000 UTC m=+0.101601547 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:58:00 np0005625204.localdomain ceph-mon[301857]: pgmap v389: 177 pgs: 177 active+clean; 195 MiB data, 962 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 48 KiB/s wr, 27 op/s
Feb 20 09:58:00 np0005625204.localdomain podman[318906]: 2026-02-20 09:58:00.921937639 +0000 UTC m=+0.135830870 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 09:58:00 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:58:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:58:01 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2856583038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:01.208 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:58:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:01.216 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:58:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:01.233 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:58:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:01.235 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:58:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:01.236 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:58:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4041107687' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2856583038' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:01 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:02.236 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:02.237 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:02.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e190 e190: 6 total, 6 up, 6 in
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ff9e0809-0706-432a-8bd4-0d7cee6634c8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: pgmap v390: 177 pgs: 177 active+clean; 195 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 46 KiB/s wr, 46 op/s
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ff9e0809-0706-432a-8bd4-0d7cee6634c8", "format": "json"}]: dispatch
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ef5178d4-d0bc-4d31-ba72-f530bb14424f", "format": "json"}]: dispatch
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ef5178d4-d0bc-4d31-ba72-f530bb14424f", "force": true, "format": "json"}]: dispatch
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4091018490' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4091018490' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:02 np0005625204.localdomain ceph-mon[301857]: osdmap e190: 6 total, 6 up, 6 in
Feb 20 09:58:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:58:03 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1636974104' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:58:03 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1636974104' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:03 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:58:03 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:03.612 264355 INFO neutron.agent.linux.ip_lib [None req-ddb54f48-b62f-4e06-907e-8322adf99f73 - - - - - -] Device tapb3eab012-76 cannot be used as it has no MAC address
Feb 20 09:58:03 np0005625204.localdomain podman[318951]: 2026-02-20 09:58:03.632912418 +0000 UTC m=+0.081976270 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.642 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:03 np0005625204.localdomain kernel: device tapb3eab012-76 entered promiscuous mode
Feb 20 09:58:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:03Z|00343|binding|INFO|Claiming lport b3eab012-7666-4655-9b51-e4c7e9621497 for this chassis.
Feb 20 09:58:03 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581483.6526] manager: (tapb3eab012-76): new Generic device (/org/freedesktop/NetworkManager/Devices/54)
Feb 20 09:58:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:03Z|00344|binding|INFO|b3eab012-7666-4655-9b51-e4c7e9621497: Claiming unknown
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.652 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:03 np0005625204.localdomain systemd-udevd[318978]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:03.663 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0e47cd34a784cbb89cbe56eafed5650', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f70c52-a5b6-4d06-85ef-b3fdc5aa9c4b, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=b3eab012-7666-4655-9b51-e4c7e9621497) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:03.664 162652 INFO neutron.agent.ovn.metadata.agent [-] Port b3eab012-7666-4655-9b51-e4c7e9621497 in datapath 93dbe9b6-4551-4902-9476-0f2070facdb5 bound to our chassis
Feb 20 09:58:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:03.665 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 93dbe9b6-4551-4902-9476-0f2070facdb5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:58:03 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:03.666 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf34a15-5d34-4b33-9bc6-925eae68fa2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:03 np0005625204.localdomain podman[318951]: 2026-02-20 09:58:03.674668219 +0000 UTC m=+0.123732061 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1770267347, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container)
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:03Z|00345|binding|INFO|Setting lport b3eab012-7666-4655-9b51-e4c7e9621497 ovn-installed in OVS
Feb 20 09:58:03 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:03Z|00346|binding|INFO|Setting lport b3eab012-7666-4655-9b51-e4c7e9621497 up in Southbound
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.692 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapb3eab012-76: No such device
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.737 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.738 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:58:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:03.758 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1636974104' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1636974104' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:04 np0005625204.localdomain podman[319049]: 2026-02-20 09:58:04.774155789 +0000 UTC m=+0.097156612 container create fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:04 np0005625204.localdomain systemd[1]: Started libpod-conmon-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb.scope.
Feb 20 09:58:04 np0005625204.localdomain podman[319049]: 2026-02-20 09:58:04.732013855 +0000 UTC m=+0.055014718 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:04 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:04 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6954e2904a7005976441dfd44549da912216319c18455669f4ef3b3e79ca01de/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:04 np0005625204.localdomain podman[319049]: 2026-02-20 09:58:04.859127838 +0000 UTC m=+0.182128661 container init fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 09:58:04 np0005625204.localdomain podman[319049]: 2026-02-20 09:58:04.870349639 +0000 UTC m=+0.193350462 container start fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:04 np0005625204.localdomain dnsmasq[319067]: started, version 2.85 cachesize 150
Feb 20 09:58:04 np0005625204.localdomain dnsmasq[319067]: DNS service limited to local subnets
Feb 20 09:58:04 np0005625204.localdomain dnsmasq[319067]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:04 np0005625204.localdomain dnsmasq[319067]: warning: no upstream servers configured
Feb 20 09:58:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:04.878 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:04 np0005625204.localdomain dnsmasq-dhcp[319067]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:04 np0005625204.localdomain dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 0 addresses
Feb 20 09:58:04 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host
Feb 20 09:58:04 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts
Feb 20 09:58:04 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:04.991 264355 INFO neutron.agent.dhcp.agent [None req-603e188f-4a62-4308-8e61-a2be321918fc - - - - - -] DHCP configuration for ports {'e9546fa0-e2a1-4388-b155-54ab8a4b6a66'} is completed
Feb 20 09:58:05 np0005625204.localdomain ceph-mon[301857]: pgmap v392: 177 pgs: 177 active+clean; 195 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 30 KiB/s wr, 22 op/s
Feb 20 09:58:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3388708526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3083830693' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3083830693' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3015932775' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:06.020 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:58:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:58:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:58:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.724 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.882 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.883 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.884 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:58:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:06.884 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:58:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ff9e0809-0706-432a-8bd4-0d7cee6634c8", "format": "json"}]: dispatch
Feb 20 09:58:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ff9e0809-0706-432a-8bd4-0d7cee6634c8", "force": true, "format": "json"}]: dispatch
Feb 20 09:58:07 np0005625204.localdomain ceph-mon[301857]: pgmap v393: 177 pgs: 177 active+clean; 195 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 43 KiB/s wr, 55 op/s
Feb 20 09:58:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:07.331 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:58:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:07.920 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:58:07Z, description=, device_id=7557495a-ace7-48ee-949b-56260afaa059, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5952700>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5952df0>], id=9a0d4ed4-2522-49a8-9226-4337c377056e, ip_allocation=immediate, mac_address=fa:16:3e:40:7f:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:02Z, description=, dns_domain=, id=93dbe9b6-4551-4902-9476-0f2070facdb5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-933360853-network, port_security_enabled=True, project_id=d0e47cd34a784cbb89cbe56eafed5650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16329, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2853, status=ACTIVE, subnets=['be3c004e-54c2-476b-98f9-a1eb38b39ea4'], tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:02Z, vlan_transparent=None, network_id=93dbe9b6-4551-4902-9476-0f2070facdb5, port_security_enabled=False, project_id=d0e47cd34a784cbb89cbe56eafed5650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2880, status=DOWN, tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:07Z on network 93dbe9b6-4551-4902-9476-0f2070facdb5
Feb 20 09:58:07 np0005625204.localdomain sshd[319068]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:07.948 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:58:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:07.966 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:58:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:07.967 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:58:08 np0005625204.localdomain podman[319087]: 2026-02-20 09:58:08.149501629 +0000 UTC m=+0.068952372 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 20 09:58:08 np0005625204.localdomain dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 1 addresses
Feb 20 09:58:08 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host
Feb 20 09:58:08 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts
Feb 20 09:58:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:08.463 264355 INFO neutron.agent.dhcp.agent [None req-7e34d078-2605-4fbc-acb7-bc6902701015 - - - - - -] DHCP configuration for ports {'9a0d4ed4-2522-49a8-9226-4337c377056e'} is completed
Feb 20 09:58:08 np0005625204.localdomain ceph-mon[301857]: pgmap v394: 177 pgs: 177 active+clean; 195 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 43 KiB/s wr, 55 op/s
Feb 20 09:58:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1749782927' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1749782927' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:09 np0005625204.localdomain sshd[319068]: Invalid user admin from 203.228.30.198 port 38770
Feb 20 09:58:09 np0005625204.localdomain sshd[319068]: Received disconnect from 203.228.30.198 port 38770:11: Bye Bye [preauth]
Feb 20 09:58:09 np0005625204.localdomain sshd[319068]: Disconnected from invalid user admin 203.228.30.198 port 38770 [preauth]
Feb 20 09:58:09 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:09.394 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:58:07Z, description=, device_id=7557495a-ace7-48ee-949b-56260afaa059, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59e3ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59d9be0>], id=9a0d4ed4-2522-49a8-9226-4337c377056e, ip_allocation=immediate, mac_address=fa:16:3e:40:7f:8f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:02Z, description=, dns_domain=, id=93dbe9b6-4551-4902-9476-0f2070facdb5, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-933360853-network, port_security_enabled=True, project_id=d0e47cd34a784cbb89cbe56eafed5650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16329, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2853, status=ACTIVE, subnets=['be3c004e-54c2-476b-98f9-a1eb38b39ea4'], tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:02Z, vlan_transparent=None, network_id=93dbe9b6-4551-4902-9476-0f2070facdb5, port_security_enabled=False, project_id=d0e47cd34a784cbb89cbe56eafed5650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2880, status=DOWN, tags=[], tenant_id=d0e47cd34a784cbb89cbe56eafed5650, updated_at=2026-02-20T09:58:07Z on network 93dbe9b6-4551-4902-9476-0f2070facdb5
Feb 20 09:58:09 np0005625204.localdomain dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 1 addresses
Feb 20 09:58:09 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host
Feb 20 09:58:09 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts
Feb 20 09:58:09 np0005625204.localdomain podman[319125]: 2026-02-20 09:58:09.598493927 +0000 UTC m=+0.069792698 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:58:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:58:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:58:09 np0005625204.localdomain podman[319140]: 2026-02-20 09:58:09.746338242 +0000 UTC m=+0.106061423 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 09:58:09 np0005625204.localdomain podman[319139]: 2026-02-20 09:58:09.718804663 +0000 UTC m=+0.084033572 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:58:09 np0005625204.localdomain podman[319140]: 2026-02-20 09:58:09.779161422 +0000 UTC m=+0.138884553 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 20 09:58:09 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:58:09 np0005625204.localdomain podman[319139]: 2026-02-20 09:58:09.802046159 +0000 UTC m=+0.167275038 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 20 09:58:09 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:58:09 np0005625204.localdomain sshd[319190]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:09.879 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:09 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:09.911 264355 INFO neutron.agent.dhcp.agent [None req-b32da0f6-1830-419d-b62f-3009676400b2 - - - - - -] DHCP configuration for ports {'9a0d4ed4-2522-49a8-9226-4337c377056e'} is completed
Feb 20 09:58:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "134e598c-37ed-480d-a639-35f631513b30", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "134e598c-37ed-480d-a639-35f631513b30", "format": "json"}]: dispatch
Feb 20 09:58:10 np0005625204.localdomain sshd[319190]: Invalid user claude from 57.128.218.144 port 38738
Feb 20 09:58:10 np0005625204.localdomain sshd[319190]: Received disconnect from 57.128.218.144 port 38738:11: Bye Bye [preauth]
Feb 20 09:58:10 np0005625204.localdomain sshd[319190]: Disconnected from invalid user claude 57.128.218.144 port 38738 [preauth]
Feb 20 09:58:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e191 e191: 6 total, 6 up, 6 in
Feb 20 09:58:10 np0005625204.localdomain ceph-mon[301857]: pgmap v395: 177 pgs: 177 active+clean; 195 MiB data, 963 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 42 KiB/s wr, 55 op/s
Feb 20 09:58:11 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:11.128 264355 INFO neutron.agent.linux.ip_lib [None req-d0d0c787-ea20-4cde-846a-790e04278dc7 - - - - - -] Device tapedf7cdb1-89 cannot be used as it has no MAC address
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.150 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain kernel: device tapedf7cdb1-89 entered promiscuous mode
Feb 20 09:58:11 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581491.1594] manager: (tapedf7cdb1-89): new Generic device (/org/freedesktop/NetworkManager/Devices/55)
Feb 20 09:58:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:11Z|00347|binding|INFO|Claiming lport edf7cdb1-8914-4d5f-9318-a56053278a5b for this chassis.
Feb 20 09:58:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:11Z|00348|binding|INFO|edf7cdb1-8914-4d5f-9318-a56053278a5b: Claiming unknown
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.160 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain systemd-udevd[319202]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:11.173 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=188c8087-bb75-458a-8eed-59a2b56d79c4, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=edf7cdb1-8914-4d5f-9318-a56053278a5b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:11Z|00349|binding|INFO|Setting lport edf7cdb1-8914-4d5f-9318-a56053278a5b ovn-installed in OVS
Feb 20 09:58:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:11Z|00350|binding|INFO|Setting lport edf7cdb1-8914-4d5f-9318-a56053278a5b up in Southbound
Feb 20 09:58:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:11.177 162652 INFO neutron.agent.ovn.metadata.agent [-] Port edf7cdb1-8914-4d5f-9318-a56053278a5b in datapath 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 bound to our chassis
Feb 20 09:58:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:11.179 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.175 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.176 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:11.180 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9a89e0a3-29bc-47bc-b119-f02c69a0596a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.212 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapedf7cdb1-89: No such device
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.255 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:11.287 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:11 np0005625204.localdomain sshd[319244]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:11 np0005625204.localdomain sshd[319244]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:58:11 np0005625204.localdomain ceph-mon[301857]: osdmap e191: 6 total, 6 up, 6 in
Feb 20 09:58:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e192 e192: 6 total, 6 up, 6 in
Feb 20 09:58:12 np0005625204.localdomain podman[319275]: 
Feb 20 09:58:12 np0005625204.localdomain podman[319275]: 2026-02-20 09:58:12.219567946 +0000 UTC m=+0.094041327 container create 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 20 09:58:12 np0005625204.localdomain systemd[1]: Started libpod-conmon-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d.scope.
Feb 20 09:58:12 np0005625204.localdomain systemd[1]: tmp-crun.qQzq6n.mount: Deactivated successfully.
Feb 20 09:58:12 np0005625204.localdomain podman[319275]: 2026-02-20 09:58:12.175211875 +0000 UTC m=+0.049685256 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:12 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:12 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48b49eced9f4fc65c05c8c092b97ef8318ea17ed43fd0245e646a30e3aa60417/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:12 np0005625204.localdomain podman[319275]: 2026-02-20 09:58:12.307701251 +0000 UTC m=+0.182174642 container init 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:12 np0005625204.localdomain podman[319275]: 2026-02-20 09:58:12.31848642 +0000 UTC m=+0.192959801 container start 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:58:12 np0005625204.localdomain dnsmasq[319293]: started, version 2.85 cachesize 150
Feb 20 09:58:12 np0005625204.localdomain dnsmasq[319293]: DNS service limited to local subnets
Feb 20 09:58:12 np0005625204.localdomain dnsmasq[319293]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:12 np0005625204.localdomain dnsmasq[319293]: warning: no upstream servers configured
Feb 20 09:58:12 np0005625204.localdomain dnsmasq-dhcp[319293]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:12 np0005625204.localdomain dnsmasq[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/addn_hosts - 0 addresses
Feb 20 09:58:12 np0005625204.localdomain dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/host
Feb 20 09:58:12 np0005625204.localdomain dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/opts
Feb 20 09:58:12 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:12.473 264355 INFO neutron.agent.dhcp.agent [None req-cebf620c-6ca4-460d-b7d9-780231987c77 - - - - - -] DHCP configuration for ports {'d725344c-9c12-4f8f-b2ef-5d64cff6fbc5'} is completed
Feb 20 09:58:12 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:12.562 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port c3f233a6-25d7-4255-a243-46db9de6fdef with type ""
Feb 20 09:58:12 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:12Z|00351|binding|INFO|Removing iface tapedf7cdb1-89 ovn-installed in OVS
Feb 20 09:58:12 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:12Z|00352|binding|INFO|Removing lport edf7cdb1-8914-4d5f-9318-a56053278a5b ovn-installed in OVS
Feb 20 09:58:12 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:12.564 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-80ba55c6-a1c6-4e76-942b-f70ca34ddad5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=188c8087-bb75-458a-8eed-59a2b56d79c4, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=edf7cdb1-8914-4d5f-9318-a56053278a5b) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:12.564 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:12 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:12.567 162652 INFO neutron.agent.ovn.metadata.agent [-] Port edf7cdb1-8914-4d5f-9318-a56053278a5b in datapath 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 unbound from our chassis
Feb 20 09:58:12 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:12.570 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:12.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:12 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:12.571 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3dd196d7-69d1-4af7-8541-6ff0d73bd0e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:12.574 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:12 np0005625204.localdomain kernel: device tapedf7cdb1-89 left promiscuous mode
Feb 20 09:58:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:12.593 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:13 np0005625204.localdomain ceph-mon[301857]: pgmap v397: 177 pgs: 177 active+clean; 195 MiB data, 964 MiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 50 KiB/s wr, 67 op/s
Feb 20 09:58:13 np0005625204.localdomain ceph-mon[301857]: osdmap e192: 6 total, 6 up, 6 in
Feb 20 09:58:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "134e598c-37ed-480d-a639-35f631513b30", "format": "json"}]: dispatch
Feb 20 09:58:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "134e598c-37ed-480d-a639-35f631513b30", "force": true, "format": "json"}]: dispatch
Feb 20 09:58:13 np0005625204.localdomain dnsmasq[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/addn_hosts - 0 addresses
Feb 20 09:58:13 np0005625204.localdomain dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/host
Feb 20 09:58:13 np0005625204.localdomain dnsmasq-dhcp[319293]: read /var/lib/neutron/dhcp/80ba55c6-a1c6-4e76-942b-f70ca34ddad5/opts
Feb 20 09:58:13 np0005625204.localdomain podman[319313]: 2026-02-20 09:58:13.120309889 +0000 UTC m=+0.057951466 container kill 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent [None req-f8bf157c-3c39-41ba-9b81-8c81d1c0304b - - - - - -] Unable to reload_allocations dhcp for 80ba55c6-a1c6-4e76-942b-f70ca34ddad5.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapedf7cdb1-89 not found in namespace qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5.
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tapedf7cdb1-89 not found in namespace qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5.
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.149 264355 ERROR neutron.agent.dhcp.agent 
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.152 264355 INFO neutron.agent.dhcp.agent [None req-4e01f771-80af-4303-ac93-faffa05069c8 - - - - - -] Synchronizing state
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.285 264355 INFO neutron.agent.dhcp.agent [None req-18b14ea9-72f3-48ab-a5d6-7e417394bb56 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.286 264355 INFO neutron.agent.dhcp.agent [-] Starting network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 dhcp configuration
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.286 264355 INFO neutron.agent.dhcp.agent [-] Finished network 80ba55c6-a1c6-4e76-942b-f70ca34ddad5 dhcp configuration
Feb 20 09:58:13 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:13.287 264355 INFO neutron.agent.dhcp.agent [None req-18b14ea9-72f3-48ab-a5d6-7e417394bb56 - - - - - -] Synchronizing state complete
Feb 20 09:58:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:13Z|00353|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:58:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:13.429 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:13 np0005625204.localdomain podman[319343]: 2026-02-20 09:58:13.531022924 +0000 UTC m=+0.062960000 container kill 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:13 np0005625204.localdomain dnsmasq[319293]: exiting on receipt of SIGTERM
Feb 20 09:58:13 np0005625204.localdomain systemd[1]: libpod-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d.scope: Deactivated successfully.
Feb 20 09:58:13 np0005625204.localdomain podman[319356]: 2026-02-20 09:58:13.586263107 +0000 UTC m=+0.044299371 container died 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:13 np0005625204.localdomain podman[319356]: 2026-02-20 09:58:13.620581062 +0000 UTC m=+0.078617296 container cleanup 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 20 09:58:13 np0005625204.localdomain systemd[1]: libpod-conmon-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d.scope: Deactivated successfully.
Feb 20 09:58:13 np0005625204.localdomain podman[319358]: 2026-02-20 09:58:13.634398963 +0000 UTC m=+0.081744451 container remove 53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-80ba55c6-a1c6-4e76-942b-f70ca34ddad5, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:58:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e193 e193: 6 total, 6 up, 6 in
Feb 20 09:58:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-48b49eced9f4fc65c05c8c092b97ef8318ea17ed43fd0245e646a30e3aa60417-merged.mount: Deactivated successfully.
Feb 20 09:58:14 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53f4026e743a8b1e650ecc51232341bc667169455b568dfb52ba12f6bd96b52d-userdata-shm.mount: Deactivated successfully.
Feb 20 09:58:14 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d80ba55c6\x2da1c6\x2d4e76\x2d942b\x2df70ca34ddad5.mount: Deactivated successfully.
Feb 20 09:58:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:14.911 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:15 np0005625204.localdomain ceph-mon[301857]: pgmap v399: 177 pgs: 177 active+clean; 195 MiB data, 964 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 32 KiB/s wr, 27 op/s
Feb 20 09:58:15 np0005625204.localdomain ceph-mon[301857]: osdmap e193: 6 total, 6 up, 6 in
Feb 20 09:58:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e194 e194: 6 total, 6 up, 6 in
Feb 20 09:58:15 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: osdmap e194: 6 total, 6 up, 6 in
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f29b2b87-a99b-49d8-b340-b09eb6239fb8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f29b2b87-a99b-49d8-b340-b09eb6239fb8", "format": "json"}]: dispatch
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4248636922' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/4248636922' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:16.816 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:17 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:58:17 np0005625204.localdomain ceph-mon[301857]: pgmap v402: 177 pgs: 177 active+clean; 196 MiB data, 968 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 29 KiB/s wr, 88 op/s
Feb 20 09:58:17 np0005625204.localdomain podman[319385]: 2026-02-20 09:58:17.1481336 +0000 UTC m=+0.085746904 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 20 09:58:17 np0005625204.localdomain podman[319385]: 2026-02-20 09:58:17.16422185 +0000 UTC m=+0.101835134 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:58:17 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:58:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:58:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:58:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:58:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 20 09:58:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:58:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18823 "" "Go-http-client/1.1"
Feb 20 09:58:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e195 e195: 6 total, 6 up, 6 in
Feb 20 09:58:18 np0005625204.localdomain sshd[319404]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:18 np0005625204.localdomain ceph-mon[301857]: pgmap v403: 177 pgs: 177 active+clean; 196 MiB data, 968 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 22 KiB/s wr, 67 op/s
Feb 20 09:58:18 np0005625204.localdomain ceph-mon[301857]: osdmap e195: 6 total, 6 up, 6 in
Feb 20 09:58:18 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/11949244' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:18 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/11949244' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:19 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:19.164 264355 INFO neutron.agent.linux.ip_lib [None req-64405f07-8e63-45d1-8b5e-a0d184eac1fb - - - - - -] Device tap53d7e2be-a1 cannot be used as it has no MAC address
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.220 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain kernel: device tap53d7e2be-a1 entered promiscuous mode
Feb 20 09:58:19 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581499.2331] manager: (tap53d7e2be-a1): new Generic device (/org/freedesktop/NetworkManager/Devices/56)
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.235 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:19Z|00354|binding|INFO|Claiming lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 for this chassis.
Feb 20 09:58:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:19Z|00355|binding|INFO|53d7e2be-a1e5-44d8-ac61-a6537aee6bf9: Claiming unknown
Feb 20 09:58:19 np0005625204.localdomain systemd-udevd[319416]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:19.249 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a0911e4-b523-4ec2-a3e4-c2816971929f, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=53d7e2be-a1e5-44d8-ac61-a6537aee6bf9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:19.251 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 in datapath 375f3c2e-e621-47b1-ab46-942f46f000f6 bound to our chassis
Feb 20 09:58:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:19.253 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 375f3c2e-e621-47b1-ab46-942f46f000f6 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:58:19 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:19.255 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc527df-0b4a-4aa7-afae-9e64bf5b7968]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:19Z|00356|binding|INFO|Setting lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 ovn-installed in OVS
Feb 20 09:58:19 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:19Z|00357|binding|INFO|Setting lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 up in Southbound
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap53d7e2be-a1: No such device
Feb 20 09:58:19 np0005625204.localdomain sshd[319404]: Invalid user sol from 45.148.10.240 port 57110
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.314 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.347 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain sshd[319404]: Connection closed by invalid user sol 45.148.10.240 port 57110 [preauth]
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:19.914 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "format": "json"}]: dispatch
Feb 20 09:58:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:20 np0005625204.localdomain podman[319485]: 
Feb 20 09:58:20 np0005625204.localdomain podman[319485]: 2026-02-20 09:58:20.249960086 +0000 UTC m=+0.095774499 container create c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:58:20 np0005625204.localdomain systemd[1]: Started libpod-conmon-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62.scope.
Feb 20 09:58:20 np0005625204.localdomain podman[319485]: 2026-02-20 09:58:20.204889403 +0000 UTC m=+0.050703856 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:20 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:20 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce6b4e0a927c90fbf1419bb6b55ac40e63e6fb8985741dd6b996564328cf80bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:20 np0005625204.localdomain podman[319485]: 2026-02-20 09:58:20.329132899 +0000 UTC m=+0.174947302 container init c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:20 np0005625204.localdomain podman[319485]: 2026-02-20 09:58:20.338067181 +0000 UTC m=+0.183881584 container start c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:58:20 np0005625204.localdomain dnsmasq[319503]: started, version 2.85 cachesize 150
Feb 20 09:58:20 np0005625204.localdomain dnsmasq[319503]: DNS service limited to local subnets
Feb 20 09:58:20 np0005625204.localdomain dnsmasq[319503]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:20 np0005625204.localdomain dnsmasq[319503]: warning: no upstream servers configured
Feb 20 09:58:20 np0005625204.localdomain dnsmasq-dhcp[319503]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:20 np0005625204.localdomain dnsmasq[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/addn_hosts - 0 addresses
Feb 20 09:58:20 np0005625204.localdomain dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/host
Feb 20 09:58:20 np0005625204.localdomain dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/opts
Feb 20 09:58:20 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:20.478 264355 INFO neutron.agent.dhcp.agent [None req-1f68b521-5b01-459b-8511-bc0710ce2eb4 - - - - - -] DHCP configuration for ports {'2fc13e4c-b894-4358-b28b-9fe96597fd36'} is completed
Feb 20 09:58:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:20.707 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 856512c4-e24e-424b-8516-4e8839e85823 with type ""
Feb 20 09:58:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:20Z|00358|binding|INFO|Removing iface tap53d7e2be-a1 ovn-installed in OVS
Feb 20 09:58:20 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:20Z|00359|binding|INFO|Removing lport 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 ovn-installed in OVS
Feb 20 09:58:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:20.710 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.1/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-375f3c2e-e621-47b1-ab46-942f46f000f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a0911e4-b523-4ec2-a3e4-c2816971929f, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=53d7e2be-a1e5-44d8-ac61-a6537aee6bf9) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:20.712 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 53d7e2be-a1e5-44d8-ac61-a6537aee6bf9 in datapath 375f3c2e-e621-47b1-ab46-942f46f000f6 unbound from our chassis
Feb 20 09:58:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:20.716 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 375f3c2e-e621-47b1-ab46-942f46f000f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:20 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:20.739 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d752f8-5d2e-46eb-9900-7777e2807a44]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:20.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:20.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:20 np0005625204.localdomain kernel: device tap53d7e2be-a1 left promiscuous mode
Feb 20 09:58:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:20.758 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f29b2b87-a99b-49d8-b340-b09eb6239fb8", "format": "json"}]: dispatch
Feb 20 09:58:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f29b2b87-a99b-49d8-b340-b09eb6239fb8", "force": true, "format": "json"}]: dispatch
Feb 20 09:58:20 np0005625204.localdomain ceph-mon[301857]: pgmap v405: 177 pgs: 177 active+clean; 196 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 34 KiB/s wr, 89 op/s
Feb 20 09:58:21 np0005625204.localdomain dnsmasq[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/addn_hosts - 0 addresses
Feb 20 09:58:21 np0005625204.localdomain dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/host
Feb 20 09:58:21 np0005625204.localdomain dnsmasq-dhcp[319503]: read /var/lib/neutron/dhcp/375f3c2e-e621-47b1-ab46-942f46f000f6/opts
Feb 20 09:58:21 np0005625204.localdomain podman[319522]: 2026-02-20 09:58:21.132842036 +0000 UTC m=+0.063768853 container kill c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent [None req-231747be-d089-4af3-a9e3-b8f752dcac4e - - - - - -] Unable to reload_allocations dhcp for 375f3c2e-e621-47b1-ab46-942f46f000f6.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap53d7e2be-a1 not found in namespace qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6.
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap53d7e2be-a1 not found in namespace qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6.
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.161 264355 ERROR neutron.agent.dhcp.agent 
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.165 264355 INFO neutron.agent.dhcp.agent [None req-18b14ea9-72f3-48ab-a5d6-7e417394bb56 - - - - - -] Synchronizing state
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.274 264355 INFO neutron.agent.dhcp.agent [None req-70d4c29f-e879-4681-ad82-c583ac60bbc8 - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.274 264355 INFO neutron.agent.dhcp.agent [-] Starting network 375f3c2e-e621-47b1-ab46-942f46f000f6 dhcp configuration
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.275 264355 INFO neutron.agent.dhcp.agent [-] Finished network 375f3c2e-e621-47b1-ab46-942f46f000f6 dhcp configuration
Feb 20 09:58:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:21.275 264355 INFO neutron.agent.dhcp.agent [None req-70d4c29f-e879-4681-ad82-c583ac60bbc8 - - - - - -] Synchronizing state complete
Feb 20 09:58:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e195 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:21 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:21Z|00360|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:58:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:21.387 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:21 np0005625204.localdomain dnsmasq[319503]: exiting on receipt of SIGTERM
Feb 20 09:58:21 np0005625204.localdomain podman[319553]: 2026-02-20 09:58:21.544891551 +0000 UTC m=+0.057763121 container kill c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:21 np0005625204.localdomain systemd[1]: libpod-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62.scope: Deactivated successfully.
Feb 20 09:58:21 np0005625204.localdomain podman[319566]: 2026-02-20 09:58:21.618146663 +0000 UTC m=+0.056220274 container died c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 20 09:58:21 np0005625204.localdomain systemd[1]: tmp-crun.lf59km.mount: Deactivated successfully.
Feb 20 09:58:21 np0005625204.localdomain podman[319566]: 2026-02-20 09:58:21.656383248 +0000 UTC m=+0.094456819 container cleanup c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:21 np0005625204.localdomain systemd[1]: libpod-conmon-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62.scope: Deactivated successfully.
Feb 20 09:58:21 np0005625204.localdomain podman[319567]: 2026-02-20 09:58:21.693684945 +0000 UTC m=+0.125923628 container remove c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-375f3c2e-e621-47b1-ab46-942f46f000f6, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:58:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-ce6b4e0a927c90fbf1419bb6b55ac40e63e6fb8985741dd6b996564328cf80bb-merged.mount: Deactivated successfully.
Feb 20 09:58:22 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c060352c7f062e3777d9a0dda76bbd57fad785813af702c5bfaed263b7f22b62-userdata-shm.mount: Deactivated successfully.
Feb 20 09:58:22 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d375f3c2e\x2de621\x2d47b1\x2dab46\x2d942f46f000f6.mount: Deactivated successfully.
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e196 e196: 6 total, 6 up, 6 in
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: pgmap v406: 177 pgs: 177 active+clean; 196 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 96 KiB/s rd, 54 KiB/s wr, 134 op/s
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1485614407' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1485614407' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:22 np0005625204.localdomain ceph-mon[301857]: osdmap e196: 6 total, 6 up, 6 in
Feb 20 09:58:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:58:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:24.949 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ec39e814-8000-487e-8406-15e7edd35489", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ec39e814-8000-487e-8406-15e7edd35489", "format": "json"}]: dispatch
Feb 20 09:58:25 np0005625204.localdomain ceph-mon[301857]: pgmap v408: 177 pgs: 177 active+clean; 196 MiB data, 991 MiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 35 KiB/s wr, 75 op/s
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e197 e197: 6 total, 6 up, 6 in
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3466105401' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3466105401' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 09:58:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:58:26 np0005625204.localdomain systemd[1]: tmp-crun.Csi2xy.mount: Deactivated successfully.
Feb 20 09:58:26 np0005625204.localdomain podman[319594]: 2026-02-20 09:58:26.170519145 +0000 UTC m=+0.098015427 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:58:26 np0005625204.localdomain podman[319594]: 2026-02-20 09:58:26.180125578 +0000 UTC m=+0.107621850 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 09:58:26 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:58:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:58:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:58:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:58:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:58:27 np0005625204.localdomain ceph-mon[301857]: pgmap v409: 177 pgs: 177 active+clean; 196 MiB data, 992 MiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 57 KiB/s wr, 130 op/s
Feb 20 09:58:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:58:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:58:27 np0005625204.localdomain ceph-mon[301857]: osdmap e197: 6 total, 6 up, 6 in
Feb 20 09:58:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2603382454' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2603382454' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:27 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:27.242 264355 INFO neutron.agent.linux.ip_lib [None req-d38ae07c-887e-4698-b929-469cb11c791e - - - - - -] Device tap50325231-1b cannot be used as it has no MAC address
Feb 20 09:58:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:27.271 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:27 np0005625204.localdomain kernel: device tap50325231-1b entered promiscuous mode
Feb 20 09:58:27 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581507.2799] manager: (tap50325231-1b): new Generic device (/org/freedesktop/NetworkManager/Devices/57)
Feb 20 09:58:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:27.280 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:27Z|00361|binding|INFO|Claiming lport 50325231-1b72-40e0-af54-5e19218597d1 for this chassis.
Feb 20 09:58:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:27Z|00362|binding|INFO|50325231-1b72-40e0-af54-5e19218597d1: Claiming unknown
Feb 20 09:58:27 np0005625204.localdomain systemd-udevd[319628]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:27.291 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c60e6b37-6fb4-4889-8f95-ad293363e22b, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=50325231-1b72-40e0-af54-5e19218597d1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:27.293 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50325231-1b72-40e0-af54-5e19218597d1 in datapath 838109c2-99d3-418e-a5f0-0558fd60210c bound to our chassis
Feb 20 09:58:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:27.295 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 838109c2-99d3-418e-a5f0-0558fd60210c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:58:27 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:27.296 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[73a967ba-5353-4145-92e9-315f9f3ee905]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:27Z|00363|binding|INFO|Setting lport 50325231-1b72-40e0-af54-5e19218597d1 ovn-installed in OVS
Feb 20 09:58:27 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:27Z|00364|binding|INFO|Setting lport 50325231-1b72-40e0-af54-5e19218597d1 up in Southbound
Feb 20 09:58:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:27.319 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:27.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:27.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3400407797' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3400407797' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:28 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:28.330 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 3e87c615-1295-4106-9d99-81bb6cceb56f with type ""
Feb 20 09:58:28 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:28Z|00365|binding|INFO|Removing iface tap50325231-1b ovn-installed in OVS
Feb 20 09:58:28 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:28Z|00366|binding|INFO|Removing lport 50325231-1b72-40e0-af54-5e19218597d1 ovn-installed in OVS
Feb 20 09:58:28 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:28.333 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-838109c2-99d3-418e-a5f0-0558fd60210c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c60e6b37-6fb4-4889-8f95-ad293363e22b, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=50325231-1b72-40e0-af54-5e19218597d1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:28.332 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:28 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:28.336 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 50325231-1b72-40e0-af54-5e19218597d1 in datapath 838109c2-99d3-418e-a5f0-0558fd60210c unbound from our chassis
Feb 20 09:58:28 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:28.339 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 838109c2-99d3-418e-a5f0-0558fd60210c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:28.341 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:28 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:28.340 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[b405b11d-4509-4b96-b7e7-019b2960f356]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:28 np0005625204.localdomain podman[319683]: 2026-02-20 09:58:28.380132607 +0000 UTC m=+0.094023725 container create 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Feb 20 09:58:28 np0005625204.localdomain systemd[1]: Started libpod-conmon-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43.scope.
Feb 20 09:58:28 np0005625204.localdomain podman[319683]: 2026-02-20 09:58:28.33654815 +0000 UTC m=+0.050439258 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:28 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:28 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/af85346b1dd7396e797437697a4bc476248c2106c48a90c416e25973a6117729/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:28 np0005625204.localdomain podman[319683]: 2026-02-20 09:58:28.459199797 +0000 UTC m=+0.173090905 container init 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:28 np0005625204.localdomain podman[319683]: 2026-02-20 09:58:28.472130981 +0000 UTC m=+0.186022089 container start 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 20 09:58:28 np0005625204.localdomain dnsmasq[319701]: started, version 2.85 cachesize 150
Feb 20 09:58:28 np0005625204.localdomain dnsmasq[319701]: DNS service limited to local subnets
Feb 20 09:58:28 np0005625204.localdomain dnsmasq[319701]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:28 np0005625204.localdomain dnsmasq[319701]: warning: no upstream servers configured
Feb 20 09:58:28 np0005625204.localdomain dnsmasq-dhcp[319701]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:28 np0005625204.localdomain dnsmasq[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/addn_hosts - 0 addresses
Feb 20 09:58:28 np0005625204.localdomain dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/host
Feb 20 09:58:28 np0005625204.localdomain dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/opts
Feb 20 09:58:28 np0005625204.localdomain kernel: device tap50325231-1b left promiscuous mode
Feb 20 09:58:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:28.600 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:28.617 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.642 264355 INFO neutron.agent.dhcp.agent [None req-09c10960-720f-434d-9a7a-47df59168085 - - - - - -] DHCP configuration for ports {'5cac4abd-33fd-4748-939e-f73012dbea21'} is completed
Feb 20 09:58:28 np0005625204.localdomain dnsmasq[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/addn_hosts - 0 addresses
Feb 20 09:58:28 np0005625204.localdomain dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/host
Feb 20 09:58:28 np0005625204.localdomain dnsmasq-dhcp[319701]: read /var/lib/neutron/dhcp/838109c2-99d3-418e-a5f0-0558fd60210c/opts
Feb 20 09:58:28 np0005625204.localdomain podman[319721]: 2026-02-20 09:58:28.824981021 +0000 UTC m=+0.065489476 container kill 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent [None req-1e997006-4abd-4b9d-8dde-9196ebc1ab64 - - - - - -] Unable to reload_allocations dhcp for 838109c2-99d3-418e-a5f0-0558fd60210c.: neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap50325231-1b not found in namespace qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c.
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent Traceback (most recent call last):
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/dhcp/agent.py", line 264, in _call_driver
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     rv = getattr(driver, action)(**action_kwargs)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 673, in reload_allocations
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     self.device_manager.update(self.network, self.interface_name)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1899, in update
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route(network, device_name)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1610, in _set_default_route
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     self._set_default_route_ip_version(network, device_name,
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/dhcp.py", line 1539, in _set_default_route_ip_version
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     gateway = device.route.get_gateway(ip_version=ip_version)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 671, in get_gateway
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     routes = self.list_routes(ip_version, scope=scope, table=table)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 656, in list_routes
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     return list_ip_routes(self._parent.namespace, ip_version, scope=scope,
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/neutron/agent/linux/ip_lib.py", line 1611, in list_ip_routes
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     routes = privileged.list_ip_routes(namespace, ip_version, device=device,
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 333, in wrapped_f
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     return self(f, *args, **kw)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 423, in __call__
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     do = self.iter(retry_state=retry_state)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 360, in iter
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     return fut.result()
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 439, in result
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     return self.__get_result()
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib64/python3.9/concurrent/futures/_base.py", line 391, in __get_result
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     raise self._exception
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/tenacity/__init__.py", line 426, in __call__
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     result = fn(*args, **kwargs)
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/priv_context.py", line 271, in _wrap
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     return self.channel.remote_call(name, args, kwargs,
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent   File "/usr/lib/python3.9/site-packages/oslo_privsep/daemon.py", line 215, in remote_call
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent     raise exc_type(*result[2])
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent neutron.privileged.agent.linux.ip_lib.NetworkInterfaceNotFound: Network interface tap50325231-1b not found in namespace qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c.
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.851 264355 ERROR neutron.agent.dhcp.agent 
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.855 264355 INFO neutron.agent.dhcp.agent [None req-70d4c29f-e879-4681-ad82-c583ac60bbc8 - - - - - -] Synchronizing state
Feb 20 09:58:28 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:28Z|00367|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:58:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:28.976 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:28 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:28.992 264355 INFO neutron.agent.dhcp.agent [None req-063963d4-8eb4-458e-839e-1886c1f7eefb - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:58:29 np0005625204.localdomain ceph-mon[301857]: pgmap v411: 177 pgs: 177 active+clean; 196 MiB data, 992 MiB used, 41 GiB / 42 GiB avail; 79 KiB/s rd, 48 KiB/s wr, 114 op/s
Feb 20 09:58:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ec39e814-8000-487e-8406-15e7edd35489", "format": "json"}]: dispatch
Feb 20 09:58:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ec39e814-8000-487e-8406-15e7edd35489", "force": true, "format": "json"}]: dispatch
Feb 20 09:58:29 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1678028411' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:29 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1678028411' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:58:29 np0005625204.localdomain dnsmasq[319701]: exiting on receipt of SIGTERM
Feb 20 09:58:29 np0005625204.localdomain podman[319751]: 2026-02-20 09:58:29.187284161 +0000 UTC m=+0.065947011 container kill 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:29 np0005625204.localdomain systemd[1]: libpod-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43.scope: Deactivated successfully.
Feb 20 09:58:29 np0005625204.localdomain podman[319764]: 2026-02-20 09:58:29.262193092 +0000 UTC m=+0.062415032 container died 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:58:29 np0005625204.localdomain podman[319764]: 2026-02-20 09:58:29.293844857 +0000 UTC m=+0.094066767 container cleanup 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 09:58:29 np0005625204.localdomain systemd[1]: libpod-conmon-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43.scope: Deactivated successfully.
Feb 20 09:58:29 np0005625204.localdomain podman[319766]: 2026-02-20 09:58:29.34972549 +0000 UTC m=+0.137208721 container remove 3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-838109c2-99d3-418e-a5f0-0558fd60210c, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 09:58:29 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:29.373 264355 INFO neutron.agent.dhcp.agent [None req-e455f8f0-087d-4011-b3b1-264293719a26 - - - - - -] Synchronizing state complete
Feb 20 09:58:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-af85346b1dd7396e797437697a4bc476248c2106c48a90c416e25973a6117729-merged.mount: Deactivated successfully.
Feb 20 09:58:29 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3c8487eadd5f336ba2cd36bccbf7ad9f8ee254b9dc2bfc79008d155b1eb87b43-userdata-shm.mount: Deactivated successfully.
Feb 20 09:58:29 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d838109c2\x2d99d3\x2d418e\x2da5f0\x2d0558fd60210c.mount: Deactivated successfully.
Feb 20 09:58:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:29.979 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:29 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:29.984 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1881893775' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:58:30 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1881893775' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:58:31 np0005625204.localdomain ceph-mon[301857]: pgmap v412: 177 pgs: 177 active+clean; 196 MiB data, 992 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 23 KiB/s wr, 58 op/s
Feb 20 09:58:31 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1881893775' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:31 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1881893775' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:31 np0005625204.localdomain podman[319792]: 2026-02-20 09:58:31.15295974 +0000 UTC m=+0.082739671 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:58:31 np0005625204.localdomain podman[319792]: 2026-02-20 09:58:31.191083722 +0000 UTC m=+0.120863673 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:58:31 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:58:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:32 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:32.242 264355 INFO neutron.agent.linux.ip_lib [None req-6517a283-69ca-4a9d-949a-a8fed98055d1 - - - - - -] Device tap75e94885-97 cannot be used as it has no MAC address
Feb 20 09:58:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:32.272 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:32 np0005625204.localdomain kernel: device tap75e94885-97 entered promiscuous mode
Feb 20 09:58:32 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581512.2820] manager: (tap75e94885-97): new Generic device (/org/freedesktop/NetworkManager/Devices/58)
Feb 20 09:58:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:32.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:32Z|00368|binding|INFO|Claiming lport 75e94885-97d0-4be1-9ebe-cfa150020c4f for this chassis.
Feb 20 09:58:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:32Z|00369|binding|INFO|75e94885-97d0-4be1-9ebe-cfa150020c4f: Claiming unknown
Feb 20 09:58:32 np0005625204.localdomain systemd-udevd[319825]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:32.295 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=75e94885-97d0-4be1-9ebe-cfa150020c4f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:32.296 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 75e94885-97d0-4be1-9ebe-cfa150020c4f in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b bound to our chassis
Feb 20 09:58:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:32.297 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 01006bb5-6e96-485f-99d6-c3f27965c51b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:58:32 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:32.298 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[895bdf37-5c51-4c36-8f0e-cbf59826b133]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:32Z|00370|binding|INFO|Setting lport 75e94885-97d0-4be1-9ebe-cfa150020c4f ovn-installed in OVS
Feb 20 09:58:32 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:32Z|00371|binding|INFO|Setting lport 75e94885-97d0-4be1-9ebe-cfa150020c4f up in Southbound
Feb 20 09:58:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:32.320 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:32.324 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap75e94885-97: No such device
Feb 20 09:58:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:32.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:32.408 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:32 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 e198: 6 total, 6 up, 6 in
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1453609140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1453609140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: pgmap v413: 177 pgs: 177 active+clean; 196 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 104 KiB/s rd, 70 KiB/s wr, 148 op/s
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: osdmap e198: 6 total, 6 up, 6 in
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1453609140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:58:33 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1453609140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:58:33 np0005625204.localdomain podman[319896]: 
Feb 20 09:58:33 np0005625204.localdomain podman[319896]: 2026-02-20 09:58:33.359777408 +0000 UTC m=+0.090537879 container create 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:58:33 np0005625204.localdomain systemd[1]: Started libpod-conmon-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646.scope.
Feb 20 09:58:33 np0005625204.localdomain podman[319896]: 2026-02-20 09:58:33.315745576 +0000 UTC m=+0.046506087 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:33 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:33 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f21ae561d6365eec38a5d27ea0386b57d32a2312c2b64cc194e30ff4e8fac29e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:33 np0005625204.localdomain podman[319896]: 2026-02-20 09:58:33.439445265 +0000 UTC m=+0.170205776 container init 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:33 np0005625204.localdomain podman[319896]: 2026-02-20 09:58:33.454511345 +0000 UTC m=+0.185271816 container start 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:58:33 np0005625204.localdomain dnsmasq[319914]: started, version 2.85 cachesize 150
Feb 20 09:58:33 np0005625204.localdomain dnsmasq[319914]: DNS service limited to local subnets
Feb 20 09:58:33 np0005625204.localdomain dnsmasq[319914]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:33 np0005625204.localdomain dnsmasq[319914]: warning: no upstream servers configured
Feb 20 09:58:33 np0005625204.localdomain dnsmasq-dhcp[319914]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:33 np0005625204.localdomain dnsmasq[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses
Feb 20 09:58:33 np0005625204.localdomain dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host
Feb 20 09:58:33 np0005625204.localdomain dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts
Feb 20 09:58:33 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:33.626 264355 INFO neutron.agent.dhcp.agent [None req-a608acbe-e3c7-4374-9dec-33e2310b4308 - - - - - -] DHCP configuration for ports {'536b2372-3fbe-40de-9c4f-f7c748ef2941'} is completed
Feb 20 09:58:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:33.979 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:34 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:58:34 np0005625204.localdomain podman[319915]: 2026-02-20 09:58:34.147855619 +0000 UTC m=+0.087026033 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, version=9.7, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git)
Feb 20 09:58:34 np0005625204.localdomain podman[319915]: 2026-02-20 09:58:34.164024322 +0000 UTC m=+0.103194786 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, release=1770267347, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Feb 20 09:58:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:58:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:58:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "92707d6b-2313-43a9-b4cc-78c6dfb385e9", "format": "json"}]: dispatch
Feb 20 09:58:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "92707d6b-2313-43a9-b4cc-78c6dfb385e9", "force": true, "format": "json"}]: dispatch
Feb 20 09:58:34 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:58:34 np0005625204.localdomain systemd[1]: tmp-crun.NWw8Wy.mount: Deactivated successfully.
Feb 20 09:58:34 np0005625204.localdomain dnsmasq[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses
Feb 20 09:58:34 np0005625204.localdomain dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host
Feb 20 09:58:34 np0005625204.localdomain dnsmasq-dhcp[319914]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts
Feb 20 09:58:34 np0005625204.localdomain podman[319952]: 2026-02-20 09:58:34.701769786 +0000 UTC m=+0.071307843 container kill 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:34.982 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:35 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:35.073 264355 INFO neutron.agent.dhcp.agent [None req-4359195f-f41d-48b2-88d8-b018c7a86e8d - - - - - -] DHCP configuration for ports {'536b2372-3fbe-40de-9c4f-f7c748ef2941', '75e94885-97d0-4be1-9ebe-cfa150020c4f'} is completed
Feb 20 09:58:35 np0005625204.localdomain ceph-mon[301857]: pgmap v415: 177 pgs: 177 active+clean; 196 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 75 KiB/s rd, 53 KiB/s wr, 105 op/s
Feb 20 09:58:35 np0005625204.localdomain dnsmasq[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/addn_hosts - 0 addresses
Feb 20 09:58:35 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/host
Feb 20 09:58:35 np0005625204.localdomain dnsmasq-dhcp[319067]: read /var/lib/neutron/dhcp/93dbe9b6-4551-4902-9476-0f2070facdb5/opts
Feb 20 09:58:35 np0005625204.localdomain podman[319990]: 2026-02-20 09:58:35.395623927 +0000 UTC m=+0.038675320 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:58:35 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:35Z|00372|binding|INFO|Releasing lport b3eab012-7666-4655-9b51-e4c7e9621497 from this chassis (sb_readonly=0)
Feb 20 09:58:35 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:35Z|00373|binding|INFO|Setting lport b3eab012-7666-4655-9b51-e4c7e9621497 down in Southbound
Feb 20 09:58:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:35.800 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:35 np0005625204.localdomain kernel: device tapb3eab012-76 left promiscuous mode
Feb 20 09:58:35 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:35.809 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93dbe9b6-4551-4902-9476-0f2070facdb5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0e47cd34a784cbb89cbe56eafed5650', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7f70c52-a5b6-4d06-85ef-b3fdc5aa9c4b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=b3eab012-7666-4655-9b51-e4c7e9621497) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:35 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:35.811 162652 INFO neutron.agent.ovn.metadata.agent [-] Port b3eab012-7666-4655-9b51-e4c7e9621497 in datapath 93dbe9b6-4551-4902-9476-0f2070facdb5 unbound from our chassis
Feb 20 09:58:35 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:35.816 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 93dbe9b6-4551-4902-9476-0f2070facdb5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:35 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:35.817 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[e11353cb-063d-479d-8b9a-caf738a3c9d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:35.822 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:35.823 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:36 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:58:36 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:58:36 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:58:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:36.356 162652 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 520cd37f-c614-4b43-98b3-c526073f80d7 with type ""
Feb 20 09:58:36 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:36Z|00374|binding|INFO|Removing iface tap75e94885-97 ovn-installed in OVS
Feb 20 09:58:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:36.358 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=75e94885-97d0-4be1-9ebe-cfa150020c4f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:36 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:36Z|00375|binding|INFO|Removing lport 75e94885-97d0-4be1-9ebe-cfa150020c4f ovn-installed in OVS
Feb 20 09:58:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:36.359 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 75e94885-97d0-4be1-9ebe-cfa150020c4f in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b unbound from our chassis
Feb 20 09:58:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:36.362 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01006bb5-6e96-485f-99d6-c3f27965c51b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:36 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:36.399 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[af81aeee-c425-42ac-9265-a08ece9342c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:36.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:36.402 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:36 np0005625204.localdomain dnsmasq[319914]: exiting on receipt of SIGTERM
Feb 20 09:58:36 np0005625204.localdomain podman[320028]: 2026-02-20 09:58:36.475440956 +0000 UTC m=+0.055943505 container kill 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 09:58:36 np0005625204.localdomain systemd[1]: libpod-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646.scope: Deactivated successfully.
Feb 20 09:58:36 np0005625204.localdomain podman[320042]: 2026-02-20 09:58:36.540257701 +0000 UTC m=+0.052551672 container died 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:36 np0005625204.localdomain systemd[1]: tmp-crun.MgrJx2.mount: Deactivated successfully.
Feb 20 09:58:36 np0005625204.localdomain podman[320042]: 2026-02-20 09:58:36.581931761 +0000 UTC m=+0.094225692 container cleanup 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 20 09:58:36 np0005625204.localdomain systemd[1]: libpod-conmon-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646.scope: Deactivated successfully.
Feb 20 09:58:36 np0005625204.localdomain podman[320044]: 2026-02-20 09:58:36.605966133 +0000 UTC m=+0.110020382 container remove 6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:58:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:36.617 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:36 np0005625204.localdomain kernel: device tap75e94885-97 left promiscuous mode
Feb 20 09:58:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:36.636 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:36.654 264355 INFO neutron.agent.dhcp.agent [None req-e455f8f0-087d-4011-b3b1-264293719a26 - - - - - -] Synchronizing state
Feb 20 09:58:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:36.801 264355 INFO neutron.agent.dhcp.agent [None req-8d45ea50-451f-40e0-9662-1427715115cb - - - - - -] All active networks have been fetched through RPC.
Feb 20 09:58:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:36.802 264355 INFO neutron.agent.dhcp.agent [-] Starting network 01006bb5-6e96-485f-99d6-c3f27965c51b dhcp configuration
Feb 20 09:58:37 np0005625204.localdomain ceph-mon[301857]: pgmap v416: 177 pgs: 177 active+clean; 197 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 63 KiB/s wr, 122 op/s
Feb 20 09:58:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:58:37 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:37.324 264355 INFO neutron.agent.linux.ip_lib [None req-f6cba991-fb9f-4c51-a0ee-e863ba004ca3 - - - - - -] Device tap44cb708b-a5 cannot be used as it has no MAC address
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.351 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain kernel: device tap44cb708b-a5 entered promiscuous mode
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.358 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581517.3590] manager: (tap44cb708b-a5): new Generic device (/org/freedesktop/NetworkManager/Devices/59)
Feb 20 09:58:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:37Z|00376|binding|INFO|Claiming lport 44cb708b-a522-48fc-9798-157cbbbe1988 for this chassis.
Feb 20 09:58:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:37Z|00377|binding|INFO|44cb708b-a522-48fc-9798-157cbbbe1988: Claiming unknown
Feb 20 09:58:37 np0005625204.localdomain systemd-udevd[320078]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:37Z|00378|binding|INFO|Setting lport 44cb708b-a522-48fc-9798-157cbbbe1988 ovn-installed in OVS
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.367 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:37Z|00379|binding|INFO|Setting lport 44cb708b-a522-48fc-9798-157cbbbe1988 up in Southbound
Feb 20 09:58:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:37.371 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=44cb708b-a522-48fc-9798-157cbbbe1988) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.370 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:37.375 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 44cb708b-a522-48fc-9798-157cbbbe1988 in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b bound to our chassis
Feb 20 09:58:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:37.379 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 004e0c9d-465d-4bd4-9c9e-d4787ee58d30 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:58:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:37.379 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01006bb5-6e96-485f-99d6-c3f27965c51b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:37 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:37.381 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[dbea7bad-ef4b-4d35-aba2-d0cfdc548581]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.389 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.427 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:37.508 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f21ae561d6365eec38a5d27ea0386b57d32a2312c2b64cc194e30ff4e8fac29e-merged.mount: Deactivated successfully.
Feb 20 09:58:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6eafe6e21c60cb8ec8928b55db1ad6bedb306beb018bc25e6a75f81e134fa646-userdata-shm.mount: Deactivated successfully.
Feb 20 09:58:38 np0005625204.localdomain podman[320134]: 
Feb 20 09:58:38 np0005625204.localdomain podman[320134]: 2026-02-20 09:58:38.359187441 +0000 UTC m=+0.093172660 container create e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:38 np0005625204.localdomain systemd[1]: Started libpod-conmon-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415.scope.
Feb 20 09:58:38 np0005625204.localdomain podman[320134]: 2026-02-20 09:58:38.314576872 +0000 UTC m=+0.048562111 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:38 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:38 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f3c6ec8dbf39580881a394b21a90590bb4d78083db4873530e20c2b99ab090a3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:38 np0005625204.localdomain podman[320134]: 2026-02-20 09:58:38.45893414 +0000 UTC m=+0.192919359 container init e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 20 09:58:38 np0005625204.localdomain podman[320134]: 2026-02-20 09:58:38.468132441 +0000 UTC m=+0.202117660 container start e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 09:58:38 np0005625204.localdomain dnsmasq[320151]: started, version 2.85 cachesize 150
Feb 20 09:58:38 np0005625204.localdomain dnsmasq[320151]: DNS service limited to local subnets
Feb 20 09:58:38 np0005625204.localdomain dnsmasq[320151]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:38 np0005625204.localdomain dnsmasq[320151]: warning: no upstream servers configured
Feb 20 09:58:38 np0005625204.localdomain dnsmasq-dhcp[320151]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:38 np0005625204.localdomain dnsmasq[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses
Feb 20 09:58:38 np0005625204.localdomain dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host
Feb 20 09:58:38 np0005625204.localdomain dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts
Feb 20 09:58:38 np0005625204.localdomain systemd[1]: tmp-crun.QgJDG1.mount: Deactivated successfully.
Feb 20 09:58:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:38.530 264355 INFO neutron.agent.dhcp.agent [None req-6cc633c4-1491-4b60-a6ee-0e2ab62fa379 - - - - - -] Finished network 01006bb5-6e96-485f-99d6-c3f27965c51b dhcp configuration
Feb 20 09:58:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:38.530 264355 INFO neutron.agent.dhcp.agent [None req-8d45ea50-451f-40e0-9662-1427715115cb - - - - - -] Synchronizing state complete
Feb 20 09:58:38 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:38.635 264355 INFO neutron.agent.dhcp.agent [None req-3c5cc925-3a4e-40ed-aa05-dc24ae9cfdb8 - - - - - -] DHCP configuration for ports {'536b2372-3fbe-40de-9c4f-f7c748ef2941'} is completed
Feb 20 09:58:38 np0005625204.localdomain systemd[1]: tmp-crun.qE16Pu.mount: Deactivated successfully.
Feb 20 09:58:38 np0005625204.localdomain dnsmasq[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/addn_hosts - 0 addresses
Feb 20 09:58:38 np0005625204.localdomain dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/host
Feb 20 09:58:38 np0005625204.localdomain dnsmasq-dhcp[320151]: read /var/lib/neutron/dhcp/01006bb5-6e96-485f-99d6-c3f27965c51b/opts
Feb 20 09:58:38 np0005625204.localdomain podman[320169]: 2026-02-20 09:58:38.753653059 +0000 UTC m=+0.075168480 container kill e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 09:58:39 np0005625204.localdomain dnsmasq[320151]: exiting on receipt of SIGTERM
Feb 20 09:58:39 np0005625204.localdomain podman[320206]: 2026-02-20 09:58:39.198287606 +0000 UTC m=+0.063355890 container kill e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:58:39 np0005625204.localdomain systemd[1]: libpod-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415.scope: Deactivated successfully.
Feb 20 09:58:39 np0005625204.localdomain ceph-mon[301857]: pgmap v417: 177 pgs: 177 active+clean; 197 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 81 KiB/s rd, 60 KiB/s wr, 116 op/s
Feb 20 09:58:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:58:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 09:58:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 09:58:39 np0005625204.localdomain podman[320219]: 2026-02-20 09:58:39.263853025 +0000 UTC m=+0.050567602 container died e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 20 09:58:39 np0005625204.localdomain podman[320219]: 2026-02-20 09:58:39.29686312 +0000 UTC m=+0.083577647 container cleanup e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 09:58:39 np0005625204.localdomain systemd[1]: libpod-conmon-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415.scope: Deactivated successfully.
Feb 20 09:58:39 np0005625204.localdomain podman[320220]: 2026-02-20 09:58:39.350085521 +0000 UTC m=+0.129489326 container remove e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-01006bb5-6e96-485f-99d6-c3f27965c51b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:58:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:39.410 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:39 np0005625204.localdomain kernel: device tap44cb708b-a5 left promiscuous mode
Feb 20 09:58:39 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:39Z|00380|binding|INFO|Releasing lport 44cb708b-a522-48fc-9798-157cbbbe1988 from this chassis (sb_readonly=0)
Feb 20 09:58:39 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:39Z|00381|binding|INFO|Setting lport 44cb708b-a522-48fc-9798-157cbbbe1988 down in Southbound
Feb 20 09:58:39 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:39.418 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01006bb5-6e96-485f-99d6-c3f27965c51b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '80723b5de8af4075aa84c53e89b4d020', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bf7da8ed-e6e8-4b16-9eb3-015b0c928d92, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=44cb708b-a522-48fc-9798-157cbbbe1988) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:39 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:39.420 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 44cb708b-a522-48fc-9798-157cbbbe1988 in datapath 01006bb5-6e96-485f-99d6-c3f27965c51b unbound from our chassis
Feb 20 09:58:39 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:39.421 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 01006bb5-6e96-485f-99d6-c3f27965c51b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:58:39 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:39.423 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d5d2a1cb-eee2-4fb1-af72-6accf4f9e6ac]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:39.434 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-f3c6ec8dbf39580881a394b21a90590bb4d78083db4873530e20c2b99ab090a3-merged.mount: Deactivated successfully.
Feb 20 09:58:39 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e815b540d431fc02db938edcdeb58b0efc0b292baab608779c243dea77ddf415-userdata-shm.mount: Deactivated successfully.
Feb 20 09:58:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:39.646 264355 INFO neutron.agent.dhcp.agent [None req-be451e73-74db-4370-b7ab-24915ca4d317 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:58:39 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d01006bb5\x2d6e96\x2d485f\x2d99d6\x2dc3f27965c51b.mount: Deactivated successfully.
Feb 20 09:58:39 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:39.649 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:58:39 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:39.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:58:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:58:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:40Z|00382|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:58:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:40.120 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:40 np0005625204.localdomain podman[320249]: 2026-02-20 09:58:40.157047247 +0000 UTC m=+0.092761607 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 20 09:58:40 np0005625204.localdomain podman[320249]: 2026-02-20 09:58:40.169072994 +0000 UTC m=+0.104787324 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 09:58:40 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:58:40 np0005625204.localdomain systemd[1]: tmp-crun.14o06S.mount: Deactivated successfully.
Feb 20 09:58:40 np0005625204.localdomain podman[320248]: 2026-02-20 09:58:40.230885618 +0000 UTC m=+0.168018631 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:58:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:58:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:58:40 np0005625204.localdomain podman[320248]: 2026-02-20 09:58:40.275385884 +0000 UTC m=+0.212518907 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:40 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:58:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:41Z|00383|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:58:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:41.264 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:41 np0005625204.localdomain ceph-mon[301857]: pgmap v418: 177 pgs: 177 active+clean; 197 MiB data, 997 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 60 KiB/s wr, 113 op/s
Feb 20 09:58:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:41.629 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:58:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:42.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:42 np0005625204.localdomain systemd[1]: tmp-crun.SxBFXX.mount: Deactivated successfully.
Feb 20 09:58:42 np0005625204.localdomain dnsmasq[319067]: exiting on receipt of SIGTERM
Feb 20 09:58:42 np0005625204.localdomain podman[320307]: 2026-02-20 09:58:42.461523639 +0000 UTC m=+0.105487575 container kill fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 20 09:58:42 np0005625204.localdomain systemd[1]: libpod-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb.scope: Deactivated successfully.
Feb 20 09:58:42 np0005625204.localdomain podman[320323]: 2026-02-20 09:58:42.550570973 +0000 UTC m=+0.063466445 container died fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:58:42 np0005625204.localdomain systemd[1]: tmp-crun.XaDAgD.mount: Deactivated successfully.
Feb 20 09:58:42 np0005625204.localdomain podman[320323]: 2026-02-20 09:58:42.605981851 +0000 UTC m=+0.118877273 container remove fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-93dbe9b6-4551-4902-9476-0f2070facdb5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:58:42 np0005625204.localdomain systemd[1]: libpod-conmon-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb.scope: Deactivated successfully.
Feb 20 09:58:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:42.852 264355 INFO neutron.agent.dhcp.agent [None req-caf08bf9-d1c1-4d8c-aacf-3246e5ef4799 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:58:42 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:42.853 264355 INFO neutron.agent.dhcp.agent [None req-caf08bf9-d1c1-4d8c-aacf-3246e5ef4799 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:58:43 np0005625204.localdomain sudo[320349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:58:43 np0005625204.localdomain sudo[320349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:58:43 np0005625204.localdomain sudo[320349]: pam_unix(sudo:session): session closed for user root
Feb 20 09:58:43 np0005625204.localdomain sudo[320367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 20 09:58:43 np0005625204.localdomain sudo[320367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:58:43 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:43.321 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:58:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-6954e2904a7005976441dfd44549da912216319c18455669f4ef3b3e79ca01de-merged.mount: Deactivated successfully.
Feb 20 09:58:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd2291d46eb7f4e0330b8e584357085da0c2364b80713196a01f53b73e887dbb-userdata-shm.mount: Deactivated successfully.
Feb 20 09:58:43 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d93dbe9b6\x2d4551\x2d4902\x2d9476\x2d0f2070facdb5.mount: Deactivated successfully.
Feb 20 09:58:44 np0005625204.localdomain ceph-mon[301857]: pgmap v419: 177 pgs: 177 active+clean; 197 MiB data, 998 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 49 KiB/s wr, 36 op/s
Feb 20 09:58:44 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:58:44 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:58:44 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:58:44 np0005625204.localdomain podman[320457]: 2026-02-20 09:58:44.16921146 +0000 UTC m=+0.115200891 container exec 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, GIT_BRANCH=main, org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.42.2, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.component=rhceph-container, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2026-02-09T10:25:24Z, org.opencontainers.image.created=2026-02-09T10:25:24Z, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7)
Feb 20 09:58:44 np0005625204.localdomain podman[320457]: 2026-02-20 09:58:44.309269137 +0000 UTC m=+0.255258608 container exec_died 9ceeaa0407d502f3fea9c7b7d5c2dfe483f1d9f8f94db6e789aa81376ccd4b92 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-a8557ee9-b55d-5519-942c-cf8f6172f1d8-crash-np0005625204, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vcs-ref=b3986a21dbd047e1edac0f24f7c0e811518e5b14, vendor=Red Hat, Inc., org.opencontainers.image.revision=b3986a21dbd047e1edac0f24f7c0e811518e5b14, GIT_CLEAN=True, org.opencontainers.image.created=2026-02-09T10:25:24Z, release=1770267347, version=7, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.42.2, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, build-date=2026-02-09T10:25:24Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 20 09:58:44 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:44.989 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:45 np0005625204.localdomain sudo[320367]: pam_unix(sudo:session): session closed for user root
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: pgmap v420: 177 pgs: 177 active+clean; 197 MiB data, 998 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 46 KiB/s wr, 34 op/s
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:45 np0005625204.localdomain sudo[320579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:58:45 np0005625204.localdomain sudo[320579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:58:45 np0005625204.localdomain sudo[320579]: pam_unix(sudo:session): session closed for user root
Feb 20 09:58:45 np0005625204.localdomain sudo[320597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:58:45 np0005625204.localdomain sudo[320597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:58:46 np0005625204.localdomain sudo[320597]: pam_unix(sudo:session): session closed for user root
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:58:46 np0005625204.localdomain sudo[320648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:58:46 np0005625204.localdomain sudo[320648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:58:46 np0005625204.localdomain sudo[320648]: pam_unix(sudo:session): session closed for user root
Feb 20 09:58:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: pgmap v421: 177 pgs: 177 active+clean; 197 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 54 KiB/s wr, 33 op/s
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625203.localdomain to 836.6M
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625202.localdomain to 836.6M
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625203.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625202.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: Adjusting osd_memory_target on np0005625204.localdomain to 836.6M
Feb 20 09:58:47 np0005625204.localdomain ceph-mon[301857]: Unable to set osd_memory_target on np0005625204.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 20 09:58:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:47.643 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:58:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:58:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:58:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:58:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:58:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18361 "" "Go-http-client/1.1"
Feb 20 09:58:47 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:47Z|00384|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:58:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:47.990 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:48 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:58:48 np0005625204.localdomain podman[320666]: 2026-02-20 09:58:48.166886623 +0000 UTC m=+0.097267625 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 09:58:48 np0005625204.localdomain podman[320666]: 2026-02-20 09:58:48.177159175 +0000 UTC m=+0.107540147 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 09:58:48 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:58:49 np0005625204.localdomain ceph-mon[301857]: pgmap v422: 177 pgs: 177 active+clean; 197 MiB data, 999 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 39 KiB/s wr, 6 op/s
Feb 20 09:58:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:58:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:58:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:58:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:58:49 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:49.992 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:58:51 np0005625204.localdomain ceph-mon[301857]: pgmap v423: 177 pgs: 177 active+clean; 197 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 39 KiB/s wr, 7 op/s
Feb 20 09:58:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:51 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:51.883 264355 INFO neutron.agent.linux.ip_lib [None req-65f50249-de55-4a89-86ea-a1b7d8e9557e - - - - - -] Device taped429b7d-6e cannot be used as it has no MAC address
Feb 20 09:58:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:51.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:51 np0005625204.localdomain kernel: device taped429b7d-6e entered promiscuous mode
Feb 20 09:58:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:51.913 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:51 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581531.9140] manager: (taped429b7d-6e): new Generic device (/org/freedesktop/NetworkManager/Devices/60)
Feb 20 09:58:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:51Z|00385|binding|INFO|Claiming lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 for this chassis.
Feb 20 09:58:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:51Z|00386|binding|INFO|ed429b7d-6ede-437c-a873-d5a788bbc1e3: Claiming unknown
Feb 20 09:58:51 np0005625204.localdomain systemd-udevd[320695]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:51.934 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5dde04c-263c-492d-a9af-a31ca9074a96, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ed429b7d-6ede-437c-a873-d5a788bbc1e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:51.936 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ed429b7d-6ede-437c-a873-d5a788bbc1e3 in datapath fc30869a-497f-4b61-b96d-28cefb439c42 bound to our chassis
Feb 20 09:58:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:51.939 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 49b3a0fb-cff4-4472-ae6a-4c0aa102af48 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:58:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:51.939 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc30869a-497f-4b61-b96d-28cefb439c42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:51 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:51.942 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[9853aed1-e05c-4bae-9806-1a41d57ae3e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:51Z|00387|binding|INFO|Setting lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 ovn-installed in OVS
Feb 20 09:58:51 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:51Z|00388|binding|INFO|Setting lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 up in Southbound
Feb 20 09:58:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:51.955 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:51.997 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:52.031 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:52 np0005625204.localdomain podman[320748]: 
Feb 20 09:58:52 np0005625204.localdomain podman[320748]: 2026-02-20 09:58:52.920565018 +0000 UTC m=+0.093884652 container create fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:58:52 np0005625204.localdomain sshd[320761]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:52 np0005625204.localdomain systemd[1]: Started libpod-conmon-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854.scope.
Feb 20 09:58:52 np0005625204.localdomain podman[320748]: 2026-02-20 09:58:52.876456263 +0000 UTC m=+0.049775867 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:52 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:52 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0c41e1a106862f8d161432c72fe9494234bced6572d34f77ad98b154cf87b9a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:53 np0005625204.localdomain podman[320748]: 2026-02-20 09:58:53.002220185 +0000 UTC m=+0.175539789 container init fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:58:53 np0005625204.localdomain podman[320748]: 2026-02-20 09:58:53.013290143 +0000 UTC m=+0.186609747 container start fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 20 09:58:53 np0005625204.localdomain dnsmasq[320767]: started, version 2.85 cachesize 150
Feb 20 09:58:53 np0005625204.localdomain dnsmasq[320767]: DNS service limited to local subnets
Feb 20 09:58:53 np0005625204.localdomain dnsmasq[320767]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:53 np0005625204.localdomain dnsmasq[320767]: warning: no upstream servers configured
Feb 20 09:58:53 np0005625204.localdomain dnsmasq-dhcp[320767]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:53 np0005625204.localdomain dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 0 addresses
Feb 20 09:58:53 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host
Feb 20 09:58:53 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts
Feb 20 09:58:53 np0005625204.localdomain sshd[320761]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:58:53 np0005625204.localdomain ceph-mon[301857]: pgmap v424: 177 pgs: 177 active+clean; 253 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 4.7 MiB/s wr, 37 op/s
Feb 20 09:58:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:58:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 09:58:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 09:58:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:53.584 264355 INFO neutron.agent.dhcp.agent [None req-8fc9c3fc-284b-447f-a2b0-22674525fb40 - - - - - -] DHCP configuration for ports {'d398f4e9-922d-4f2f-be79-41ec31c42aab'} is completed
Feb 20 09:58:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:58:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:58:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bc980eec-f143-463a-b3c7-dd1de3420de8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bc980eec-f143-463a-b3c7-dd1de3420de8", "format": "json"}]: dispatch
Feb 20 09:58:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:55.037 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:55 np0005625204.localdomain ceph-mon[301857]: pgmap v425: 177 pgs: 177 active+clean; 253 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 4.7 MiB/s wr, 34 op/s
Feb 20 09:58:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 20 09:58:55 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/438591704' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:56 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:56.156 264355 INFO neutron.agent.linux.ip_lib [None req-6fbd4e58-3077-4d00-af5b-1e6e04439f73 - - - - - -] Device tapca71dfc6-5b cannot be used as it has no MAC address
Feb 20 09:58:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:56.238 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:56 np0005625204.localdomain kernel: device tapca71dfc6-5b entered promiscuous mode
Feb 20 09:58:56 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581536.2452] manager: (tapca71dfc6-5b): new Generic device (/org/freedesktop/NetworkManager/Devices/61)
Feb 20 09:58:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:56Z|00389|binding|INFO|Claiming lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 for this chassis.
Feb 20 09:58:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:56Z|00390|binding|INFO|ca71dfc6-5b76-44e7-a509-dbdd64a83fd3: Claiming unknown
Feb 20 09:58:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:56.246 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:56 np0005625204.localdomain systemd-udevd[320779]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:58:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:56.265 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02320057-bc72-4d8a-838a-f4d7286b15bc, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ca71dfc6-5b76-44e7-a509-dbdd64a83fd3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:56.267 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 in datapath 31019f31-c68c-481a-9b72-3317c35499b9 bound to our chassis
Feb 20 09:58:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:56.273 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 892814dd-0933-4ec3-939e-24147c680731 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 09:58:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:56.273 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31019f31-c68c-481a-9b72-3317c35499b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:58:56 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:56.275 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[999f7548-3b27-42f1-9704-1af328891dc4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:58:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:56.285 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:56Z|00391|binding|INFO|Setting lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 ovn-installed in OVS
Feb 20 09:58:56 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:58:56Z|00392|binding|INFO|Setting lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 up in Southbound
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:56.290 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapca71dfc6-5b: No such device
Feb 20 09:58:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:58:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:56.331 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/438591704' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:58:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:58:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:58:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1695766244' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:56.365 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:56 np0005625204.localdomain podman[320786]: 2026-02-20 09:58:56.399819223 +0000 UTC m=+0.102211796 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:58:56 np0005625204.localdomain podman[320786]: 2026-02-20 09:58:56.408993792 +0000 UTC m=+0.111386355 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:58:56 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:58:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:58:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:58:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:58:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:58:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:58:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:58:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e199 e199: 6 total, 6 up, 6 in
Feb 20 09:58:57 np0005625204.localdomain ceph-mon[301857]: pgmap v426: 177 pgs: 177 active+clean; 436 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 1.8 MiB/s rd, 18 MiB/s wr, 85 op/s
Feb 20 09:58:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:58:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e9bd40a1-418b-4133-bfcb-60a8a2bd8eed", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:58:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:58:57 np0005625204.localdomain ceph-mon[301857]: osdmap e199: 6 total, 6 up, 6 in
Feb 20 09:58:57 np0005625204.localdomain podman[320872]: 
Feb 20 09:58:57 np0005625204.localdomain podman[320872]: 2026-02-20 09:58:57.395434547 +0000 UTC m=+0.123928097 container create 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 09:58:57 np0005625204.localdomain systemd[1]: Started libpod-conmon-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674.scope.
Feb 20 09:58:57 np0005625204.localdomain podman[320872]: 2026-02-20 09:58:57.34599048 +0000 UTC m=+0.074484070 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 09:58:57 np0005625204.localdomain systemd[1]: tmp-crun.qzkYdY.mount: Deactivated successfully.
Feb 20 09:58:57 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 09:58:57 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/095be6d19a7c831778435ebe06a1dd1952a6d636d2c8437ba888c4b45c7d81ce/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 09:58:57 np0005625204.localdomain podman[320872]: 2026-02-20 09:58:57.490294907 +0000 UTC m=+0.218788457 container init 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 20 09:58:57 np0005625204.localdomain podman[320872]: 2026-02-20 09:58:57.500147467 +0000 UTC m=+0.228641017 container start 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:58:57 np0005625204.localdomain dnsmasq[320889]: started, version 2.85 cachesize 150
Feb 20 09:58:57 np0005625204.localdomain dnsmasq[320889]: DNS service limited to local subnets
Feb 20 09:58:57 np0005625204.localdomain dnsmasq[320889]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 09:58:57 np0005625204.localdomain dnsmasq[320889]: warning: no upstream servers configured
Feb 20 09:58:57 np0005625204.localdomain dnsmasq-dhcp[320889]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 09:58:57 np0005625204.localdomain dnsmasq[320889]: read /var/lib/neutron/dhcp/31019f31-c68c-481a-9b72-3317c35499b9/addn_hosts - 0 addresses
Feb 20 09:58:57 np0005625204.localdomain dnsmasq-dhcp[320889]: read /var/lib/neutron/dhcp/31019f31-c68c-481a-9b72-3317c35499b9/host
Feb 20 09:58:57 np0005625204.localdomain dnsmasq-dhcp[320889]: read /var/lib/neutron/dhcp/31019f31-c68c-481a-9b72-3317c35499b9/opts
Feb 20 09:58:57 np0005625204.localdomain sshd[320890]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:57 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:58:57.688 264355 INFO neutron.agent.dhcp.agent [None req-e4fbba05-c9f4-4cb2-88d4-e6adeed73622 - - - - - -] DHCP configuration for ports {'d3b0281c-a10c-40f6-ae69-d8e1ee2004d8'} is completed
Feb 20 09:58:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e200 e200: 6 total, 6 up, 6 in
Feb 20 09:58:58 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e9bd40a1-418b-4133-bfcb-60a8a2bd8eed", "format": "json"}]: dispatch
Feb 20 09:58:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/420351506' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:58 np0005625204.localdomain sshd[320892]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:58:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:58:58 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1250733315' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:58:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:58:59 np0005625204.localdomain sshd[320892]: Invalid user sshadmin from 154.91.170.41 port 51424
Feb 20 09:58:59 np0005625204.localdomain sshd[320890]: Invalid user ubuntu from 182.93.7.194 port 43752
Feb 20 09:58:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:59.052 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:58:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:58:59.054 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:58:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:58:59.087 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:58:59 np0005625204.localdomain sshd[320892]: Received disconnect from 154.91.170.41 port 51424:11: Bye Bye [preauth]
Feb 20 09:58:59 np0005625204.localdomain sshd[320892]: Disconnected from invalid user sshadmin 154.91.170.41 port 51424 [preauth]
Feb 20 09:58:59 np0005625204.localdomain sshd[320890]: Received disconnect from 182.93.7.194 port 43752:11: Bye Bye [preauth]
Feb 20 09:58:59 np0005625204.localdomain sshd[320890]: Disconnected from invalid user ubuntu 182.93.7.194 port 43752 [preauth]
Feb 20 09:58:59 np0005625204.localdomain ceph-mon[301857]: pgmap v428: 177 pgs: 177 active+clean; 436 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 2.1 MiB/s rd, 21 MiB/s wr, 99 op/s
Feb 20 09:58:59 np0005625204.localdomain ceph-mon[301857]: osdmap e200: 6 total, 6 up, 6 in
Feb 20 09:58:59 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1250733315' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:00.041 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: pgmap v430: 177 pgs: 177 active+clean; 469 MiB data, 1.6 GiB used, 40 GiB / 42 GiB avail; 878 KiB/s rd, 24 MiB/s wr, 80 op/s
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e9bd40a1-418b-4133-bfcb-60a8a2bd8eed", "format": "json"}]: dispatch
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e9bd40a1-418b-4133-bfcb-60a8a2bd8eed", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.435339) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540435377, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2816, "num_deletes": 266, "total_data_size": 4226566, "memory_usage": 4292672, "flush_reason": "Manual Compaction"}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540448904, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2755961, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22783, "largest_seqno": 25594, "table_properties": {"data_size": 2744524, "index_size": 7302, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 27174, "raw_average_key_size": 22, "raw_value_size": 2720547, "raw_average_value_size": 2257, "num_data_blocks": 307, "num_entries": 1205, "num_filter_entries": 1205, "num_deletions": 266, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581414, "oldest_key_time": 1771581414, "file_creation_time": 1771581540, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 13876 microseconds, and 7157 cpu microseconds.
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.449211) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2755961 bytes OK
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.449333) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.451589) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.451614) EVENT_LOG_v1 {"time_micros": 1771581540451608, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.451671) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 4213392, prev total WAL file size 4213392, number of live WAL files 2.
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.453403) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2691KB)], [33(17MB)]
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540453460, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 20730407, "oldest_snapshot_seqno": -1}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 13383 keys, 19529775 bytes, temperature: kUnknown
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540542072, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 19529775, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19449742, "index_size": 45500, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33477, "raw_key_size": 356165, "raw_average_key_size": 26, "raw_value_size": 19218584, "raw_average_value_size": 1436, "num_data_blocks": 1741, "num_entries": 13383, "num_filter_entries": 13383, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581540, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.542434) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 19529775 bytes
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.544122) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 233.7 rd, 220.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 17.1 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(14.6) write-amplify(7.1) OK, records in: 13936, records dropped: 553 output_compression: NoCompression
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.544152) EVENT_LOG_v1 {"time_micros": 1771581540544139, "job": 18, "event": "compaction_finished", "compaction_time_micros": 88723, "compaction_time_cpu_micros": 54434, "output_level": 6, "num_output_files": 1, "total_output_size": 19529775, "num_input_records": 13936, "num_output_records": 13383, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540544677, "job": 18, "event": "table_file_deletion", "file_number": 35}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581540547753, "job": 18, "event": "table_file_deletion", "file_number": 33}
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.453283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547795) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547807) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:00.547811) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 20 09:59:01 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/691707941' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/691707941' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:01.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:01.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:59:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:01.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:59:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:01.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:59:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:01.744 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 09:59:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:01.744 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:59:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:59:02 np0005625204.localdomain podman[320914]: 2026-02-20 09:59:02.135705434 +0000 UTC m=+0.072334085 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:59:02 np0005625204.localdomain podman[320914]: 2026-02-20 09:59:02.148492585 +0000 UTC m=+0.085121216 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 09:59:02 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:02.151 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:59:01Z, description=, device_id=95ba958f-a3ec-4f5b-8859-f347b6468462, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df59577c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5957f70>], id=55476999-5229-4851-a7ff-9e5f33ce284f, ip_allocation=immediate, mac_address=fa:16:3e:16:8a:b8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:49Z, description=, dns_domain=, id=fc30869a-497f-4b61-b96d-28cefb439c42, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-129517273, port_security_enabled=True, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16743, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3039, status=ACTIVE, subnets=['5feacf59-4769-4952-9d4b-54ca0cbd92a2'], tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:58:50Z, vlan_transparent=None, network_id=fc30869a-497f-4b61-b96d-28cefb439c42, port_security_enabled=False, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3078, status=DOWN, tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:59:01Z on network fc30869a-497f-4b61-b96d-28cefb439c42
Feb 20 09:59:02 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:59:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:59:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/909841881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.183 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.275 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 09:59:02 np0005625204.localdomain dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 1 addresses
Feb 20 09:59:02 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host
Feb 20 09:59:02 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts
Feb 20 09:59:02 np0005625204.localdomain systemd[1]: tmp-crun.y2vRKr.mount: Deactivated successfully.
Feb 20 09:59:02 np0005625204.localdomain podman[320955]: 2026-02-20 09:59:02.429590509 +0000 UTC m=+0.085867718 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.524 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.526 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11268MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.527 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.527 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:59:02 np0005625204.localdomain ceph-mon[301857]: pgmap v431: 177 pgs: 177 active+clean; 602 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 2.8 MiB/s rd, 37 MiB/s wr, 170 op/s
Feb 20 09:59:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3305751398' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3305751398' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/909841881' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.773 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.773 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.774 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 09:59:02 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:02.861 264355 INFO neutron.agent.dhcp.agent [None req-faff82a9-e1fb-49d1-99ad-0350d66adec5 - - - - - -] DHCP configuration for ports {'55476999-5229-4851-a7ff-9e5f33ce284f'} is completed
Feb 20 09:59:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:02.871 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.003723) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543003773, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 297, "num_deletes": 256, "total_data_size": 82012, "memory_usage": 89232, "flush_reason": "Manual Compaction"}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543006540, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 53441, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25599, "largest_seqno": 25891, "table_properties": {"data_size": 51528, "index_size": 152, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4865, "raw_average_key_size": 17, "raw_value_size": 47694, "raw_average_value_size": 172, "num_data_blocks": 7, "num_entries": 277, "num_filter_entries": 277, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581540, "oldest_key_time": 1771581540, "file_creation_time": 1771581543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 2840 microseconds, and 743 cpu microseconds.
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.006570) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 53441 bytes OK
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.006586) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.008299) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.008317) EVENT_LOG_v1 {"time_micros": 1771581543008313, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.008339) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 79797, prev total WAL file size 79797, number of live WAL files 2.
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.012725) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323733' seq:0, type:0; will stop at (end)
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(52KB)], [36(18MB)]
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543012828, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 19583216, "oldest_snapshot_seqno": -1}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 13141 keys, 18931300 bytes, temperature: kUnknown
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543087756, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 18931300, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 18854102, "index_size": 43241, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32901, "raw_key_size": 352094, "raw_average_key_size": 26, "raw_value_size": 18628400, "raw_average_value_size": 1417, "num_data_blocks": 1636, "num_entries": 13141, "num_filter_entries": 13141, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581543, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.088204) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 18931300 bytes
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.090335) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 260.9 rd, 252.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 18.6 +0.0 blob) out(18.1 +0.0 blob), read-write-amplify(720.7) write-amplify(354.2) OK, records in: 13660, records dropped: 519 output_compression: NoCompression
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.090365) EVENT_LOG_v1 {"time_micros": 1771581543090352, "job": 20, "event": "compaction_finished", "compaction_time_micros": 75065, "compaction_time_cpu_micros": 41969, "output_level": 6, "num_output_files": 1, "total_output_size": 18931300, "num_input_records": 13660, "num_output_records": 13141, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543090526, "job": 20, "event": "table_file_deletion", "file_number": 38}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581543093206, "job": 20, "event": "table_file_deletion", "file_number": 36}
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.012553) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093288) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093291) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-09:59:03.093292) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4235731053' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:03.362 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 09:59:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:03.370 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 09:59:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:03.384 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 09:59:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:03.387 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 09:59:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:03.387 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:59:03 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:03.663 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T09:59:01Z, description=, device_id=95ba958f-a3ec-4f5b-8859-f347b6468462, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df57ed8b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df6287d90>], id=55476999-5229-4851-a7ff-9e5f33ce284f, ip_allocation=immediate, mac_address=fa:16:3e:16:8a:b8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:58:49Z, description=, dns_domain=, id=fc30869a-497f-4b61-b96d-28cefb439c42, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeTest-test-network-129517273, port_security_enabled=True, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=16743, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3039, status=ACTIVE, subnets=['5feacf59-4769-4952-9d4b-54ca0cbd92a2'], tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:58:50Z, vlan_transparent=None, network_id=fc30869a-497f-4b61-b96d-28cefb439c42, port_security_enabled=False, project_id=db4c2cde6adc4016a4bb7c41aa8e59c8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3078, status=DOWN, tags=[], tenant_id=db4c2cde6adc4016a4bb7c41aa8e59c8, updated_at=2026-02-20T09:59:01Z on network fc30869a-497f-4b61-b96d-28cefb439c42
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2407210374' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4235731053' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:03 np0005625204.localdomain systemd[1]: tmp-crun.e2PzWa.mount: Deactivated successfully.
Feb 20 09:59:03 np0005625204.localdomain podman[321015]: 2026-02-20 09:59:03.923169545 +0000 UTC m=+0.082715640 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:59:03 np0005625204.localdomain dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 1 addresses
Feb 20 09:59:03 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host
Feb 20 09:59:03 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts
Feb 20 09:59:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:04.055 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 09:59:04 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:04.264 264355 INFO neutron.agent.dhcp.agent [None req-7b843224-f6b6-45c3-8534-8c2e6b37671c - - - - - -] DHCP configuration for ports {'55476999-5229-4851-a7ff-9e5f33ce284f'} is completed
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.388 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.389 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 09:59:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e201 e201: 6 total, 6 up, 6 in
Feb 20 09:59:04 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bc980eec-f143-463a-b3c7-dd1de3420de8", "format": "json"}]: dispatch
Feb 20 09:59:04 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bc980eec-f143-463a-b3c7-dd1de3420de8", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:04 np0005625204.localdomain ceph-mon[301857]: pgmap v432: 177 pgs: 177 active+clean; 602 MiB data, 2.0 GiB used, 40 GiB / 42 GiB avail; 2.7 MiB/s rd, 18 MiB/s wr, 92 op/s
Feb 20 09:59:04 np0005625204.localdomain podman[321053]: 2026-02-20 09:59:04.792360967 +0000 UTC m=+0.076701537 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:59:04 np0005625204.localdomain dnsmasq[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/addn_hosts - 0 addresses
Feb 20 09:59:04 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/host
Feb 20 09:59:04 np0005625204.localdomain dnsmasq-dhcp[320767]: read /var/lib/neutron/dhcp/fc30869a-497f-4b61-b96d-28cefb439c42/opts
Feb 20 09:59:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:59:04 np0005625204.localdomain podman[321066]: 2026-02-20 09:59:04.915686635 +0000 UTC m=+0.090831998 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1770267347, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 20 09:59:04 np0005625204.localdomain podman[321066]: 2026-02-20 09:59:04.933964902 +0000 UTC m=+0.109110315 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, release=1770267347, config_id=openstack_network_exporter)
Feb 20 09:59:04 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.973 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:04Z|00393|binding|INFO|Releasing lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 from this chassis (sb_readonly=0)
Feb 20 09:59:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:04Z|00394|binding|INFO|Setting lport ed429b7d-6ede-437c-a873-d5a788bbc1e3 down in Southbound
Feb 20 09:59:04 np0005625204.localdomain kernel: device taped429b7d-6e left promiscuous mode
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.984 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:04.984 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc30869a-497f-4b61-b96d-28cefb439c42', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d5dde04c-263c-492d-a9af-a31ca9074a96, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ed429b7d-6ede-437c-a873-d5a788bbc1e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:59:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:04.987 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ed429b7d-6ede-437c-a873-d5a788bbc1e3 in datapath fc30869a-497f-4b61-b96d-28cefb439c42 unbound from our chassis
Feb 20 09:59:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:04.990 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc30869a-497f-4b61-b96d-28cefb439c42, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:59:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:04.991 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[0d63e503-abc3-4237-9c13-376afcfe3e80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:59:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:04.999 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:05.042 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:05.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:05 np0005625204.localdomain ceph-mon[301857]: osdmap e201: 6 total, 6 up, 6 in
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.021 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.022 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 09:59:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:06 np0005625204.localdomain dnsmasq[320889]: exiting on receipt of SIGTERM
Feb 20 09:59:06 np0005625204.localdomain podman[321111]: 2026-02-20 09:59:06.450889019 +0000 UTC m=+0.058594746 container kill 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:59:06 np0005625204.localdomain systemd[1]: libpod-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674.scope: Deactivated successfully.
Feb 20 09:59:06 np0005625204.localdomain podman[321126]: 2026-02-20 09:59:06.52472164 +0000 UTC m=+0.056121351 container died 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:59:06 np0005625204.localdomain systemd[1]: tmp-crun.aRUac6.mount: Deactivated successfully.
Feb 20 09:59:06 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674-userdata-shm.mount: Deactivated successfully.
Feb 20 09:59:06 np0005625204.localdomain podman[321126]: 2026-02-20 09:59:06.558987433 +0000 UTC m=+0.090387064 container cleanup 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:59:06 np0005625204.localdomain systemd[1]: libpod-conmon-8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674.scope: Deactivated successfully.
Feb 20 09:59:06 np0005625204.localdomain podman[321127]: 2026-02-20 09:59:06.596794605 +0000 UTC m=+0.124514104 container remove 8979c07c7aef23ff05e43a3b1f5b0aa1558afac3408faa8e55741fbe0ab7c674 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-31019f31-c68c-481a-9b72-3317c35499b9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.609 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:06 np0005625204.localdomain kernel: device tapca71dfc6-5b left promiscuous mode
Feb 20 09:59:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:06Z|00395|binding|INFO|Releasing lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 from this chassis (sb_readonly=0)
Feb 20 09:59:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:06Z|00396|binding|INFO|Setting lport ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 down in Southbound
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.622 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-31019f31-c68c-481a-9b72-3317c35499b9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'db4c2cde6adc4016a4bb7c41aa8e59c8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=02320057-bc72-4d8a-838a-f4d7286b15bc, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=ca71dfc6-5b76-44e7-a509-dbdd64a83fd3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.624 162652 INFO neutron.agent.ovn.metadata.agent [-] Port ca71dfc6-5b76-44e7-a509-dbdd64a83fd3 in datapath 31019f31-c68c-481a-9b72-3317c35499b9 unbound from our chassis
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.627 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 31019f31-c68c-481a-9b72-3317c35499b9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 09:59:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:06.628 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[90776264-ab6d-413a-9032-3575d7906376]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.633 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 09:59:06 np0005625204.localdomain ceph-mon[301857]: pgmap v434: 177 pgs: 177 active+clean; 769 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 5.4 MiB/s rd, 35 MiB/s wr, 174 op/s
Feb 20 09:59:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1770859695' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/812428714' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.801 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.802 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.802 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 09:59:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:06.803 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 09:59:06 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:06.874 264355 INFO neutron.agent.dhcp.agent [None req-17d2ed3d-d760-42a6-8ec9-5ce820d2cafd - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:59:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:07.000 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:59:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:07.235 264355 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:59:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:07.401 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 09:59:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:07.416 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 09:59:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:07.417 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 09:59:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:07Z|00397|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:59:07 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-095be6d19a7c831778435ebe06a1dd1952a6d636d2c8437ba888c4b45c7d81ce-merged.mount: Deactivated successfully.
Feb 20 09:59:07 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d31019f31\x2dc68c\x2d481a\x2d9b72\x2d3317c35499b9.mount: Deactivated successfully.
Feb 20 09:59:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:07.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9e3e5801-af35-415e-a8cf-27831a052e85", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9e3e5801-af35-415e-a8cf-27831a052e85", "format": "json"}]: dispatch
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 09:59:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:08 np0005625204.localdomain dnsmasq[320767]: exiting on receipt of SIGTERM
Feb 20 09:59:08 np0005625204.localdomain podman[321172]: 2026-02-20 09:59:08.335696676 +0000 UTC m=+0.063946769 container kill fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 09:59:08 np0005625204.localdomain systemd[1]: libpod-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854.scope: Deactivated successfully.
Feb 20 09:59:08 np0005625204.localdomain podman[321184]: 2026-02-20 09:59:08.397954963 +0000 UTC m=+0.046681643 container died fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 09:59:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854-userdata-shm.mount: Deactivated successfully.
Feb 20 09:59:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-e0c41e1a106862f8d161432c72fe9494234bced6572d34f77ad98b154cf87b9a-merged.mount: Deactivated successfully.
Feb 20 09:59:08 np0005625204.localdomain podman[321184]: 2026-02-20 09:59:08.483776638 +0000 UTC m=+0.132503318 container cleanup fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:59:08 np0005625204.localdomain systemd[1]: libpod-conmon-fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854.scope: Deactivated successfully.
Feb 20 09:59:08 np0005625204.localdomain podman[321186]: 2026-02-20 09:59:08.505461009 +0000 UTC m=+0.146996080 container remove fc0fe7efb72dacbc849ffcd6ffe93f25d7abfd7f587261a79cd8781d4ccd4854 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fc30869a-497f-4b61-b96d-28cefb439c42, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:59:08 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dfc30869a\x2d497f\x2d4b61\x2db96d\x2d28cefb439c42.mount: Deactivated successfully.
Feb 20 09:59:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:08.728 264355 INFO neutron.agent.dhcp.agent [None req-8f1db16b-8bb3-49dd-a911-4681008d3273 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:59:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:08.729 264355 INFO neutron.agent.dhcp.agent [None req-8f1db16b-8bb3-49dd-a911-4681008d3273 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 09:59:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e202 e202: 6 total, 6 up, 6 in
Feb 20 09:59:09 np0005625204.localdomain ceph-mon[301857]: pgmap v435: 177 pgs: 177 active+clean; 769 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 4.7 MiB/s rd, 31 MiB/s wr, 151 op/s
Feb 20 09:59:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "29f88fa1-d324-41a4-9d70-2be9681e5831", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "29f88fa1-d324-41a4-9d70-2be9681e5831", "format": "json"}]: dispatch
Feb 20 09:59:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:10.047 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:10.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:10 np0005625204.localdomain ceph-mon[301857]: osdmap e202: 6 total, 6 up, 6 in
Feb 20 09:59:10 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e203 e203: 6 total, 6 up, 6 in
Feb 20 09:59:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:59:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:59:11 np0005625204.localdomain podman[321216]: 2026-02-20 09:59:11.125833976 +0000 UTC m=+0.068268481 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 20 09:59:11 np0005625204.localdomain podman[321217]: 2026-02-20 09:59:11.192742314 +0000 UTC m=+0.125035992 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 20 09:59:11 np0005625204.localdomain podman[321216]: 2026-02-20 09:59:11.223471801 +0000 UTC m=+0.165906286 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:59:11 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:59:11 np0005625204.localdomain podman[321217]: 2026-02-20 09:59:11.27824089 +0000 UTC m=+0.210534587 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 20 09:59:11 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:59:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:11 np0005625204.localdomain ceph-mon[301857]: pgmap v437: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 801 MiB data, 2.4 GiB used, 40 GiB / 42 GiB avail; 2.7 MiB/s rd, 22 MiB/s wr, 91 op/s
Feb 20 09:59:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9e3e5801-af35-415e-a8cf-27831a052e85", "format": "json"}]: dispatch
Feb 20 09:59:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9e3e5801-af35-415e-a8cf-27831a052e85", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:59:11 np0005625204.localdomain ceph-mon[301857]: osdmap e203: 6 total, 6 up, 6 in
Feb 20 09:59:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:13.236 2 INFO neutron.agent.securitygroups_rpc [None req-02b2e18c-e6bb-49a4-a8f1-2084c77a3d21 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['aead394c-a7d3-40bc-acee-c30aa527c351']
Feb 20 09:59:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:13.237 2 INFO neutron.agent.securitygroups_rpc [None req-056c38b9-a9d3-4d30-8e17-1a44ab4fc9c9 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['aead394c-a7d3-40bc-acee-c30aa527c351']
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e204 e204: 6 total, 6 up, 6 in
Feb 20 09:59:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:13.834 2 INFO neutron.agent.securitygroups_rpc [None req-5619c222-5cd3-438e-b875-1e00ee8b5a9d b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:13 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:13.835 2 INFO neutron.agent.securitygroups_rpc [None req-76a11935-2f93-444b-98b0-ed592d92678c b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: pgmap v439: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 857 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 2.3 MiB/s rd, 27 MiB/s wr, 159 op/s
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "93d1cbc6-d6ab-4443-b83a-ea63f0f8018d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "93d1cbc6-d6ab-4443-b83a-ea63f0f8018d", "format": "json"}]: dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2887261055' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2887261055' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2616686135' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2616686135' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:13 np0005625204.localdomain ceph-mon[301857]: osdmap e204: 6 total, 6 up, 6 in
Feb 20 09:59:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:14.332 2 INFO neutron.agent.securitygroups_rpc [None req-92dae041-433a-447b-819c-ca016de78f58 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:14.395 2 INFO neutron.agent.securitygroups_rpc [None req-2cd7aa12-02dc-45d2-aacd-015fd7ca5faf b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:14.897 2 INFO neutron.agent.securitygroups_rpc [None req-8704f1e9-0786-45ef-9124-4ff6c69c9edf b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:14.898 2 INFO neutron.agent.securitygroups_rpc [None req-82bfdf30-c718-45ba-8302-76a78964efac b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:14.898 2 INFO neutron.agent.securitygroups_rpc [None req-6f3202e0-dd36-49f7-90de-4aa05c7d3120 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:14 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:14.899 2 INFO neutron.agent.securitygroups_rpc [None req-348b4a4e-d300-45af-acd6-5c08e553ddf3 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:15.051 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:15.053 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:15.053 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:15.053 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:15 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:15.468 2 INFO neutron.agent.securitygroups_rpc [None req-9fbdc85e-428a-4317-ba9b-cd92888d9cb2 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:15 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:15.470 2 INFO neutron.agent.securitygroups_rpc [None req-db51d6cf-e253-422e-b04f-9b8629f57782 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['f624598f-524d-4cc1-8bc8-0fb402ed46a6']
Feb 20 09:59:15 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:15.980 2 INFO neutron.agent.securitygroups_rpc [None req-af481eae-6194-40c9-88e4-bb7253323390 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['16efbbcf-ddc6-4434-9318-5d841ffddaef']
Feb 20 09:59:15 np0005625204.localdomain ceph-mon[301857]: pgmap v441: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 857 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 93 KiB/s rd, 15 MiB/s wr, 131 op/s
Feb 20 09:59:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:16.010 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:16.011 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:16.603 2 INFO neutron.agent.securitygroups_rpc [None req-8397ac5f-4c0e-48b4-864b-bbce3e3a32e8 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['868259ee-6cd3-44fa-b964-b511ba69ce8b']
Feb 20 09:59:16 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:16.738 2 INFO neutron.agent.securitygroups_rpc [None req-248bf9b5-6ff0-42de-8583-69a922702068 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['868259ee-6cd3-44fa-b964-b511ba69ce8b']
Feb 20 09:59:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e205 e205: 6 total, 6 up, 6 in
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ec89c339-8feb-4567-a8fe-cb38fb00b60f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: pgmap v442: 177 pgs: 177 active+clean; 835 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 126 KiB/s rd, 17 MiB/s wr, 184 op/s
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ec89c339-8feb-4567-a8fe-cb38fb00b60f", "format": "json"}]: dispatch
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 09:59:17 np0005625204.localdomain ceph-mon[301857]: osdmap e205: 6 total, 6 up, 6 in
Feb 20 09:59:17 np0005625204.localdomain podman[241968]: time="2026-02-20T09:59:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:59:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:59:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:59:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:59:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18354 "" "Go-http-client/1.1"
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.319 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.323 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b50408a6-f1e1-4511-bb79-1516f3a6cbff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.320775', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd2401bde-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'ef80806b076098a93f2442ea2acb311d7a0791674fc8751ac507e6e3be259c78'}]}, 'timestamp': '2026-02-20 09:59:18.324528', '_unique_id': 'f57be228435647c58202a499e4ba1368'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.325 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.326 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e206 e206: 6 total, 6 up, 6 in
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.350 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.350 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1755dc8-4733-450d-b8f5-6ba5b32d7368', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.326730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2441752-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'c34e8599db8aca9926a70877f22460a183c07535890d7bc63a6628c3a7df7851'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.326730', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24425e4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '8e648ee3b9b6abf139b947304aedca2ce3b8d203012a46d60cf086e9005d5b85'}]}, 'timestamp': '2026-02-20 09:59:18.350939', '_unique_id': '6703c9e1cf6140e49a30412aa758f2ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.351 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.352 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ae143fbb-cc78-456b-8b51-3907e9285d93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.352852', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd2447c38-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '9756a2c4a54c2ddd7d7b1f6fe250e635fe9ec5c6d925140848385273e1dca8bc'}]}, 'timestamp': '2026-02-20 09:59:18.353153', '_unique_id': 'd0c339835a654edfac92b6110469754a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.353 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.354 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.365 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '084730ce-2e99-4dc5-872c-85fb67319720', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.354520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd2464c5c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '38633e92014546172936328eb874ca24bc206a189264a14af0d4d427381761fa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.354520', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24658f0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': 'f55295629faf5a1a816f89324770016eb8f4fc142c32f4e16ef6ff40fe3d2273'}]}, 'timestamp': '2026-02-20 09:59:18.367992', '_unique_id': '0c7337abfdef46c79cb953f907b224b9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.370 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.370 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.385 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 18870000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1aaa8438-6358-4da2-b711-92cdd50c9d0d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 18870000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:59:18.371054', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'd24996a0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.624978983, 'message_signature': '83153108c7c9477705511cf6e1bf329322df1e9c4b7e888e44e24a94e9c2bf79'}]}, 'timestamp': '2026-02-20 09:59:18.386842', '_unique_id': '8372dad0397949a988dea59b3accff43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.388 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.390 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.390 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.390 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd1571ecd-f191-4570-ab8e-5907ab470046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.390380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24a3b3c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': 'edd3c502dea069fc0d33a567ccd65b194c65a371adc099ecb6204522d5210f68'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.390380', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24a4f46-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '0136080f8c88e9d093f609b35d7f2968d94c35beee13c019e76639f80f52d17a'}]}, 'timestamp': '2026-02-20 09:59:18.391383', '_unique_id': '515d2316d2e245d7882ffcc34c8aa3aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.392 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.393 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.393 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '62dcd81c-0e3e-417f-8673-3784886d04e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.393784', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24abeb8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'f793bf70744fc94a00a589bf7c0167ee36dc848560f7f4e748ec4328fd72cff5'}]}, 'timestamp': '2026-02-20 09:59:18.394263', '_unique_id': '069774aba70f45b9adb7e90b1f6dd27e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.395 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.396 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.396 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.397 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dd1e34a-e254-45f2-bfd2-f43853dee2ad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.396703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24b310e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'f019db74b348b5cab53be197aa4fdcae964779dbec9812b6f999ce593725448c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.396703', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24b439c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'd085530df47e212a585e2dfa8e69acf0f9d1e4a9770a77f42c4c742e1e79e84e'}]}, 'timestamp': '2026-02-20 09:59:18.397668', '_unique_id': 'd6b9787f82f34f8080f172e19343800e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.398 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.399 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0222a9e-6b7f-4eca-911f-6dfeb62dfc30', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.399751', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24ba760-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '89320e7351cf2056cce931d310b2842f4b2a9a701d92a31b17041f87659edc6e'}]}, 'timestamp': '2026-02-20 09:59:18.400214', '_unique_id': '87f08278c9bc4ed59fa5e7f9526b4902'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.401 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e175e9c-70fb-4569-aa41-5144468e6450', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.402282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24c0a20-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '1a15be2b4d6ab008da60eb527d178c64eb781a00f1852bcae597172b4c4e124d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.402282', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24c1df8-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.593722221, 'message_signature': '4ef65a0f3ce8131a6e258dc5be1b64c90926a205e266a27f5eb96d858bf2e2dd'}]}, 'timestamp': '2026-02-20 09:59:18.403249', '_unique_id': '84ffe443c8134b79bb2726773af3ca6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.405 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.406 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '84cf1918-5238-4979-8c15-cbace1c22c69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.405672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24c8eb4-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '0179f39443ea90b02ea08402da55c7b36e38956b6ac5c03db9725918d1862873'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.405672', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24c9e5e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '31ec141441995e7f944cea4a86ee927c86c1cd1b56ffb0b12a01f1b541d782bf'}]}, 'timestamp': '2026-02-20 09:59:18.406507', '_unique_id': 'd4c68cca99eb4fc9a0092f327ad2c4f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.407 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.408 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.408 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.409 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dcba8144-e7d8-4be9-9c4e-1ff7fbb7e27b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.408697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24d04ac-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '52d12aa817d16d64481e792c380ebfe0dbb77e6452da9b34c004aa6337c72c07'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.408697', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24d1442-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'dd114991cfeb47ac009d279865e907919033f6172e09b32473bd9eef832c3e30'}]}, 'timestamp': '2026-02-20 09:59:18.409546', '_unique_id': 'bc84ad709e7a489ea90095a6d364cf70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.410 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.411 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd468149f-d960-41f9-a3b1-b6e82e9f1661', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.411583', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24d775c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'c0cd07d1b60d254960152c229660b1ec68bc18d70f2d4a7375b5c420cb374398'}]}, 'timestamp': '2026-02-20 09:59:18.412090', '_unique_id': '278967fa9581497eb967cef853c0f15e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.412 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c1de490-9a13-4853-90fd-bf6ff0177128', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.414145', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24dd904-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '7ba451f341cef8c0be3c5d2a4ea25707b61223be8043a85d718bb92e07b1f611'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.414145', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24de976-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'c220d6a5f6b5bd337094faf221bc6b27e54d45603771b9fe422e1784b9c5958b'}]}, 'timestamp': '2026-02-20 09:59:18.414984', '_unique_id': 'b09a0fcd78e544229018be28bc17927f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.417 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.417 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cc2621a8-9323-4038-b6a2-b091b88f994b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.417542', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24e5f1e-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '0ca78064d1bdb496b6ab7091bb44ae854bc42b50a2e3b5b9dc10260cf65957df'}]}, 'timestamp': '2026-02-20 09:59:18.418027', '_unique_id': '75dcec9d16184fe1965586c5c1fb5d9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.418 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.420 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.420 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5dbf80db-866c-4b26-9996-2ce7534fa5a9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.420161', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24ec4c2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '75a51ebb29df916b0cb405ac8b18b1086a242a8ada8a4641ac0e2708a5575cd9'}]}, 'timestamp': '2026-02-20 09:59:18.420768', '_unique_id': '1c2e62ecd7d74b12b533d4e9ade8a2c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.421 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.422 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.422 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac274e2f-1a1f-4d75-b3ba-5e851cceb700', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.422705', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd24f273c-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': 'd6e99cec62f53e216e16615b8f39e8f8b1c1c19f40076498bde44b58337a8878'}]}, 'timestamp': '2026-02-20 09:59:18.423121', '_unique_id': '3cc8e9e7b1ac42948d5f0083b98b0413'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.423 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.424 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.425 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81fe8b6f-5d27-4b57-b588-a20c8d1b080d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T09:59:18.424581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'd24f70f2-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': '63e0c186b0a1374cbd6013846ba036e6c16ee85bc213c4870379b9d4b0abaf54'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T09:59:18.424581', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'd24f7f98-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.565903264, 'message_signature': 'ca87fa1ac095b3536ce2a2bbb162ca3a9696b27f9bf8424770e1185213d5572a'}]}, 'timestamp': '2026-02-20 09:59:18.425325', '_unique_id': 'ef7e86e4f864459eb29b361c3a1bcbf9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.426 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.427 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.427 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5a816ce-9419-4a92-92b2-4ca8bf31b2f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T09:59:18.427124', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'd24fd2cc-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.624978983, 'message_signature': '8081a672e73e09c3a3fd5459da9d844513568e5919625dd06d25c4452c8f922a'}]}, 'timestamp': '2026-02-20 09:59:18.427456', '_unique_id': 'caf19ab0db20431cb52bcc5fb7b32040'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.428 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d8e64da-fb24-4ae6-a820-bb5dc51c7704', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.428945', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd25018e0-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '692ef11da7b3e68911f28f2ef830042b29744ded1fbe81cd6040861912a49550'}]}, 'timestamp': '2026-02-20 09:59:18.429315', '_unique_id': '1b487574895e48ccaf0a805df87dc64c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.430 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6f9e969-5370-4383-b565-6380f819ef39', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T09:59:18.430994', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'd2506a98-0e42-11f1-9294-fa163ef029e2', 'monotonic_time': 12117.559948112, 'message_signature': '18a1e0f537eb3bce45e435b6f6b15aba1608eb6977baa1344ecba340e75dd4d2'}]}, 'timestamp': '2026-02-20 09:59:18.431374', '_unique_id': 'fd7578575f6547f18dd85410cdb8cdcf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 09:59:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 09:59:18.431 12 ERROR oslo_messaging.notify.messaging 
Feb 20 09:59:18 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:18.745 2 INFO neutron.agent.securitygroups_rpc [None req-00ebe7d1-26f1-436c-a8d3-18ae30d4ceca b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['350e41a6-6799-4255-abb2-bda7d280e893']
Feb 20 09:59:18 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:18.935 2 INFO neutron.agent.securitygroups_rpc [None req-ab9099b6-173a-4528-9f76-ddc0c1b400ee b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['350e41a6-6799-4255-abb2-bda7d280e893']
Feb 20 09:59:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:59:19 np0005625204.localdomain podman[321259]: 2026-02-20 09:59:19.149652615 +0000 UTC m=+0.087831997 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 09:59:19 np0005625204.localdomain podman[321259]: 2026-02-20 09:59:19.160734713 +0000 UTC m=+0.098914085 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:59:19 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: pgmap v444: 177 pgs: 177 active+clean; 835 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 36 KiB/s rd, 3.3 MiB/s wr, 56 op/s
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: osdmap e206: 6 total, 6 up, 6 in
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2185046675' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2185046675' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:59:19 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e207 e207: 6 total, 6 up, 6 in
Feb 20 09:59:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:19.464 2 INFO neutron.agent.securitygroups_rpc [None req-483e27bf-9a6d-411a-b87b-b6f37447f4e8 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']
Feb 20 09:59:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:19.680 2 INFO neutron.agent.securitygroups_rpc [None req-ac337b7c-a535-40e3-b3bd-b5580b0e941d b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']
Feb 20 09:59:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:19.817 2 INFO neutron.agent.securitygroups_rpc [None req-c5f864ad-b631-4012-870f-280605d80045 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']
Feb 20 09:59:19 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:19.986 2 INFO neutron.agent.securitygroups_rpc [None req-ac1e61bd-c67f-4839-a794-1523a2080faa b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']
Feb 20 09:59:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:20.194 2 INFO neutron.agent.securitygroups_rpc [None req-7e4bce2e-fb70-442e-b47b-26c11122b51c b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']
Feb 20 09:59:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:20.356 2 INFO neutron.agent.securitygroups_rpc [None req-9c6d9773-e163-46df-a8b7-894bf61ef867 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['0ead8c40-caeb-4988-8a84-cd8090714d6e']
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "17a556c8-9ef9-44cc-8e06-318686688991", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "17a556c8-9ef9-44cc-8e06-318686688991", "format": "json"}]: dispatch
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: osdmap e207: 6 total, 6 up, 6 in
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1875354009' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1875354009' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 20 09:59:20 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3226930422' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:20 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:20.777 2 INFO neutron.agent.securitygroups_rpc [None req-48000ec6-91b4-434a-ab1c-3ae5eaf7b735 b8c5cd85f3954e32bc8ce4cc39694a8d 25c20dca96c143a09e2486b79027be5d - - default default] Security group rule updated ['cefa71e1-4cfe-4451-bb5c-ca133ddcf1fd']
Feb 20 09:59:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:21.013 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:21.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:21.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:21.016 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:21.084 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:21.085 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e207 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:21 np0005625204.localdomain ceph-mon[301857]: pgmap v447: 177 pgs: 177 active+clean; 850 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 46 KiB/s rd, 6.7 MiB/s wr, 73 op/s
Feb 20 09:59:21 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3226930422' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e208 e208: 6 total, 6 up, 6 in
Feb 20 09:59:22 np0005625204.localdomain ceph-mon[301857]: osdmap e208: 6 total, 6 up, 6 in
Feb 20 09:59:22 np0005625204.localdomain ceph-mon[301857]: pgmap v449: 177 pgs: 177 active+clean; 886 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 111 KiB/s rd, 23 MiB/s wr, 173 op/s
Feb 20 09:59:22 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e209 e209: 6 total, 6 up, 6 in
Feb 20 09:59:22 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:22.576 2 INFO neutron.agent.securitygroups_rpc [None req-56dcf14a-a69d-4366-adc0-f7e0579b7cd8 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group rule updated ['9d889f17-f220-427e-bd61-2fb67b868596']
Feb 20 09:59:22 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:22.694 2 INFO neutron.agent.securitygroups_rpc [None req-87bed7c3-5c32-49ad-acc0-0a3642727263 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group rule updated ['9d889f17-f220-427e-bd61-2fb67b868596']
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e210 e210: 6 total, 6 up, 6 in
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: osdmap e209: 6 total, 6 up, 6 in
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "17a556c8-9ef9-44cc-8e06-318686688991", "format": "json"}]: dispatch
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "17a556c8-9ef9-44cc-8e06-318686688991", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:23 np0005625204.localdomain ceph-mon[301857]: osdmap e210: 6 total, 6 up, 6 in
Feb 20 09:59:24 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e211 e211: 6 total, 6 up, 6 in
Feb 20 09:59:25 np0005625204.localdomain ceph-mon[301857]: pgmap v452: 177 pgs: 177 active+clean; 886 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 121 KiB/s rd, 25 MiB/s wr, 190 op/s
Feb 20 09:59:25 np0005625204.localdomain ceph-mon[301857]: osdmap e211: 6 total, 6 up, 6 in
Feb 20 09:59:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2f097eba-f8b3-4922-a9b2-f188b7fb5c09", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2f097eba-f8b3-4922-a9b2-f188b7fb5c09", "format": "json"}]: dispatch
Feb 20 09:59:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e212 e212: 6 total, 6 up, 6 in
Feb 20 09:59:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:26.086 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:26.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:26.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:26.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:59:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:26.119 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:26.121 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:59:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:59:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:59:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:59:27 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:59:27 np0005625204.localdomain ceph-mon[301857]: pgmap v454: 177 pgs: 177 active+clean; 975 MiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 101 KiB/s rd, 31 MiB/s wr, 160 op/s
Feb 20 09:59:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:59:27 np0005625204.localdomain ceph-mon[301857]: osdmap e212: 6 total, 6 up, 6 in
Feb 20 09:59:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ec89c339-8feb-4567-a8fe-cb38fb00b60f", "format": "json"}]: dispatch
Feb 20 09:59:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ec89c339-8feb-4567-a8fe-cb38fb00b60f", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:27 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3674888499' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 09:59:27 np0005625204.localdomain systemd[1]: tmp-crun.SWTleF.mount: Deactivated successfully.
Feb 20 09:59:27 np0005625204.localdomain podman[321277]: 2026-02-20 09:59:27.148837785 +0000 UTC m=+0.078817382 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 09:59:27 np0005625204.localdomain podman[321277]: 2026-02-20 09:59:27.187253736 +0000 UTC m=+0.117233683 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 09:59:27 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:59:27 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 09:59:27.219 2 INFO neutron.agent.securitygroups_rpc [req-f264314a-f5fb-4167-9b9a-7fac156c481a req-f4a185a8-c20a-4c61-b6ac-a21285bd72eb 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group member updated ['9d889f17-f220-427e-bd61-2fb67b868596']
Feb 20 09:59:28 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e213 e213: 6 total, 6 up, 6 in
Feb 20 09:59:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1073711962' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:28 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1073711962' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:28 np0005625204.localdomain ceph-mon[301857]: osdmap e213: 6 total, 6 up, 6 in
Feb 20 09:59:29 np0005625204.localdomain ceph-mon[301857]: pgmap v456: 177 pgs: 177 active+clean; 975 MiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 82 KiB/s rd, 25 MiB/s wr, 129 op/s
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2f097eba-f8b3-4922-a9b2-f188b7fb5c09", "format": "json"}]: dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2f097eba-f8b3-4922-a9b2-f188b7fb5c09", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "93d1cbc6-d6ab-4443-b83a-ea63f0f8018d", "format": "json"}]: dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "93d1cbc6-d6ab-4443-b83a-ea63f0f8018d", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/751048439' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2224840685' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3380022144' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:30 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3380022144' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:31.122 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:31.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:31.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:31.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:31.150 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:31.151 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:59:31 np0005625204.localdomain ceph-mon[301857]: pgmap v458: 177 pgs: 177 active+clean; 983 MiB data, 3.3 GiB used, 39 GiB / 42 GiB avail; 96 KiB/s rd, 23 MiB/s wr, 143 op/s
Feb 20 09:59:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 09:59:31 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3197445632' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:31 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3197445632' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e214 e214: 6 total, 6 up, 6 in
Feb 20 09:59:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 09:59:33 np0005625204.localdomain systemd[1]: tmp-crun.RHrSIn.mount: Deactivated successfully.
Feb 20 09:59:33 np0005625204.localdomain podman[321300]: 2026-02-20 09:59:33.158831018 +0000 UTC m=+0.094599004 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 09:59:33 np0005625204.localdomain podman[321300]: 2026-02-20 09:59:33.167464571 +0000 UTC m=+0.103232567 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 09:59:33 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: pgmap v459: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 157 KiB/s rd, 35 MiB/s wr, 241 op/s
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "29f88fa1-d324-41a4-9d70-2be9681e5831", "format": "json"}]: dispatch
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "29f88fa1-d324-41a4-9d70-2be9681e5831", "force": true, "format": "json"}]: dispatch
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:33 np0005625204.localdomain ceph-mon[301857]: osdmap e214: 6 total, 6 up, 6 in
Feb 20 09:59:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 09:59:35 np0005625204.localdomain podman[321324]: 2026-02-20 09:59:35.139379481 +0000 UTC m=+0.079172843 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf 
as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 20 09:59:35 np0005625204.localdomain podman[321324]: 2026-02-20 09:59:35.151329095 +0000 UTC m=+0.091122467 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-02-05T04:57:10Z, release=1770267347, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 09:59:35 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 09:59:35 np0005625204.localdomain ceph-mon[301857]: pgmap v461: 177 pgs: 177 active+clean; 1.1 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 102 KiB/s rd, 18 MiB/s wr, 154 op/s
Feb 20 09:59:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e215 e215: 6 total, 6 up, 6 in
Feb 20 09:59:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:36.153 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:36.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:36.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:36.155 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:36.184 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:36 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:36.184 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:36 np0005625204.localdomain ceph-mon[301857]: osdmap e215: 6 total, 6 up, 6 in
Feb 20 09:59:36 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:59:36 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 09:59:36 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 09:59:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:36 np0005625204.localdomain sshd[321344]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 09:59:37 np0005625204.localdomain sshd[321344]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 09:59:37 np0005625204.localdomain ceph-mon[301857]: pgmap v463: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 3.1 MiB/s rd, 36 MiB/s wr, 314 op/s
Feb 20 09:59:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:59:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:59:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/597628337' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/597628337' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e216 e216: 6 total, 6 up, 6 in
Feb 20 09:59:38 np0005625204.localdomain ceph-mon[301857]: osdmap e216: 6 total, 6 up, 6 in
Feb 20 09:59:38 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/359541798' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:38 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/359541798' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: pgmap v465: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 3.9 MiB/s rd, 23 MiB/s wr, 204 op/s
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "format": "json"}]: dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1664593204' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:39 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1664593204' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:40 np0005625204.localdomain ceph-mon[301857]: pgmap v466: 177 pgs: 177 active+clean; 1.0 GiB data, 3.6 GiB used, 38 GiB / 42 GiB avail; 3.5 MiB/s rd, 21 MiB/s wr, 209 op/s
Feb 20 09:59:40 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2948968927' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:40 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2948968927' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:41.186 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:41.187 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:41.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:41.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:41.222 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:41.223 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 09:59:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 09:59:42 np0005625204.localdomain podman[321347]: 2026-02-20 09:59:42.165281257 +0000 UTC m=+0.094365786 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 20 09:59:42 np0005625204.localdomain podman[321347]: 2026-02-20 09:59:42.204047478 +0000 UTC m=+0.133131967 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 09:59:42 np0005625204.localdomain systemd[1]: tmp-crun.ThTrLq.mount: Deactivated successfully.
Feb 20 09:59:42 np0005625204.localdomain podman[321346]: 2026-02-20 09:59:42.227217674 +0000 UTC m=+0.158760628 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 09:59:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:42Z|00398|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 09:59:42 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 09:59:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:42.294 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:42 np0005625204.localdomain podman[321346]: 2026-02-20 09:59:42.338299088 +0000 UTC m=+0.269842022 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 09:59:42 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 09:59:42 np0005625204.localdomain ceph-mon[301857]: pgmap v467: 177 pgs: 177 active+clean; 247 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.0 MiB/s rd, 17 MiB/s wr, 274 op/s
Feb 20 09:59:42 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 e217: 6 total, 6 up, 6 in
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "format": "json"}]: dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 09:59:43 np0005625204.localdomain ceph-mon[301857]: osdmap e217: 6 total, 6 up, 6 in
Feb 20 09:59:45 np0005625204.localdomain ceph-mon[301857]: pgmap v469: 177 pgs: 177 active+clean; 247 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 73 KiB/s rd, 46 KiB/s wr, 121 op/s
Feb 20 09:59:45 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3502069534' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:45 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3502069534' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:46.224 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:46 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 20 09:59:46 np0005625204.localdomain sudo[321390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 09:59:46 np0005625204.localdomain sudo[321390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:59:46 np0005625204.localdomain sudo[321390]: pam_unix(sudo:session): session closed for user root
Feb 20 09:59:46 np0005625204.localdomain sudo[321408]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 09:59:46 np0005625204.localdomain sudo[321408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:59:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:59:47 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 14K writes, 56K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 14K writes, 4683 syncs, 3.11 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 9472 writes, 33K keys, 9472 commit groups, 1.0 writes per commit group, ingest: 25.58 MB, 0.04 MB/s
                                                          Interval WAL: 9472 writes, 3987 syncs, 2.38 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:59:47 np0005625204.localdomain sudo[321408]: pam_unix(sudo:session): session closed for user root
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: pgmap v470: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 567 KiB/s rd, 3.1 MiB/s wr, 233 op/s
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "auth_id": "Joe", "tenant_id": "8fac2513a3ab4162a13f560c6301f671", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/52918c2e-6ed5-45c2-9872-88b3bd77010f/6bdf5028-e9f4-4a0d-812d-96892d0e92d0", "osd", "allow rw pool=manila_data namespace=fsvolumens_52918c2e-6ed5-45c2-9872-88b3bd77010f", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/52918c2e-6ed5-45c2-9872-88b3bd77010f/6bdf5028-e9f4-4a0d-812d-96892d0e92d0", "osd", "allow rw pool=manila_data namespace=fsvolumens_52918c2e-6ed5-45c2-9872-88b3bd77010f", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/257417761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/257417761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:59:47 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 09:59:47 np0005625204.localdomain sudo[321458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 09:59:47 np0005625204.localdomain sudo[321458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 09:59:47 np0005625204.localdomain sudo[321458]: pam_unix(sudo:session): session closed for user root
Feb 20 09:59:47 np0005625204.localdomain podman[241968]: time="2026-02-20T09:59:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 09:59:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:59:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 09:59:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:09:59:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1"
Feb 20 09:59:49 np0005625204.localdomain ceph-mon[301857]: pgmap v471: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 471 KiB/s rd, 2.6 MiB/s wr, 193 op/s
Feb 20 09:59:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3243055867' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3243055867' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "format": "json"}]: dispatch
Feb 20 09:59:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 09:59:50 np0005625204.localdomain podman[321476]: 2026-02-20 09:59:50.161798917 +0000 UTC m=+0.085580189 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute)
Feb 20 09:59:50 np0005625204.localdomain podman[321476]: 2026-02-20 09:59:50.202087054 +0000 UTC m=+0.125868306 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 09:59:50 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/813520502' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/813520502' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:51.226 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:51.231 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: pgmap v472: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 462 KiB/s rd, 2.6 MiB/s wr, 182 op/s
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/813520502' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/813520502' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 09:59:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 20K writes, 74K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 20K writes, 6880 syncs, 2.95 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 14K writes, 49K keys, 14K commit groups, 1.0 writes per commit group, ingest: 38.58 MB, 0.06 MB/s
                                                          Interval WAL: 14K writes, 5950 syncs, 2.40 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 09:59:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: pgmap v473: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 441 KiB/s rd, 2.6 MiB/s wr, 141 op/s
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1899022262' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1899022262' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 20 09:59:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b9bf80c2-3fd5-47b7-9223-4a556ce7ef61", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b9bf80c2-3fd5-47b7-9223-4a556ce7ef61", "format": "json"}]: dispatch
Feb 20 09:59:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 09:59:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "auth_id": "Joe", "tenant_id": "f656f9df86ae4c53b02f471da5bd5ad7", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1818888118' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 09:59:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1818888118' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 09:59:55 np0005625204.localdomain ceph-mon[301857]: pgmap v474: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 416 KiB/s rd, 2.5 MiB/s wr, 133 op/s
Feb 20 09:59:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:56.260 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:56.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 09:59:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:56.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5030 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 09:59:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:56.262 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:56.263 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:56.263 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-622295165", "format": "json"} : dispatch
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-622295165", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e5c48d9-dcb2-469b-91b0-c7f808d95c49/17cc3f20-7a80-42d5-908d-fb1e4bba5d82", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-622295165", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e5c48d9-dcb2-469b-91b0-c7f808d95c49/17cc3f20-7a80-42d5-908d-fb1e4bba5d82", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 09:59:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:59:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 09:59:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:59:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   09:59:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 09:59:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: pgmap v475: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 398 KiB/s rd, 2.2 MiB/s wr, 161 op/s
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "auth_id": "tempest-cephx-id-622295165", "tenant_id": "f656f9df86ae4c53b02f471da5bd5ad7", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "format": "json"}]: dispatch
Feb 20 09:59:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 09:59:58 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 09:59:58 np0005625204.localdomain systemd[1]: tmp-crun.neYjNm.mount: Deactivated successfully.
Feb 20 09:59:58 np0005625204.localdomain podman[321496]: 2026-02-20 09:59:58.167917655 +0000 UTC m=+0.103347150 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 09:59:58 np0005625204.localdomain podman[321496]: 2026-02-20 09:59:58.204041505 +0000 UTC m=+0.139471000 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 09:59:58 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 09:59:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:58.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:58.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 09:59:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:58.940 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:59:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:58.941 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:58 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:58.942 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 09:59:59 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 09:59:59.274 264355 INFO neutron.agent.linux.ip_lib [None req-f95d6219-14d7-4ca9-bfb9-013969994773 - - - - - -] Device tapda8c9dd1-5a cannot be used as it has no MAC address
Feb 20 09:59:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:59.305 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:59 np0005625204.localdomain kernel: device tapda8c9dd1-5a entered promiscuous mode
Feb 20 09:59:59 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581599.3182] manager: (tapda8c9dd1-5a): new Generic device (/org/freedesktop/NetworkManager/Devices/62)
Feb 20 09:59:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:59Z|00399|binding|INFO|Claiming lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 for this chassis.
Feb 20 09:59:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:59Z|00400|binding|INFO|da8c9dd1-5a21-4397-88e4-37d2dfab4a31: Claiming unknown
Feb 20 09:59:59 np0005625204.localdomain systemd-udevd[321530]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 09:59:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:59.322 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:59.330 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55965a8332c94f2da5d707adc081ab9c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b15549-1f91-4b7d-aa8a-1aa5d439e964, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=da8c9dd1-5a21-4397-88e4-37d2dfab4a31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 09:59:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:59.333 162652 INFO neutron.agent.ovn.metadata.agent [-] Port da8c9dd1-5a21-4397-88e4-37d2dfab4a31 in datapath 3615f6b8-3945-4c93-ab04-14a8ee32065e bound to our chassis
Feb 20 09:59:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:59.336 162652 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3615f6b8-3945-4c93-ab04-14a8ee32065e or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 20 09:59:59 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 09:59:59.337 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d50e37c5-caa2-413c-a2df-161412833f45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:59Z|00401|binding|INFO|Setting lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 ovn-installed in OVS
Feb 20 09:59:59 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T09:59:59Z|00402|binding|INFO|Setting lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 up in Southbound
Feb 20 09:59:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:59.360 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:59.361 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain ceph-mon[301857]: pgmap v476: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 76 KiB/s wr, 80 op/s
Feb 20 09:59:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 09:59:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 09:59:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:59.398 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 09:59:59 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tapda8c9dd1-5a: No such device
Feb 20 09:59:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 09:59:59.433 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:00 np0005625204.localdomain podman[321600]: 
Feb 20 10:00:00 np0005625204.localdomain podman[321600]: 2026-02-20 10:00:00.377372991 +0000 UTC m=+0.095375987 container create ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 10:00:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 20 10:00:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 20 10:00:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2891047774' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:00 np0005625204.localdomain ceph-mon[301857]: overall HEALTH_OK
Feb 20 10:00:00 np0005625204.localdomain podman[321600]: 2026-02-20 10:00:00.329356108 +0000 UTC m=+0.047359134 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 10:00:00 np0005625204.localdomain systemd[1]: Started libpod-conmon-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a.scope.
Feb 20 10:00:00 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 10:00:00 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/141d473fcfd8681de9cd6501789f09d5aafaf3822466c87fc964c4d88a2476a2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 10:00:00 np0005625204.localdomain podman[321600]: 2026-02-20 10:00:00.459864045 +0000 UTC m=+0.177867041 container init ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 10:00:00 np0005625204.localdomain podman[321600]: 2026-02-20 10:00:00.470454058 +0000 UTC m=+0.188457054 container start ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 20 10:00:00 np0005625204.localdomain dnsmasq[321618]: started, version 2.85 cachesize 150
Feb 20 10:00:00 np0005625204.localdomain dnsmasq[321618]: DNS service limited to local subnets
Feb 20 10:00:00 np0005625204.localdomain dnsmasq[321618]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 10:00:00 np0005625204.localdomain dnsmasq[321618]: warning: no upstream servers configured
Feb 20 10:00:00 np0005625204.localdomain dnsmasq-dhcp[321618]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 10:00:00 np0005625204.localdomain dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 0 addresses
Feb 20 10:00:00 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host
Feb 20 10:00:00 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts
Feb 20 10:00:00 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:00.583 264355 INFO neutron.agent.dhcp.agent [None req-dd6ae2f9-123b-49a7-803c-a7ded59e57e7 - - - - - -] DHCP configuration for ports {'74fd2a4a-9b9e-4edb-b114-d1121b443c64'} is completed
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:01 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "snap_name": "25515e00-8c31-4609-a265-84e19f94da1a", "format": "json"}]: dispatch
Feb 20 10:00:01 np0005625204.localdomain ceph-mon[301857]: pgmap v477: 177 pgs: 177 active+clean; 280 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 54 KiB/s rd, 76 KiB/s wr, 81 op/s
Feb 20 10:00:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1697830735' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.740 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.740 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.741 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.742 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:00:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:01.760 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/738363458' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/738363458' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1177894248' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.185 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.253 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.254 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/738363458' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/738363458' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1177894248' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 10:00:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.492 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.494 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11250MB free_disk=41.70030212402344GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.494 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.495 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.573 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.574 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.574 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:00:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:02.616 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:00:02 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:02.682 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:00:02Z, description=, device_id=06de8864-90cd-41d9-8a7d-9e83a5e36d4c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5818970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5818220>], id=f42f7f41-995c-4a84-bc94-975a54360372, ip_allocation=immediate, mac_address=fa:16:3e:9e:1f:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:59:57Z, description=, dns_domain=, id=3615f6b8-3945-4c93-ab04-14a8ee32065e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1823604132-network, port_security_enabled=True, project_id=55965a8332c94f2da5d707adc081ab9c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36031, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3267, status=ACTIVE, subnets=['336a109d-999b-46ec-9331-c83fb6320087'], tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T09:59:57Z, vlan_transparent=None, network_id=3615f6b8-3945-4c93-ab04-14a8ee32065e, port_security_enabled=False, project_id=55965a8332c94f2da5d707adc081ab9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3279, status=DOWN, tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T10:00:02Z on network 3615f6b8-3945-4c93-ab04-14a8ee32065e
Feb 20 10:00:02 np0005625204.localdomain dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 1 addresses
Feb 20 10:00:02 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host
Feb 20 10:00:02 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts
Feb 20 10:00:02 np0005625204.localdomain podman[321678]: 2026-02-20 10:00:02.914487202 +0000 UTC m=+0.060265418 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 10:00:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:02.945 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1389774980' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:03.048 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:00:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:03.055 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:00:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:03.072 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:00:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:03.075 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:00:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:03.075 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:00:03 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:03.139 264355 INFO neutron.agent.dhcp.agent [None req-77ba4fff-74cb-401d-a84c-fa0d92a6d99c - - - - - -] DHCP configuration for ports {'f42f7f41-995c-4a84-bc94-975a54360372'} is completed
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f56cf5ff-1882-43b3-8a1d-c3448d693134", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f56cf5ff-1882-43b3-8a1d-c3448d693134", "format": "json"}]: dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: pgmap v478: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 115 KiB/s wr, 83 op/s
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "auth_id": "tempest-cephx-id-622295165", "format": "json"}]: dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-622295165", "format": "json"} : dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-622295165"} : dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-622295165"}]': finished
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "auth_id": "tempest-cephx-id-622295165", "format": "json"}]: dispatch
Feb 20 10:00:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1389774980' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:00:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:04.078 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:04.080 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:04 np0005625204.localdomain podman[321700]: 2026-02-20 10:00:04.149885292 +0000 UTC m=+0.085610260 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 10:00:04 np0005625204.localdomain podman[321700]: 2026-02-20 10:00:04.165141956 +0000 UTC m=+0.100866904 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 10:00:04 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:00:04 np0005625204.localdomain ceph-mon[301857]: pgmap v479: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 73 KiB/s wr, 48 op/s
Feb 20 10:00:05 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:05.022 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:00:02Z, description=, device_id=06de8864-90cd-41d9-8a7d-9e83a5e36d4c, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5861820>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5861250>], id=f42f7f41-995c-4a84-bc94-975a54360372, ip_allocation=immediate, mac_address=fa:16:3e:9e:1f:34, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T09:59:57Z, description=, dns_domain=, id=3615f6b8-3945-4c93-ab04-14a8ee32065e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesSnapshotTestJSON-1823604132-network, port_security_enabled=True, project_id=55965a8332c94f2da5d707adc081ab9c, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=36031, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3267, status=ACTIVE, subnets=['336a109d-999b-46ec-9331-c83fb6320087'], tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T09:59:57Z, vlan_transparent=None, network_id=3615f6b8-3945-4c93-ab04-14a8ee32065e, port_security_enabled=False, project_id=55965a8332c94f2da5d707adc081ab9c, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3279, status=DOWN, tags=[], tenant_id=55965a8332c94f2da5d707adc081ab9c, updated_at=2026-02-20T10:00:02Z on network 3615f6b8-3945-4c93-ab04-14a8ee32065e
Feb 20 10:00:05 np0005625204.localdomain podman[321739]: 2026-02-20 10:00:05.241126679 +0000 UTC m=+0.060340859 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 20 10:00:05 np0005625204.localdomain dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 1 addresses
Feb 20 10:00:05 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host
Feb 20 10:00:05 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts
Feb 20 10:00:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:00:05 np0005625204.localdomain podman[321752]: 2026-02-20 10:00:05.345478079 +0000 UTC m=+0.078075790 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git)
Feb 20 10:00:05 np0005625204.localdomain podman[321752]: 2026-02-20 10:00:05.389090777 +0000 UTC m=+0.121688478 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, vcs-type=git, build-date=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=9.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 20 10:00:05 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:00:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "34a9640c-9d47-4ee3-b8f2-22b14cf6ac88", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:00:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "34a9640c-9d47-4ee3-b8f2-22b14cf6ac88", "format": "json"}]: dispatch
Feb 20 10:00:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:05 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:05.569 264355 INFO neutron.agent.dhcp.agent [None req-b4114ec0-08bd-4e83-86c0-406fd7915555 - - - - - -] DHCP configuration for ports {'f42f7f41-995c-4a84-bc94-975a54360372'} is completed
Feb 20 10:00:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:05.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:05.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:00:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:06.022 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:00:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:00:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:00:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:06.328 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: pgmap v480: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 104 KiB/s wr, 52 op/s
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "snap_name": "25515e00-8c31-4609-a265-84e19f94da1a_bb185c5e-7bce-4b96-b50a-2749adcb4cc3", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "snap_name": "25515e00-8c31-4609-a265-84e19f94da1a", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:06.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:06.718 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1240294107' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.820 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.820 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.820 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:00:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:07.821 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:00:08 np0005625204.localdomain sshd[321777]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:00:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:08.585 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:00:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:08.607 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:00:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:08.607 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:00:08 np0005625204.localdomain ceph-mon[301857]: pgmap v481: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 70 KiB/s wr, 9 op/s
Feb 20 10:00:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3814731526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/373438564' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e218 e218: 6 total, 6 up, 6 in
Feb 20 10:00:09 np0005625204.localdomain sshd[321777]: Invalid user claude from 196.189.116.182 port 46220
Feb 20 10:00:09 np0005625204.localdomain sshd[321777]: Received disconnect from 196.189.116.182 port 46220:11: Bye Bye [preauth]
Feb 20 10:00:09 np0005625204.localdomain sshd[321777]: Disconnected from invalid user claude 196.189.116.182 port 46220 [preauth]
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e219 e219: 6 total, 6 up, 6 in
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f56cf5ff-1882-43b3-8a1d-c3448d693134", "format": "json"}]: dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f56cf5ff-1882-43b3-8a1d-c3448d693134", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: osdmap e218: 6 total, 6 up, 6 in
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3827509993' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "auth_id": "admin", "tenant_id": "8fac2513a3ab4162a13f560c6301f671", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "format": "json"}]: dispatch
Feb 20 10:00:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c382485c-3d04-46af-88c4-8eb360a0c45a", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:10 np0005625204.localdomain ceph-mon[301857]: pgmap v483: 177 pgs: 177 active+clean; 281 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 85 KiB/s wr, 12 op/s
Feb 20 10:00:10 np0005625204.localdomain ceph-mon[301857]: osdmap e219: 6 total, 6 up, 6 in
Feb 20 10:00:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e220 e220: 6 total, 6 up, 6 in
Feb 20 10:00:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:11.332 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:11.334 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:11.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:11.335 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:11.391 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:11.392 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:12 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1439580111' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:12 np0005625204.localdomain ceph-mon[301857]: osdmap e220: 6 total, 6 up, 6 in
Feb 20 10:00:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:00:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:00:13 np0005625204.localdomain ceph-mon[301857]: pgmap v486: 177 pgs: 177 active+clean; 282 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 132 KiB/s wr, 51 op/s
Feb 20 10:00:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "34a9640c-9d47-4ee3-b8f2-22b14cf6ac88", "format": "json"}]: dispatch
Feb 20 10:00:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "34a9640c-9d47-4ee3-b8f2-22b14cf6ac88", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3101837055' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:13 np0005625204.localdomain systemd[1]: tmp-crun.NSrdvE.mount: Deactivated successfully.
Feb 20 10:00:13 np0005625204.localdomain podman[321780]: 2026-02-20 10:00:13.164816868 +0000 UTC m=+0.094315485 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:00:13 np0005625204.localdomain podman[321780]: 2026-02-20 10:00:13.197284498 +0000 UTC m=+0.126783145 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:00:13 np0005625204.localdomain systemd[1]: tmp-crun.9yztSu.mount: Deactivated successfully.
Feb 20 10:00:13 np0005625204.localdomain podman[321779]: 2026-02-20 10:00:13.206049195 +0000 UTC m=+0.138831792 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 10:00:13 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:00:13 np0005625204.localdomain podman[321779]: 2026-02-20 10:00:13.266281629 +0000 UTC m=+0.199064266 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:00:13 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e221 e221: 6 total, 6 up, 6 in
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b9bf80c2-3fd5-47b7-9223-4a556ce7ef61", "format": "json"}]: dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b9bf80c2-3fd5-47b7-9223-4a556ce7ef61", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "auth_id": "david", "tenant_id": "8fac2513a3ab4162a13f560c6301f671", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/7f87993f-62fd-4706-b657-9586f12f2a62/5d470986-550e-47dc-98aa-07cb4fcc93dc", "osd", "allow rw pool=manila_data namespace=fsvolumens_7f87993f-62fd-4706-b657-9586f12f2a62", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/7f87993f-62fd-4706-b657-9586f12f2a62/5d470986-550e-47dc-98aa-07cb4fcc93dc", "osd", "allow rw pool=manila_data namespace=fsvolumens_7f87993f-62fd-4706-b657-9586f12f2a62", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:14 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:15 np0005625204.localdomain ceph-mon[301857]: pgmap v487: 177 pgs: 177 active+clean; 282 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 132 KiB/s wr, 51 op/s
Feb 20 10:00:15 np0005625204.localdomain ceph-mon[301857]: osdmap e221: 6 total, 6 up, 6 in
Feb 20 10:00:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e222 e222: 6 total, 6 up, 6 in
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: osdmap e222: 6 total, 6 up, 6 in
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:16.393 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:16.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:16.422 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:16 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:16.423 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:17 np0005625204.localdomain ceph-mon[301857]: pgmap v490: 177 pgs: 177 active+clean; 546 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 152 KiB/s rd, 45 MiB/s wr, 250 op/s
Feb 20 10:00:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:00:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:00:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "format": "json"}]: dispatch
Feb 20 10:00:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e223 e223: 6 total, 6 up, 6 in
Feb 20 10:00:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:00:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:00:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:00:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 20 10:00:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:00:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18836 "" "Go-http-client/1.1"
Feb 20 10:00:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e224 e224: 6 total, 6 up, 6 in
Feb 20 10:00:18 np0005625204.localdomain ceph-mon[301857]: pgmap v492: 177 pgs: 177 active+clean; 546 MiB data, 1.9 GiB used, 40 GiB / 42 GiB avail; 125 KiB/s rd, 44 MiB/s wr, 198 op/s
Feb 20 10:00:18 np0005625204.localdomain ceph-mon[301857]: osdmap e223: 6 total, 6 up, 6 in
Feb 20 10:00:18 np0005625204.localdomain ceph-mon[301857]: osdmap e224: 6 total, 6 up, 6 in
Feb 20 10:00:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a251c3bc-737c-4438-9523-36041c19a61e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a251c3bc-737c-4438-9523-36041c19a61e", "format": "json"}]: dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/675717513' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:19 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 20 10:00:20 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e225 e225: 6 total, 6 up, 6 in
Feb 20 10:00:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "auth_id": "david", "tenant_id": "f656f9df86ae4c53b02f471da5bd5ad7", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:20 np0005625204.localdomain ceph-mon[301857]: pgmap v494: 177 pgs: 177 active+clean; 610 MiB data, 2.1 GiB used, 40 GiB / 42 GiB avail; 135 KiB/s rd, 59 MiB/s wr, 218 op/s
Feb 20 10:00:20 np0005625204.localdomain ceph-mon[301857]: osdmap e225: 6 total, 6 up, 6 in
Feb 20 10:00:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:00:21 np0005625204.localdomain podman[321822]: 2026-02-20 10:00:21.154988613 +0000 UTC m=+0.086102445 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 20 10:00:21 np0005625204.localdomain sshd[321837]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:00:21 np0005625204.localdomain podman[321822]: 2026-02-20 10:00:21.191206087 +0000 UTC m=+0.122319919 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 20 10:00:21 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:00:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:21.424 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:21.426 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:21.426 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:21.426 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:21.441 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:21.442 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:21 np0005625204.localdomain sshd[321837]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:00:22 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e226 e226: 6 total, 6 up, 6 in
Feb 20 10:00:23 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: pgmap v496: 177 pgs: 177 active+clean; 935 MiB data, 3.1 GiB used, 39 GiB / 42 GiB avail; 110 KiB/s rd, 65 MiB/s wr, 195 op/s
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: osdmap e226: 6 total, 6 up, 6 in
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 10:00:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e227 e227: 6 total, 6 up, 6 in
Feb 20 10:00:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:00:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "auth_id": "david", "format": "json"}]: dispatch
Feb 20 10:00:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "auth_id": "david", "format": "json"}]: dispatch
Feb 20 10:00:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a251c3bc-737c-4438-9523-36041c19a61e", "format": "json"}]: dispatch
Feb 20 10:00:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a251c3bc-737c-4438-9523-36041c19a61e", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:24 np0005625204.localdomain ceph-mon[301857]: osdmap e227: 6 total, 6 up, 6 in
Feb 20 10:00:25 np0005625204.localdomain ceph-mon[301857]: pgmap v499: 177 pgs: 177 active+clean; 935 MiB data, 3.1 GiB used, 39 GiB / 42 GiB avail; 119 KiB/s rd, 70 MiB/s wr, 210 op/s
Feb 20 10:00:25 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 10:00:25 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e228 e228: 6 total, 6 up, 6 in
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: osdmap e228: 6 total, 6 up, 6 in
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Feb 20 10:00:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:26.443 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:26.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:26.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:26.444 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:26.445 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:26 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:26.448 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:00:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:00:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:00:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:00:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:00:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:00:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e229 e229: 6 total, 6 up, 6 in
Feb 20 10:00:27 np0005625204.localdomain ceph-mon[301857]: pgmap v501: 177 pgs: 177 active+clean; 1.3 GiB data, 4.2 GiB used, 38 GiB / 42 GiB avail; 156 KiB/s rd, 75 MiB/s wr, 254 op/s
Feb 20 10:00:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "auth_id": "david", "format": "json"}]: dispatch
Feb 20 10:00:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "auth_id": "david", "format": "json"}]: dispatch
Feb 20 10:00:27 np0005625204.localdomain ceph-mon[301857]: osdmap e229: 6 total, 6 up, 6 in
Feb 20 10:00:28 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e230 e230: 6 total, 6 up, 6 in
Feb 20 10:00:29 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:00:29 np0005625204.localdomain ceph-mon[301857]: pgmap v503: 177 pgs: 177 active+clean; 1.3 GiB data, 4.2 GiB used, 38 GiB / 42 GiB avail; 137 KiB/s rd, 66 MiB/s wr, 224 op/s
Feb 20 10:00:29 np0005625204.localdomain ceph-mon[301857]: osdmap e230: 6 total, 6 up, 6 in
Feb 20 10:00:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 10:00:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 10:00:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 10:00:29 np0005625204.localdomain podman[321844]: 2026-02-20 10:00:29.185629602 +0000 UTC m=+0.120094430 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 10:00:29 np0005625204.localdomain podman[321844]: 2026-02-20 10:00:29.21905801 +0000 UTC m=+0.153522828 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:00:29 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:00:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 10:00:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 10:00:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "format": "json"}]: dispatch
Feb 20 10:00:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7f44da3d-55b0-4b00-94b4-a7254a3d21a8", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:30 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e231 e231: 6 total, 6 up, 6 in
Feb 20 10:00:31 np0005625204.localdomain ceph-mon[301857]: pgmap v505: 177 pgs: 177 active+clean; 1.1 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 135 KiB/s rd, 62 MiB/s wr, 234 op/s
Feb 20 10:00:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "908a48ac-0297-49f9-ba67-7193ff5e6e97", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:00:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "908a48ac-0297-49f9-ba67-7193ff5e6e97", "format": "json"}]: dispatch
Feb 20 10:00:31 np0005625204.localdomain ceph-mon[301857]: osdmap e231: 6 total, 6 up, 6 in
Feb 20 10:00:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:31.450 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:31.450 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:31.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:31 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:31.451 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:32 np0005625204.localdomain neutron_sriov_agent[257177]: 2026-02-20 10:00:32.128 2 INFO neutron.agent.securitygroups_rpc [req-dff2b32a-81fc-4277-8af6-ada27919a489 req-cd96c648-ee5c-4ce2-9c39-be0fe03b42e9 2ba1a8d771344f0a918e0a8bed2efd06 9fdf2c09b98d48c0bc67cc1c7702a8f4 - - default default] Security group member updated ['9d889f17-f220-427e-bd61-2fb67b868596']
Feb 20 10:00:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:32.161 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:32.162 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 10:00:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:33 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e232 e232: 6 total, 6 up, 6 in
Feb 20 10:00:33 np0005625204.localdomain ceph-mon[301857]: pgmap v507: 177 pgs: 177 active+clean; 283 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 94 KiB/s rd, 89 KiB/s wr, 181 op/s
Feb 20 10:00:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 10:00:33 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "format": "json"}]: dispatch
Feb 20 10:00:33 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4233530869' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:00:34 np0005625204.localdomain sshd[321865]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:00:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2e5c48d9-dcb2-469b-91b0-c7f808d95c49", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "908a48ac-0297-49f9-ba67-7193ff5e6e97", "format": "json"}]: dispatch
Feb 20 10:00:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "908a48ac-0297-49f9-ba67-7193ff5e6e97", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:34 np0005625204.localdomain ceph-mon[301857]: pgmap v508: 177 pgs: 177 active+clean; 283 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 83 KiB/s rd, 79 KiB/s wr, 160 op/s
Feb 20 10:00:34 np0005625204.localdomain ceph-mon[301857]: osdmap e232: 6 total, 6 up, 6 in
Feb 20 10:00:34 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e233 e233: 6 total, 6 up, 6 in
Feb 20 10:00:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:00:35 np0005625204.localdomain podman[321867]: 2026-02-20 10:00:35.129052687 +0000 UTC m=+0.067729845 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 10:00:35 np0005625204.localdomain podman[321867]: 2026-02-20 10:00:35.138876097 +0000 UTC m=+0.077553275 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 10:00:35 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:00:35 np0005625204.localdomain sshd[321865]: Invalid user claude from 86.99.116.54 port 39858
Feb 20 10:00:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:00:35 np0005625204.localdomain systemd[1]: tmp-crun.PmSZuu.mount: Deactivated successfully.
Feb 20 10:00:35 np0005625204.localdomain podman[321890]: 2026-02-20 10:00:35.5272588 +0000 UTC m=+0.085628760 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 20 10:00:35 np0005625204.localdomain podman[321890]: 2026-02-20 10:00:35.539228634 +0000 UTC m=+0.097598544 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, release=1770267347, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Feb 20 10:00:35 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:00:35 np0005625204.localdomain sshd[321865]: Received disconnect from 86.99.116.54 port 39858:11: Bye Bye [preauth]
Feb 20 10:00:35 np0005625204.localdomain sshd[321865]: Disconnected from invalid user claude 86.99.116.54 port 39858 [preauth]
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: mgrmap e51: np0005625202.arwxwo(active, since 12m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: osdmap e233: 6 total, 6 up, 6 in
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice", "format": "json"}]: dispatch
Feb 20 10:00:35 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e234 e234: 6 total, 6 up, 6 in
Feb 20 10:00:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "format": "json"}]: dispatch
Feb 20 10:00:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "52918c2e-6ed5-45c2-9872-88b3bd77010f", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:36 np0005625204.localdomain ceph-mon[301857]: pgmap v511: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 144 KiB/s rd, 96 KiB/s wr, 246 op/s
Feb 20 10:00:36 np0005625204.localdomain ceph-mon[301857]: osdmap e234: 6 total, 6 up, 6 in
Feb 20 10:00:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e235 e235: 6 total, 6 up, 6 in
Feb 20 10:00:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:37.165 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:37.167 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:37.167 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:37.168 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:37.189 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:37.190 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/711179978' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/711179978' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/660812735' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/660812735' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: osdmap e235: 6 total, 6 up, 6 in
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/711179978' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/711179978' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/660812735' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:37 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/660812735' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:38 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e236 e236: 6 total, 6 up, 6 in
Feb 20 10:00:39 np0005625204.localdomain ceph-mon[301857]: pgmap v514: 177 pgs: 177 active+clean; 204 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 109 KiB/s wr, 136 op/s
Feb 20 10:00:39 np0005625204.localdomain ceph-mon[301857]: osdmap e236: 6 total, 6 up, 6 in
Feb 20 10:00:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e237 e237: 6 total, 6 up, 6 in
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "auth_id": "admin", "format": "json"}]: dispatch
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: osdmap e237: 6 total, 6 up, 6 in
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2569557107' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:00:40 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2569557107' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:41 np0005625204.localdomain sshd[321912]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:00:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "format": "json"}]: dispatch
Feb 20 10:00:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7f87993f-62fd-4706-b657-9586f12f2a62", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:41 np0005625204.localdomain ceph-mon[301857]: pgmap v517: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 3.2 KiB/s rd, 94 KiB/s wr, 15 op/s
Feb 20 10:00:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2569557107' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2569557107' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:41 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:00:41Z|00403|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:00:41 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:41.442 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 10:00:42 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 10:00:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:42.190 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:42.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:42 np0005625204.localdomain sshd[321912]: Invalid user max from 188.166.218.64 port 58878
Feb 20 10:00:42 np0005625204.localdomain sshd[321912]: Received disconnect from 188.166.218.64 port 58878:11: Bye Bye [preauth]
Feb 20 10:00:42 np0005625204.localdomain sshd[321912]: Disconnected from invalid user max 188.166.218.64 port 58878 [preauth]
Feb 20 10:00:43 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e238 e238: 6 total, 6 up, 6 in
Feb 20 10:00:43 np0005625204.localdomain ceph-mon[301857]: pgmap v518: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 102 KiB/s rd, 72 KiB/s wr, 144 op/s
Feb 20 10:00:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:43 np0005625204.localdomain ceph-mon[301857]: osdmap e238: 6 total, 6 up, 6 in
Feb 20 10:00:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:00:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:00:44 np0005625204.localdomain systemd[1]: tmp-crun.SO1pgj.mount: Deactivated successfully.
Feb 20 10:00:44 np0005625204.localdomain podman[321915]: 2026-02-20 10:00:44.172419011 +0000 UTC m=+0.104789585 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true)
Feb 20 10:00:44 np0005625204.localdomain podman[321915]: 2026-02-20 10:00:44.182035973 +0000 UTC m=+0.114406547 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 20 10:00:44 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:00:44 np0005625204.localdomain podman[321914]: 2026-02-20 10:00:44.270748136 +0000 UTC m=+0.205191463 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 20 10:00:44 np0005625204.localdomain podman[321914]: 2026-02-20 10:00:44.341711488 +0000 UTC m=+0.276154855 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 10:00:44 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:00:45 np0005625204.localdomain ceph-mon[301857]: pgmap v520: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 100 KiB/s rd, 70 KiB/s wr, 141 op/s
Feb 20 10:00:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:45.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:00:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:45.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 10:00:46 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 10:00:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2664060444' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2664060444' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:47.193 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:47.195 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:00:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:47.195 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:00:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:47.195 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:47.196 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:47.196 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:00:47 np0005625204.localdomain ceph-mon[301857]: pgmap v521: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 119 KiB/s rd, 94 KiB/s wr, 170 op/s
Feb 20 10:00:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e239 e239: 6 total, 6 up, 6 in
Feb 20 10:00:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:00:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:00:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:00:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157081 "" "Go-http-client/1.1"
Feb 20 10:00:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:00:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18826 "" "Go-http-client/1.1"
Feb 20 10:00:47 np0005625204.localdomain sudo[321958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:00:47 np0005625204.localdomain sudo[321958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:00:47 np0005625204.localdomain sudo[321958]: pam_unix(sudo:session): session closed for user root
Feb 20 10:00:47 np0005625204.localdomain sudo[321976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:00:47 np0005625204.localdomain sudo[321976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3370685851' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3370685851' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: osdmap e239: 6 total, 6 up, 6 in
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3370685851' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3370685851' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 20 10:00:48 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 20 10:00:48 np0005625204.localdomain sudo[321976]: pam_unix(sudo:session): session closed for user root
Feb 20 10:00:49 np0005625204.localdomain sudo[322026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:00:49 np0005625204.localdomain sudo[322026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:00:49 np0005625204.localdomain sudo[322026]: pam_unix(sudo:session): session closed for user root
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: pgmap v523: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 111 KiB/s rd, 41 KiB/s wr, 153 op/s
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "65f6f73f-d610-45f7-b23e-11db4d12fdda", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:00:49 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:00:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "65f6f73f-d610-45f7-b23e-11db4d12fdda", "format": "json"}]: dispatch
Feb 20 10:00:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3919460560' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:00:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3919460560' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:00:50 np0005625204.localdomain dnsmasq[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/addn_hosts - 0 addresses
Feb 20 10:00:50 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/host
Feb 20 10:00:50 np0005625204.localdomain dnsmasq-dhcp[321618]: read /var/lib/neutron/dhcp/3615f6b8-3945-4c93-ab04-14a8ee32065e/opts
Feb 20 10:00:50 np0005625204.localdomain podman[322061]: 2026-02-20 10:00:50.650955859 +0000 UTC m=+0.071827439 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:00:50 np0005625204.localdomain systemd[1]: tmp-crun.InfOxR.mount: Deactivated successfully.
Feb 20 10:00:50 np0005625204.localdomain kernel: device tapda8c9dd1-5a left promiscuous mode
Feb 20 10:00:50 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:00:50Z|00404|binding|INFO|Releasing lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 from this chassis (sb_readonly=0)
Feb 20 10:00:50 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:00:50Z|00405|binding|INFO|Setting lport da8c9dd1-5a21-4397-88e4-37d2dfab4a31 down in Southbound
Feb 20 10:00:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:50.890 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:50 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:50.898 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3615f6b8-3945-4c93-ab04-14a8ee32065e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '55965a8332c94f2da5d707adc081ab9c', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6b15549-1f91-4b7d-aa8a-1aa5d439e964, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=da8c9dd1-5a21-4397-88e4-37d2dfab4a31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:00:50 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:50.902 162652 INFO neutron.agent.ovn.metadata.agent [-] Port da8c9dd1-5a21-4397-88e4-37d2dfab4a31 in datapath 3615f6b8-3945-4c93-ab04-14a8ee32065e unbound from our chassis
Feb 20 10:00:50 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:50.905 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3615f6b8-3945-4c93-ab04-14a8ee32065e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:00:50 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:00:50.907 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[3b068111-fbec-4c9f-a72b-0de63a4775a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:00:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:50.916 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:51 np0005625204.localdomain ceph-mon[301857]: pgmap v524: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 76 KiB/s wr, 62 op/s
Feb 20 10:00:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:00:52 np0005625204.localdomain podman[322084]: 2026-02-20 10:00:52.165925857 +0000 UTC m=+0.100315388 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 20 10:00:52 np0005625204.localdomain podman[322084]: 2026-02-20 10:00:52.179951194 +0000 UTC m=+0.114340755 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 20 10:00:52 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:00:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:52.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:52 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:00:52Z|00406|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:00:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:52.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e240 e240: 6 total, 6 up, 6 in
Feb 20 10:00:53 np0005625204.localdomain dnsmasq[321618]: exiting on receipt of SIGTERM
Feb 20 10:00:53 np0005625204.localdomain systemd[1]: tmp-crun.dgjJm7.mount: Deactivated successfully.
Feb 20 10:00:53 np0005625204.localdomain podman[322121]: 2026-02-20 10:00:53.379802431 +0000 UTC m=+0.078256415 container kill ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:00:53 np0005625204.localdomain systemd[1]: libpod-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a.scope: Deactivated successfully.
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: pgmap v525: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 91 KiB/s rd, 74 KiB/s wr, 129 op/s
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: osdmap e240: 6 total, 6 up, 6 in
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.412796) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653412848, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2908, "num_deletes": 273, "total_data_size": 4087565, "memory_usage": 4222976, "flush_reason": "Manual Compaction"}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653424790, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 2672399, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25896, "largest_seqno": 28799, "table_properties": {"data_size": 2660779, "index_size": 7293, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3269, "raw_key_size": 29325, "raw_average_key_size": 22, "raw_value_size": 2635779, "raw_average_value_size": 2044, "num_data_blocks": 311, "num_entries": 1289, "num_filter_entries": 1289, "num_deletions": 273, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581543, "oldest_key_time": 1771581543, "file_creation_time": 1771581653, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 12047 microseconds, and 6083 cpu microseconds.
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.424845) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 2672399 bytes OK
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.424871) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.426746) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.426775) EVENT_LOG_v1 {"time_micros": 1771581653426768, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.426801) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 4073584, prev total WAL file size 4073584, number of live WAL files 2.
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.427582) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(2609KB)], [39(18MB)]
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653427646, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 21603699, "oldest_snapshot_seqno": -1}
Feb 20 10:00:53 np0005625204.localdomain podman[322134]: 2026-02-20 10:00:53.467586116 +0000 UTC m=+0.069421846 container died ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 13874 keys, 19893963 bytes, temperature: kUnknown
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653519810, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 19893963, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19810293, "index_size": 47921, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34693, "raw_key_size": 369904, "raw_average_key_size": 26, "raw_value_size": 19570192, "raw_average_value_size": 1410, "num_data_blocks": 1820, "num_entries": 13874, "num_filter_entries": 13874, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581653, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.520437) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 19893963 bytes
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.522835) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 233.9 rd, 215.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 18.1 +0.0 blob) out(19.0 +0.0 blob), read-write-amplify(15.5) write-amplify(7.4) OK, records in: 14430, records dropped: 556 output_compression: NoCompression
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.522877) EVENT_LOG_v1 {"time_micros": 1771581653522860, "job": 22, "event": "compaction_finished", "compaction_time_micros": 92371, "compaction_time_cpu_micros": 52560, "output_level": 6, "num_output_files": 1, "total_output_size": 19893963, "num_input_records": 14430, "num_output_records": 13874, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653523893, "job": 22, "event": "table_file_deletion", "file_number": 41}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581653528030, "job": 22, "event": "table_file_deletion", "file_number": 39}
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.427477) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528183) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528194) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:00:53 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:00:53.528205) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:00:53 np0005625204.localdomain podman[322134]: 2026-02-20 10:00:53.591167402 +0000 UTC m=+0.193003092 container cleanup ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:00:53 np0005625204.localdomain systemd[1]: libpod-conmon-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a.scope: Deactivated successfully.
Feb 20 10:00:53 np0005625204.localdomain podman[322136]: 2026-02-20 10:00:53.614371479 +0000 UTC m=+0.203972026 container remove ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3615f6b8-3945-4c93-ab04-14a8ee32065e, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 10:00:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:53.640 264355 INFO neutron.agent.dhcp.agent [None req-56cd3450-5a3d-4da5-ac04-6a75b95f3e88 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:00:53 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:00:53.641 264355 INFO neutron.agent.dhcp.agent [None req-56cd3450-5a3d-4da5-ac04-6a75b95f3e88 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:00:54 np0005625204.localdomain systemd[1]: tmp-crun.AAus7f.mount: Deactivated successfully.
Feb 20 10:00:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-141d473fcfd8681de9cd6501789f09d5aafaf3822466c87fc964c4d88a2476a2-merged.mount: Deactivated successfully.
Feb 20 10:00:54 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea5ffec0d360cee29b294c889d02b3ba6d014bf65380688286ce6dbb32f08a4a-userdata-shm.mount: Deactivated successfully.
Feb 20 10:00:54 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d3615f6b8\x2d3945\x2d4c93\x2dab04\x2d14a8ee32065e.mount: Deactivated successfully.
Feb 20 10:00:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "65f6f73f-d610-45f7-b23e-11db4d12fdda", "format": "json"}]: dispatch
Feb 20 10:00:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "65f6f73f-d610-45f7-b23e-11db4d12fdda", "force": true, "format": "json"}]: dispatch
Feb 20 10:00:54 np0005625204.localdomain ceph-mon[301857]: pgmap v527: 177 pgs: 177 active+clean; 205 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 60 KiB/s rd, 43 KiB/s wr, 83 op/s
Feb 20 10:00:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:00:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:00:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 10:00:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 10:00:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:00:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:00:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:00:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:00:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:00:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:00:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:00:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:00:56 np0005625204.localdomain sshd[322163]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:00:57 np0005625204.localdomain ceph-mon[301857]: pgmap v528: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 68 KiB/s wr, 83 op/s
Feb 20 10:00:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:57.277 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:00:57 np0005625204.localdomain sshd[322163]: Invalid user sol from 45.148.10.240 port 56354
Feb 20 10:00:57 np0005625204.localdomain sshd[322163]: Connection closed by invalid user sol 45.148.10.240 port 56354 [preauth]
Feb 20 10:00:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e241 e241: 6 total, 6 up, 6 in
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: pgmap v529: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 57 KiB/s wr, 69 op/s
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: osdmap e241: 6 total, 6 up, 6 in
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "r", "format": "json"}]: dispatch
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow r pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:00:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e242 e242: 6 total, 6 up, 6 in
Feb 20 10:00:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:00:59.738 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:01:00 np0005625204.localdomain podman[322165]: 2026-02-20 10:01:00.15337282 +0000 UTC m=+0.081612558 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:01:00 np0005625204.localdomain podman[322165]: 2026-02-20 10:01:00.166427408 +0000 UTC m=+0.094667206 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 10:01:00 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:01:00 np0005625204.localdomain ceph-mon[301857]: osdmap e242: 6 total, 6 up, 6 in
Feb 20 10:01:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e243 e243: 6 total, 6 up, 6 in
Feb 20 10:01:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:00.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:01 np0005625204.localdomain CROND[322190]: (root) CMD (run-parts /etc/cron.hourly)
Feb 20 10:01:01 np0005625204.localdomain run-parts[322193]: (/etc/cron.hourly) starting 0anacron
Feb 20 10:01:01 np0005625204.localdomain run-parts[322199]: (/etc/cron.hourly) finished 0anacron
Feb 20 10:01:01 np0005625204.localdomain CROND[322189]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 20 10:01:01 np0005625204.localdomain ceph-mon[301857]: pgmap v532: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 156 B/s rd, 89 KiB/s wr, 8 op/s
Feb 20 10:01:01 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:01 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "format": "json"}]: dispatch
Feb 20 10:01:01 np0005625204.localdomain ceph-mon[301857]: osdmap e243: 6 total, 6 up, 6 in
Feb 20 10:01:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3769969177' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e243 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1908739206' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1908739206' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.279 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.281 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.316 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.317 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2951262390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: pgmap v534: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 63 KiB/s wr, 39 op/s
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1908739206' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1908739206' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:01:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:01:02.490 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.492 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:01:02.493 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.746 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:01:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:02.746 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:01:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e244 e244: 6 total, 6 up, 6 in
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.103314) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663103411, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 466, "num_deletes": 250, "total_data_size": 413035, "memory_usage": 422512, "flush_reason": "Manual Compaction"}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663107976, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 270910, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28805, "largest_seqno": 29265, "table_properties": {"data_size": 268258, "index_size": 699, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 7400, "raw_average_key_size": 21, "raw_value_size": 262609, "raw_average_value_size": 746, "num_data_blocks": 31, "num_entries": 352, "num_filter_entries": 352, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581653, "oldest_key_time": 1771581653, "file_creation_time": 1771581663, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4696 microseconds, and 1741 cpu microseconds.
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.108024) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 270910 bytes OK
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.108047) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.109570) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.109590) EVENT_LOG_v1 {"time_micros": 1771581663109583, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.109615) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 410099, prev total WAL file size 410423, number of live WAL files 2.
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.111326) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303036' seq:72057594037927935, type:22 .. '6D6772737461740034323537' seq:0, type:0; will stop at (end)
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(264KB)], [42(18MB)]
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663111402, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 20164873, "oldest_snapshot_seqno": -1}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 13703 keys, 18046533 bytes, temperature: kUnknown
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663167157, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 18046533, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17968485, "index_size": 42683, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34309, "raw_key_size": 366697, "raw_average_key_size": 26, "raw_value_size": 17735848, "raw_average_value_size": 1294, "num_data_blocks": 1601, "num_entries": 13703, "num_filter_entries": 13703, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581663, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.167414) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 18046533 bytes
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.168597) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 360.9 rd, 323.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 19.0 +0.0 blob) out(17.2 +0.0 blob), read-write-amplify(141.0) write-amplify(66.6) OK, records in: 14226, records dropped: 523 output_compression: NoCompression
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.168612) EVENT_LOG_v1 {"time_micros": 1771581663168605, "job": 24, "event": "compaction_finished", "compaction_time_micros": 55872, "compaction_time_cpu_micros": 25635, "output_level": 6, "num_output_files": 1, "total_output_size": 18046533, "num_input_records": 14226, "num_output_records": 13703, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663168750, "job": 24, "event": "table_file_deletion", "file_number": 44}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581663170223, "job": 24, "event": "table_file_deletion", "file_number": 42}
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.111208) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170298) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170314) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:01:03.170317) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/884298509' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.257 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.308 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.308 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.511 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.512 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11235MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.512 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.513 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.569 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.570 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.570 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:01:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:03.615 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: osdmap e244: 6 total, 6 up, 6 in
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/884298509' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "snap_name": "ba024a11-8c4d-4adf-9f1b-141c894e0dc3", "format": "json"}]: dispatch
Feb 20 10:01:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:01:04 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2041852401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:04.123 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:01:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:04.131 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:01:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:04.148 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:01:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:04.151 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:01:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:04.151 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:01:04 np0005625204.localdomain ceph-mon[301857]: pgmap v536: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 69 KiB/s wr, 43 op/s
Feb 20 10:01:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2041852401' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:05.153 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:05.153 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e245 e245: 6 total, 6 up, 6 in
Feb 20 10:01:05 np0005625204.localdomain sshd[322244]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:01:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:01:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 20 10:01:05 np0005625204.localdomain ceph-mon[301857]: osdmap e245: 6 total, 6 up, 6 in
Feb 20 10:01:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:01:05 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:01:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:01:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:01:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:01:06.023 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:01:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:01:06.024 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:01:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:01:06 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:01:06 np0005625204.localdomain podman[322247]: 2026-02-20 10:01:06.135101219 +0000 UTC m=+0.073239912 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:01:06 np0005625204.localdomain podman[322247]: 2026-02-20 10:01:06.14495459 +0000 UTC m=+0.083093313 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 10:01:06 np0005625204.localdomain sshd[322244]: Received disconnect from 57.128.218.144 port 44056:11: Bye Bye [preauth]
Feb 20 10:01:06 np0005625204.localdomain sshd[322244]: Disconnected from authenticating user root 57.128.218.144 port 44056 [preauth]
Feb 20 10:01:06 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:01:06 np0005625204.localdomain podman[322246]: 2026-02-20 10:01:06.21091084 +0000 UTC m=+0.150706603 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.expose-services=, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, release=1770267347, container_name=openstack_network_exporter, version=9.7, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Feb 20 10:01:06 np0005625204.localdomain podman[322246]: 2026-02-20 10:01:06.227271307 +0000 UTC m=+0.167067080 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, architecture=x86_64, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 10:01:06 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:01:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e245 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:06.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:07 np0005625204.localdomain ceph-mon[301857]: pgmap v538: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 67 KiB/s rd, 51 KiB/s wr, 99 op/s
Feb 20 10:01:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e246 e246: 6 total, 6 up, 6 in
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.346 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.349 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.810 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.810 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:01:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:07.810 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:01:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e247 e247: 6 total, 6 up, 6 in
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.302 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:01:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "snap_name": "ba024a11-8c4d-4adf-9f1b-141c894e0dc3", "target_sub_name": "97e63579-f59d-4812-9af1-a8d227932ace", "format": "json"}]: dispatch
Feb 20 10:01:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "97e63579-f59d-4812-9af1-a8d227932ace", "format": "json"}]: dispatch
Feb 20 10:01:08 np0005625204.localdomain ceph-mon[301857]: osdmap e246: 6 total, 6 up, 6 in
Feb 20 10:01:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1804254351' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:08 np0005625204.localdomain ceph-mon[301857]: osdmap e247: 6 total, 6 up, 6 in
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.329 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.330 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.330 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.331 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.331 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.739 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.739 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 10:01:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:08.757 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 10:01:09 np0005625204.localdomain ceph-mon[301857]: pgmap v540: 177 pgs: 177 active+clean; 206 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 47 KiB/s wr, 65 op/s
Feb 20 10:01:09 np0005625204.localdomain ceph-mon[301857]: mgrmap e52: np0005625202.arwxwo(active, since 12m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 10:01:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1460565932' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:01:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1460565932' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:01:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3375971377' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:01:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:09 np0005625204.localdomain sshd[322289]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:01:09 np0005625204.localdomain sshd[322289]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:01:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "format": "json"}]: dispatch
Feb 20 10:01:11 np0005625204.localdomain ceph-mon[301857]: pgmap v542: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 44 KiB/s rd, 130 KiB/s wr, 72 op/s
Feb 20 10:01:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:11 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:01:11.495 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 10:01:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:12.393 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:12.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:12.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5045 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:12.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:12.396 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:12.397 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:12 np0005625204.localdomain sshd[322291]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e248 e248: 6 total, 6 up, 6 in
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: pgmap v543: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 80 KiB/s wr, 53 op/s
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb,allow rw path=/volumes/_nogroup/8be201ef-8dd5-4872-91e4-0290b94f4b6d/d3136894-05be-4407-a87e-f697b6b3efd1", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5,allow rw pool=manila_data namespace=fsvolumens_8be201ef-8dd5-4872-91e4-0290b94f4b6d"]} : dispatch
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb,allow rw path=/volumes/_nogroup/8be201ef-8dd5-4872-91e4-0290b94f4b6d/d3136894-05be-4407-a87e-f697b6b3efd1", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5,allow rw pool=manila_data namespace=fsvolumens_8be201ef-8dd5-4872-91e4-0290b94f4b6d"]}]': finished
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 20 10:01:13 np0005625204.localdomain ceph-mon[301857]: osdmap e248: 6 total, 6 up, 6 in
Feb 20 10:01:13 np0005625204.localdomain sshd[322291]: Invalid user michele from 103.191.14.210 port 53302
Feb 20 10:01:14 np0005625204.localdomain sshd[322291]: Received disconnect from 103.191.14.210 port 53302:11: Bye Bye [preauth]
Feb 20 10:01:14 np0005625204.localdomain sshd[322291]: Disconnected from invalid user michele 103.191.14.210 port 53302 [preauth]
Feb 20 10:01:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:01:14 np0005625204.localdomain podman[322293]: 2026-02-20 10:01:14.37514635 +0000 UTC m=+0.090383504 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 20 10:01:14 np0005625204.localdomain podman[322293]: 2026-02-20 10:01:14.393021784 +0000 UTC m=+0.108258958 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 20 10:01:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:01:14 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e249 e249: 6 total, 6 up, 6 in
Feb 20 10:01:14 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:01:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "auth_id": "bob", "tenant_id": "8e04bc360fa14db4a793bc5de7a0a299", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:01:14 np0005625204.localdomain podman[322311]: 2026-02-20 10:01:14.489925657 +0000 UTC m=+0.083995360 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Feb 20 10:01:14 np0005625204.localdomain podman[322311]: 2026-02-20 10:01:14.560042464 +0000 UTC m=+0.154112227 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:01:14 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:01:15 np0005625204.localdomain ceph-mon[301857]: pgmap v545: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 81 KiB/s wr, 54 op/s
Feb 20 10:01:15 np0005625204.localdomain ceph-mon[301857]: osdmap e249: 6 total, 6 up, 6 in
Feb 20 10:01:15 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e250 e250: 6 total, 6 up, 6 in
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: osdmap e250: 6 total, 6 up, 6 in
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: pgmap v548: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 35 KiB/s wr, 71 op/s
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "auth_id": "bob", "format": "json"}]: dispatch
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5"]} : dispatch
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/500c5eac-e7f6-4365-974d-923d6976cbd5/25e42ac7-68f5-4e72-8fae-18ed5a220bcb", "osd", "allow rw pool=manila_data namespace=fsvolumens_500c5eac-e7f6-4365-974d-923d6976cbd5"]}]': finished
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "auth_id": "bob", "format": "json"}]: dispatch
Feb 20 10:01:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:17.398 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:17.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:17.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:17.400 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:17.417 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:17.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:17 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1384276159' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:01:17 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1384276159' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:01:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:01:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:01:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:01:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:01:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:01:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18358 "" "Go-http-client/1.1"
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.321 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.326 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fc0f878e-2266-47f8-b48a-659ec4f14efc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.322275', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19c705d0-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '3568909fd73a0d0e57e5812a6e4f8bc9dc3a3776ec3e0fc29e2553d4c80c0331'}]}, 'timestamp': '2026-02-20 10:01:18.327044', '_unique_id': 'c9aa78174bdc4a6e9cc5dddffd597d15'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.328 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.329 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.330 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '25578c2c-45ed-4df3-b693-36ddce741b38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.330217', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19c79806-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '089e6df68f40619e40920f696b7fb41614e3cbf54edbdd8cb4b6ce26a27dce1d'}]}, 'timestamp': '2026-02-20 10:01:18.330772', '_unique_id': 'f83e8445e94e49b39f2c805afdcc6de3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.331 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.360 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.361 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5d7a547-1189-479f-8fa8-28b4a188eb7f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.333003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19cc3884-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'c0536afcfac39382aecad9fc7dabe368aa66a9024d51d5c3bbdd20153d1eae59'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.333003', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cc4eb4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'b5a262126a489fc8db61acaf40f0593491e62e5c4222c5545f30a99549f987fb'}]}, 'timestamp': '2026-02-20 10:01:18.361590', '_unique_id': 'e951c0696c05409ab628459ccbc76d02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.362 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.364 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6d72a56-af11-4579-88f2-5788aa4307e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.364392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19ccd0e6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '87315e7022be5ec22d4a966aa4147a343a36c7f1fdcdd265ee53d22c2dd84992'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.364392', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cce252-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '24254e51b70c7e7d49c79eb08195e60dd0882c91f7114c4f93e6607cfaab5828'}]}, 'timestamp': '2026-02-20 10:01:18.365397', '_unique_id': 'dd066772099241fa98ece0de0a00cd17'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.366 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.367 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.367 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.368 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4c42d48-e74d-4565-a66f-f83ccb4e3f61', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.367701', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19cd4ff8-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '076c7f73906380475a7b4ebe24ef1eeeadcebce1ae7b95c1c1b07733c19bb8de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.367701', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cd607e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '66751623b0588b8c3aff6df1b25fb6864fedebc95c8e5733b9719c140bf42648'}]}, 'timestamp': '2026-02-20 10:01:18.368682', '_unique_id': '63da96fd5e834f8abeb82936f6d3074c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.369 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.370 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '20f357e3-a487-41a5-988a-e2f22faa529b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.370933', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19cdce88-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '9cc17a93d06fef89893835990cf1d971bc6c6a7546ea7c32d54bab23dc8a2c3e'}]}, 'timestamp': '2026-02-20 10:01:18.371426', '_unique_id': 'a0d548b93f43414cacc0e4649cf94b00'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.372 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.373 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.373 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e583d93-ecc0-4b3e-90b9-a781e5b1a57b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.373785', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19ce3f8a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '71cb6bd37cda3c1aa8c1c4b9b95e5af73e64110d77f5e2638759d54f33c2e520'}]}, 'timestamp': '2026-02-20 10:01:18.374325', '_unique_id': '7016c2d6f3c44bd2be60f74fa6993604'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.375 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.376 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.376 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '777ae504-7d3e-4530-832b-e3bf56f62268', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.376874', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19ceb690-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': 'd990bfe87c888eddbec86e1f2aee5198882b83a168ae1adecdf05735f6859ca1'}]}, 'timestamp': '2026-02-20 10:01:18.377415', '_unique_id': '0aac0b11e44f4912853d41e400a9f7cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.378 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.379 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.379 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.380 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '728e4541-d2c3-4cd4-9ab7-5efd98e45b3d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.379884', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19cf2c06-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '30a9b0029950a77850f9d7e26fd22f0f299c0b103828a15ce689653b7bf092cb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.379884', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19cf3cc8-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'bb0bf2f40c1fded486f98d76c81879fba81de77c8502cb99e848d1d4c5238153'}]}, 'timestamp': '2026-02-20 10:01:18.380827', '_unique_id': '1ea80a1688e046c5b452206fdc40d586'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.381 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.382 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.383 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '974d112f-b86a-4c33-9152-dc5ad90bfa24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.383038', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19cfa67c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '4e783cfe5259bfa2effb8fc868ada59038f423faa59577d7c2a35574ee2e3fdc'}]}, 'timestamp': '2026-02-20 10:01:18.383506', '_unique_id': '03acec0b5a6a4fbebf2f74b2d913f87f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.384 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.385 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.395 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.396 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be672f60-7189-4021-a2c9-ff510744b02f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.385657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d19a18-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': 'ddc71601dbab6a5b5ffeff7c057d5bc86e30ccda093cb6981e0106147525ca96'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.385657', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d1ab48-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '5f41f31d8df78df448604c95d817b40fb4617e54f90c1c38314c71447b69c9fa'}]}, 'timestamp': '2026-02-20 10:01:18.396845', '_unique_id': 'a9f93066114045eaa5f99cb48e041027'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.397 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.399 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a7af843c-a626-4541-8522-7640854138e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.399756', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d236e4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': 'a6ea6e53ce71a145e82cedcc098e88d012fba96d8e8a1a717869cdb3c904dd5a'}]}, 'timestamp': '2026-02-20 10:01:18.400392', '_unique_id': 'd1419ff9453f45e3a17995eec0da61ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.401 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.403 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8a25ee31-3f7f-403b-a4d9-fefab8ab5af2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.402892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d2adf4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': 'e6224391fac70279059422152e04effa36950e2dd3442d018d3340a57d1638cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.402892', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d2bde4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '05c082d8b153b9e6756de67cb9cce889c9921fb33a0f0eadaed57043d2ab35c6'}]}, 'timestamp': '2026-02-20 10:01:18.403766', '_unique_id': '698f7511a5df46d99d29421dcfbd6723'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.406 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.406 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c17d3bd3-8d42-42fe-8b9d-9dfcf57e2aa2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.406030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d32892-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '70f136de0b70bbcc921fd367d466f24e98abd5d1c6631d02410d95295d971a53'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.406030', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d3399a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '3822c1e081ad6128a972a423616f2ef054167b146ab39c9b9043314ef15480e2'}]}, 'timestamp': '2026-02-20 10:01:18.406903', '_unique_id': 'c4c331f009834f17be17b2f18bb928d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.407 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.408 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.423 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 19490000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b48210cc-0f0b-4560-bda3-c0447455412a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 19490000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:01:18.409011', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '19d5d7cc-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.662670307, 'message_signature': '32543b785726a3389c018482c3ce5129a68d625593a29b079f39b1d779f1f112'}]}, 'timestamp': '2026-02-20 10:01:18.424080', '_unique_id': '2a18c618657d465095bb199bfc2937c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.425 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.426 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.426 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.426 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35dcce18-1e32-4140-a9fc-9b1fc5768c01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:01:18.426326', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '19d64126-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.662670307, 'message_signature': 'd32d41bf37fcbcf0dd4787fb142f2a96a46d6dbf141689ad59111726b4c157e7'}]}, 'timestamp': '2026-02-20 10:01:18.426804', '_unique_id': '6de8f8a3cec745b089055341735e193e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.427 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.428 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd47b0cd7-c847-4c8f-bc3e-1a452d5ae3c3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.428880', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d6a562-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '84ab443d8540a7e261a9c62893a36baf5ec09535f960d27f00a4171e4a3fc1b6'}]}, 'timestamp': '2026-02-20 10:01:18.429349', '_unique_id': '040a6fe23c3d45c9a03bb8659a47d22c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.431 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '090f1568-9ddd-423a-a585-fb489adc93a4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.431508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d70cbe-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '5de362648e3bcc4f9b8ac5da2ec3194e0871817d8d42f0b03da69da925d31b42'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.431508', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d71cc2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.624880757, 'message_signature': '3c24bcd018c4fa4afaae5b971892a9b7f0bc286d101917fee874af7da7c63c26'}]}, 'timestamp': '2026-02-20 10:01:18.432378', '_unique_id': '9acc2c78cd4f4f799395ab35ca392188'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.433 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.434 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.434 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f58ec99-d937-4aaf-b1d7-308b521d1626', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.434615', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d786bc-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': 'ca801fc0cb8c423f9cc400f4137200bfb9f71a1eb27f9a119b766508e7697776'}]}, 'timestamp': '2026-02-20 10:01:18.435115', '_unique_id': 'e325c7b43af04b18b3634f0a12854be5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.437 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.437 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.438 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff97e4ba-e6e3-4957-9072-4e79bf9b8caa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:01:18.437761', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '19d80024-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '82c9301498c945c5b1147bb86fd1bd7c7192bff63b85887cbe55088a4c686d70'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:01:18.437761', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '19d8100a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.572199131, 'message_signature': '40bc3c16fa0d01b9daabd1fb5255f964625e0b95d26ccaeead6e13fcbf735dea'}]}, 'timestamp': '2026-02-20 10:01:18.438605', '_unique_id': '1d6c943342524338843b1cb2418c5098'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.439 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.440 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c88ca35-d4a4-475d-8e88-5915a563825b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:01:18.440737', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '19d8748c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12237.561473444, 'message_signature': '31c09eed5822bf1dd51e3cd7f18a09c807e06d0a888009dbf8e87a3487f468d4'}]}, 'timestamp': '2026-02-20 10:01:18.441204', '_unique_id': '8606542adc944842bc58f0e9cd235848'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:01:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:01:18.442 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:01:19 np0005625204.localdomain ceph-mon[301857]: pgmap v549: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 31 KiB/s wr, 21 op/s
Feb 20 10:01:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "bob", "format": "json"}]: dispatch
Feb 20 10:01:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 20 10:01:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 20 10:01:20 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Feb 20 10:01:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "auth_id": "bob", "format": "json"}]: dispatch
Feb 20 10:01:21 np0005625204.localdomain ceph-mon[301857]: pgmap v550: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 10 KiB/s rd, 84 KiB/s wr, 24 op/s
Feb 20 10:01:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:22 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:22.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:22.420 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:22.421 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:22.421 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:22.448 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:22.449 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:01:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e251 e251: 6 total, 6 up, 6 in
Feb 20 10:01:23 np0005625204.localdomain podman[322338]: 2026-02-20 10:01:23.129872215 +0000 UTC m=+0.070116271 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 20 10:01:23 np0005625204.localdomain podman[322338]: 2026-02-20 10:01:23.146313146 +0000 UTC m=+0.086557242 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:01:23 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:01:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "format": "json"}]: dispatch
Feb 20 10:01:23 np0005625204.localdomain ceph-mon[301857]: pgmap v551: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 71 KiB/s wr, 55 op/s
Feb 20 10:01:23 np0005625204.localdomain ceph-mon[301857]: osdmap e251: 6 total, 6 up, 6 in
Feb 20 10:01:23 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:01:23Z|00407|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 20 10:01:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "format": "json"}]: dispatch
Feb 20 10:01:24 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8be201ef-8dd5-4872-91e4-0290b94f4b6d", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:25 np0005625204.localdomain ceph-mon[301857]: pgmap v553: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 46 KiB/s wr, 38 op/s
Feb 20 10:01:26 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "snap_name": "607e1fd9-dd9b-4387-8d26-3ad0a4a7fc6e", "format": "json"}]: dispatch
Feb 20 10:01:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:01:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:01:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:01:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:01:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:01:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:01:27 np0005625204.localdomain ceph-mon[301857]: pgmap v554: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 57 KiB/s wr, 33 op/s
Feb 20 10:01:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "format": "json"}]: dispatch
Feb 20 10:01:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "500c5eac-e7f6-4365-974d-923d6976cbd5", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:27.483 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:27.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:27.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:27.484 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:27.485 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:27.488 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:29 np0005625204.localdomain ceph-mon[301857]: pgmap v555: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 57 KiB/s wr, 33 op/s
Feb 20 10:01:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "snap_name": "607e1fd9-dd9b-4387-8d26-3ad0a4a7fc6e_7d12ad99-df51-4085-a46f-32f38bb8f276", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "snap_name": "607e1fd9-dd9b-4387-8d26-3ad0a4a7fc6e", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "format": "json"}]: dispatch
Feb 20 10:01:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:01:30 np0005625204.localdomain podman[322358]: 2026-02-20 10:01:30.87147616 +0000 UTC m=+0.087192482 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:01:30 np0005625204.localdomain podman[322358]: 2026-02-20 10:01:30.909071426 +0000 UTC m=+0.124787748 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 10:01:30 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:01:31 np0005625204.localdomain ceph-mon[301857]: pgmap v556: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 53 KiB/s wr, 33 op/s
Feb 20 10:01:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "format": "json"}]: dispatch
Feb 20 10:01:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:32 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e252 e252: 6 total, 6 up, 6 in
Feb 20 10:01:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "format": "json"}]: dispatch
Feb 20 10:01:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "afaf9e02-272e-4ffc-b272-cdee923c5a1d", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:32.487 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:33 np0005625204.localdomain sshd[322382]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:01:33 np0005625204.localdomain ceph-mon[301857]: pgmap v557: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 52 KiB/s wr, 6 op/s
Feb 20 10:01:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "format": "json"}]: dispatch
Feb 20 10:01:33 np0005625204.localdomain ceph-mon[301857]: osdmap e252: 6 total, 6 up, 6 in
Feb 20 10:01:34 np0005625204.localdomain sshd[322382]: Invalid user admin from 203.228.30.198 port 54400
Feb 20 10:01:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "snap_name": "ddab1265-f000-456b-af11-3f6f3cbbac23", "format": "json"}]: dispatch
Feb 20 10:01:34 np0005625204.localdomain sshd[322382]: Received disconnect from 203.228.30.198 port 54400:11: Bye Bye [preauth]
Feb 20 10:01:34 np0005625204.localdomain sshd[322382]: Disconnected from invalid user admin 203.228.30.198 port 54400 [preauth]
Feb 20 10:01:35 np0005625204.localdomain ceph-mon[301857]: pgmap v559: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 52 KiB/s wr, 6 op/s
Feb 20 10:01:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:01:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/3176bf32-687c-42ee-9751-803dd8a1fea3/40c90f7c-2187-4f07-808e-7e98f84f2bd9", "osd", "allow rw pool=manila_data namespace=fsvolumens_3176bf32-687c-42ee-9751-803dd8a1fea3", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:01:35 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/3176bf32-687c-42ee-9751-803dd8a1fea3/40c90f7c-2187-4f07-808e-7e98f84f2bd9", "osd", "allow rw pool=manila_data namespace=fsvolumens_3176bf32-687c-42ee-9751-803dd8a1fea3", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:01:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9ee8c4a3-7471-4713-8c2c-cbe769628e3f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9ee8c4a3-7471-4713-8c2c-cbe769628e3f", "format": "json"}]: dispatch
Feb 20 10:01:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:01:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:01:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:01:37 np0005625204.localdomain podman[322385]: 2026-02-20 10:01:37.126979572 +0000 UTC m=+0.059301071 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:01:37 np0005625204.localdomain podman[322384]: 2026-02-20 10:01:37.203213498 +0000 UTC m=+0.137156426 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, release=1770267347, version=9.7, architecture=x86_64, container_name=openstack_network_exporter, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 10:01:37 np0005625204.localdomain podman[322384]: 2026-02-20 10:01:37.213367527 +0000 UTC m=+0.147310475 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, release=1770267347, version=9.7, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, architecture=x86_64, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal)
Feb 20 10:01:37 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:01:37 np0005625204.localdomain podman[322385]: 2026-02-20 10:01:37.27440877 +0000 UTC m=+0.206730279 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:01:37 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:01:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "04824cf7-cd0e-41ba-8d21-04797c7215a6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "04824cf7-cd0e-41ba-8d21-04797c7215a6", "format": "json"}]: dispatch
Feb 20 10:01:37 np0005625204.localdomain ceph-mon[301857]: pgmap v560: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 75 KiB/s wr, 7 op/s
Feb 20 10:01:37 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:37.491 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:37.493 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:37.493 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:37.493 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:37.539 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:37.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e253 e253: 6 total, 6 up, 6 in
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "16497a5d-c9a5-4ff1-bddc-189b04ad6dac", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "16497a5d-c9a5-4ff1-bddc-189b04ad6dac", "format": "json"}]: dispatch
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "snap_name": "ddab1265-f000-456b-af11-3f6f3cbbac23_b97127c3-1060-4d01-b54f-99018ca9ac64", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "snap_name": "ddab1265-f000-456b-af11-3f6f3cbbac23", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: pgmap v561: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 75 KiB/s wr, 7 op/s
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9ee8c4a3-7471-4713-8c2c-cbe769628e3f", "format": "json"}]: dispatch
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9ee8c4a3-7471-4713-8c2c-cbe769628e3f", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: osdmap e253: 6 total, 6 up, 6 in
Feb 20 10:01:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "97e63579-f59d-4812-9af1-a8d227932ace", "format": "json"}]: dispatch
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e254 e254: 6 total, 6 up, 6 in
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "97e63579-f59d-4812-9af1-a8d227932ace", "format": "json"}]: dispatch
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: osdmap e254: 6 total, 6 up, 6 in
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:01:39 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "04824cf7-cd0e-41ba-8d21-04797c7215a6", "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "04824cf7-cd0e-41ba-8d21-04797c7215a6", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: pgmap v564: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 280 B/s rd, 57 KiB/s wr, 5 op/s
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "3176bf32-687c-42ee-9751-803dd8a1fea3", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "27c01f2a-f558-4311-a88e-0bf402c95817", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8782d0aa-7804-4d19-9c02-5e9c58554cbc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8782d0aa-7804-4d19-9c02-5e9c58554cbc", "format": "json"}]: dispatch
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "16497a5d-c9a5-4ff1-bddc-189b04ad6dac", "format": "json"}]: dispatch
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "16497a5d-c9a5-4ff1-bddc-189b04ad6dac", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:41 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:42 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "format": "json"}]: dispatch
Feb 20 10:01:42 np0005625204.localdomain ceph-mon[301857]: pgmap v565: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 151 KiB/s wr, 13 op/s
Feb 20 10:01:42 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:42 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "format": "json"}]: dispatch
Feb 20 10:01:42 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:42.540 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:42.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:42.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:42.543 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:42.580 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:42.581 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:43 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 e255: 6 total, 6 up, 6 in
Feb 20 10:01:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "58028d35-31fb-4b5c-a80b-11eaf91192f8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "58028d35-31fb-4b5c-a80b-11eaf91192f8", "format": "json"}]: dispatch
Feb 20 10:01:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:43 np0005625204.localdomain ceph-mon[301857]: osdmap e255: 6 total, 6 up, 6 in
Feb 20 10:01:44 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "8782d0aa-7804-4d19-9c02-5e9c58554cbc", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 20 10:01:44 np0005625204.localdomain ceph-mon[301857]: pgmap v567: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 132 KiB/s wr, 12 op/s
Feb 20 10:01:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:01:45 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:01:45 np0005625204.localdomain podman[322428]: 2026-02-20 10:01:45.155498411 +0000 UTC m=+0.086432587 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:01:45 np0005625204.localdomain systemd[1]: tmp-crun.zXNtak.mount: Deactivated successfully.
Feb 20 10:01:45 np0005625204.localdomain podman[322429]: 2026-02-20 10:01:45.212065398 +0000 UTC m=+0.138104905 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 10:01:45 np0005625204.localdomain podman[322428]: 2026-02-20 10:01:45.224075284 +0000 UTC m=+0.155009500 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 10:01:45 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:01:45 np0005625204.localdomain podman[322429]: 2026-02-20 10:01:45.245181578 +0000 UTC m=+0.171221065 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:01:45 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:01:45 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "snap_name": "62a01af2-9c69-4d90-af43-120db3783b58", "format": "json"}]: dispatch
Feb 20 10:01:45 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:01:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:01:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e325a34-23a3-4cc0-9c2e-4df32d08d36f/82f72010-739f-40e2-832b-6d6c7666a8fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:01:45 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/2e325a34-23a3-4cc0-9c2e-4df32d08d36f/82f72010-739f-40e2-832b-6d6c7666a8fb", "osd", "allow rw pool=manila_data namespace=fsvolumens_2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:01:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:46 np0005625204.localdomain ceph-mon[301857]: pgmap v568: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 167 KiB/s wr, 17 op/s
Feb 20 10:01:46 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "58028d35-31fb-4b5c-a80b-11eaf91192f8", "format": "json"}]: dispatch
Feb 20 10:01:46 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "58028d35-31fb-4b5c-a80b-11eaf91192f8", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:47.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:47.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:47.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:47.584 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:47.630 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:47.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8782d0aa-7804-4d19-9c02-5e9c58554cbc", "format": "json"}]: dispatch
Feb 20 10:01:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8782d0aa-7804-4d19-9c02-5e9c58554cbc", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:01:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:01:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:01:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:01:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:01:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18352 "" "Go-http-client/1.1"
Feb 20 10:01:49 np0005625204.localdomain ceph-mon[301857]: pgmap v569: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 148 KiB/s wr, 15 op/s
Feb 20 10:01:49 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "snap_name": "62a01af2-9c69-4d90-af43-120db3783b58", "target_sub_name": "e59c51a9-964d-4ac9-9a67-d46c3cec7b52", "format": "json"}]: dispatch
Feb 20 10:01:49 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e59c51a9-964d-4ac9-9a67-d46c3cec7b52", "format": "json"}]: dispatch
Feb 20 10:01:49 np0005625204.localdomain sudo[322469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:01:49 np0005625204.localdomain sudo[322469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:01:49 np0005625204.localdomain sudo[322469]: pam_unix(sudo:session): session closed for user root
Feb 20 10:01:49 np0005625204.localdomain sudo[322487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:01:49 np0005625204.localdomain sudo[322487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:01:49 np0005625204.localdomain sudo[322487]: pam_unix(sudo:session): session closed for user root
Feb 20 10:01:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:01:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:01:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:01:50 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:01:50 np0005625204.localdomain sudo[322537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:01:50 np0005625204.localdomain sudo[322537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:01:50 np0005625204.localdomain sudo[322537]: pam_unix(sudo:session): session closed for user root
Feb 20 10:01:51 np0005625204.localdomain ceph-mon[301857]: pgmap v570: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 126 KiB/s wr, 13 op/s
Feb 20 10:01:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:52 np0005625204.localdomain sshd[322555]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:01:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:52.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:01:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:01:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:52.659 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:52.660 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:01:53 np0005625204.localdomain sshd[322555]: Received disconnect from 154.91.170.41 port 54918:11: Bye Bye [preauth]
Feb 20 10:01:53 np0005625204.localdomain sshd[322555]: Disconnected from authenticating user root 154.91.170.41 port 54918 [preauth]
Feb 20 10:01:53 np0005625204.localdomain ceph-mon[301857]: pgmap v571: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 99 KiB/s wr, 10 op/s
Feb 20 10:01:53 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ca81ba90-4b7a-404f-aec6-b73bd51b8d77", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:53 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:01:53 np0005625204.localdomain podman[322557]: 2026-02-20 10:01:53.297282666 +0000 UTC m=+0.070150971 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 10:01:53 np0005625204.localdomain podman[322557]: 2026-02-20 10:01:53.3370743 +0000 UTC m=+0.109942565 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 10:01:53 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:01:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:01:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:01:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:01:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ca81ba90-4b7a-404f-aec6-b73bd51b8d77", "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: pgmap v572: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 970 B/s rd, 94 KiB/s wr, 10 op/s
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "2e325a34-23a3-4cc0-9c2e-4df32d08d36f", "force": true, "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "format": "json"}]: dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:55 np0005625204.localdomain ceph-mon[301857]: mgrmap e53: np0005625202.arwxwo(active, since 13m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 10:01:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "ca81ba90-4b7a-404f-aec6-b73bd51b8d77", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Feb 20 10:01:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:01:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:01:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:01:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:01:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:01:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:01:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:01:57 np0005625204.localdomain sshd[322576]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:01:57 np0005625204.localdomain ceph-mon[301857]: pgmap v573: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 134 KiB/s wr, 13 op/s
Feb 20 10:01:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "817a644a-a040-452f-9ef0-baf961087441", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "817a644a-a040-452f-9ef0-baf961087441", "format": "json"}]: dispatch
Feb 20 10:01:57 np0005625204.localdomain sshd[322576]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:01:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:57.661 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:01:57.664 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:01:58 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:01:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1559371662", "format": "json"} : dispatch
Feb 20 10:01:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1559371662", "caps": ["mds", "allow rw path=/volumes/_nogroup/50918518-d5fc-4598-a9e4-c7aeadda4e5c/790d1933-deae-4553-8b28-0d1b6f0fc428", "osd", "allow rw pool=manila_data namespace=fsvolumens_50918518-d5fc-4598-a9e4-c7aeadda4e5c", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:01:58 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-1559371662", "caps": ["mds", "allow rw path=/volumes/_nogroup/50918518-d5fc-4598-a9e4-c7aeadda4e5c/790d1933-deae-4553-8b28-0d1b6f0fc428", "osd", "allow rw pool=manila_data namespace=fsvolumens_50918518-d5fc-4598-a9e4-c7aeadda4e5c", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b164674c-a82b-4878-a588-09120b66d1e5", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b164674c-a82b-4878-a588-09120b66d1e5", "format": "json"}]: dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: pgmap v574: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 95 KiB/s wr, 9 op/s
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "auth_id": "tempest-cephx-id-1559371662", "tenant_id": "e2c7618200d34da3a2f64f252dae7492", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "auth_id": "tempest-cephx-id-1559371662", "format": "json"}]: dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-1559371662", "format": "json"} : dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-1559371662"} : dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-1559371662"}]': finished
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "auth_id": "tempest-cephx-id-1559371662", "format": "json"}]: dispatch
Feb 20 10:01:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "format": "json"}]: dispatch
Feb 20 10:02:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "50918518-d5fc-4598-a9e4-c7aeadda4e5c", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ca81ba90-4b7a-404f-aec6-b73bd51b8d77", "format": "json"}]: dispatch
Feb 20 10:02:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ca81ba90-4b7a-404f-aec6-b73bd51b8d77", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ac8c96bc-6c82-492e-a472-5bf05b6575ac", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ac8c96bc-6c82-492e-a472-5bf05b6575ac", "format": "json"}]: dispatch
Feb 20 10:02:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:00.739 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:01 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:02:01 np0005625204.localdomain systemd[1]: tmp-crun.IoefkG.mount: Deactivated successfully.
Feb 20 10:02:01 np0005625204.localdomain podman[322578]: 2026-02-20 10:02:01.146210005 +0000 UTC m=+0.085438138 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:02:01 np0005625204.localdomain podman[322578]: 2026-02-20 10:02:01.158813459 +0000 UTC m=+0.098041572 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:02:01 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:02:01 np0005625204.localdomain ceph-mon[301857]: pgmap v575: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 95 KiB/s wr, 9 op/s
Feb 20 10:02:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/b164674c-a82b-4878-a588-09120b66d1e5/72089254-fcf7-474f-aaf6-ba53e49ce9b2", "osd", "allow rw pool=manila_data namespace=fsvolumens_b164674c-a82b-4878-a588-09120b66d1e5", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:02:01 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/b164674c-a82b-4878-a588-09120b66d1e5/72089254-fcf7-474f-aaf6-ba53e49ce9b2", "osd", "allow rw pool=manila_data namespace=fsvolumens_b164674c-a82b-4878-a588-09120b66d1e5", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:02:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/644961451' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/644961451' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "b164674c-a82b-4878-a588-09120b66d1e5", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2402776497' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/644961451' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:02:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/644961451' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:02:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:02.572 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:02:02 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:02.573 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 10:02:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:02.607 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:02.662 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:02.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:02.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:03 np0005625204.localdomain ceph-mon[301857]: pgmap v576: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 162 KiB/s wr, 15 op/s
Feb 20 10:02:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8e58eb7a-11d1-42b2-ad27-1874f562bebd", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8e58eb7a-11d1-42b2-ad27-1874f562bebd", "format": "json"}]: dispatch
Feb 20 10:02:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2401010244' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:04 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ac8c96bc-6c82-492e-a472-5bf05b6575ac", "format": "json"}]: dispatch
Feb 20 10:02:04 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ac8c96bc-6c82-492e-a472-5bf05b6575ac", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:04 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:04 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:02:04 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.767 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.767 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.768 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.768 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:02:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:04.769 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2750696122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.218 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.280 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.281 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: pgmap v577: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 118 KiB/s wr, 11 op/s
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "b164674c-a82b-4878-a588-09120b66d1e5", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "b164674c-a82b-4878-a588-09120b66d1e5", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b164674c-a82b-4878-a588-09120b66d1e5", "format": "json"}]: dispatch
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b164674c-a82b-4878-a588-09120b66d1e5", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2750696122' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.479 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.481 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11230MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.481 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.481 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.708 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.709 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.797 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing inventories for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.933 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating ProviderTree inventory for provider 41976f9f-3656-482f-8ad0-c81e454a3952 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.933 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Updating inventory in ProviderTree for provider 41976f9f-3656-482f-8ad0-c81e454a3952 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.953 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing aggregate associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 20 10:02:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:05.973 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Refreshing trait associations for resource provider 41976f9f-3656-482f-8ad0-c81e454a3952, traits: HW_CPU_X86_SSE2,HW_CPU_X86_AVX,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSSE3,HW_CPU_X86_AVX2,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_FMA3,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NODE,HW_CPU_X86_F16C,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_SECURITY_TPM_2_0,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_ACCELERATORS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_STORAGE_BUS_SATA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_SVM,HW_CPU_X86_MMX,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_SSE42,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_FDC,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_IMAGE_TYPE_ISO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 20 10:02:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:06.008 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:02:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:06.024 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:02:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:06.024 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:02:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:06.025 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:02:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:06 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:02:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2507481714' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:06.490 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:02:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:06.497 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:02:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:06.511 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:02:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:06.514 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:02:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:06.515 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.033s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "format": "json"}]: dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8e58eb7a-11d1-42b2-ad27-1874f562bebd", "format": "json"}]: dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8e58eb7a-11d1-42b2-ad27-1874f562bebd", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: pgmap v578: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 164 KiB/s wr, 16 op/s
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "817a644a-a040-452f-9ef0-baf961087441", "format": "json"}]: dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "817a644a-a040-452f-9ef0-baf961087441", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2507481714' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:02:07 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.515 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.516 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.670 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.714 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.714 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5046 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.715 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.716 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:07.717 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:02:08 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:02:08 np0005625204.localdomain podman[322646]: 2026-02-20 10:02:08.142920602 +0000 UTC m=+0.079944399 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-02-05T04:57:10Z, name=ubi9/ubi-minimal, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, architecture=x86_64, release=1770267347, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7)
Feb 20 10:02:08 np0005625204.localdomain podman[322646]: 2026-02-20 10:02:08.156760324 +0000 UTC m=+0.093784141 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, architecture=x86_64, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1770267347, vcs-type=git, managed_by=edpm_ansible, version=9.7, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 20 10:02:08 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:02:08 np0005625204.localdomain podman[322647]: 2026-02-20 10:02:08.205196202 +0000 UTC m=+0.136243737 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 10:02:08 np0005625204.localdomain podman[322647]: 2026-02-20 10:02:08.216152296 +0000 UTC m=+0.147199881 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 10:02:08 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:02:08 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:02:08 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:08.576 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 10:02:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:08.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:08.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:02:09 np0005625204.localdomain ceph-mon[301857]: pgmap v579: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 112 KiB/s wr, 11 op/s
Feb 20 10:02:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2191019720' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:09.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:09.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:02:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:09.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:02:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:10.047 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:02:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:10.047 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:02:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:10.048 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:02:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:10.048 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:02:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "snap_name": "acd1b19e-a73f-46df-b23c-a5b5d955cb9c", "format": "json"}]: dispatch
Feb 20 10:02:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "d1f2e6a9-4e51-4824-ae46-7f0ab0897ac3", "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "97e63579-f59d-4812-9af1-a8d227932ace", "format": "json"}]: dispatch
Feb 20 10:02:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "97e63579-f59d-4812-9af1-a8d227932ace", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:10 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1875780044' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:02:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:11.406 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:02:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:11.461 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:02:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:11.462 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:02:11 np0005625204.localdomain ceph-mon[301857]: pgmap v580: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 112 KiB/s wr, 11 op/s
Feb 20 10:02:11 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:12.458 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: pgmap v581: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 164 KiB/s wr, 16 op/s
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "d1f2e6a9-4e51-4824-ae46-7f0ab0897ac3", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "snap_name": "ba024a11-8c4d-4adf-9f1b-141c894e0dc3_0b43dd8b-5f2b-4576-9ac7-b968c8ab5c96", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:12 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "snap_name": "ba024a11-8c4d-4adf-9f1b-141c894e0dc3", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:12.717 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:12.719 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "snap_name": "acd1b19e-a73f-46df-b23c-a5b5d955cb9c_200d92f6-c0eb-4bdf-bc03-a206af0c82a7", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "snap_name": "acd1b19e-a73f-46df-b23c-a5b5d955cb9c", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:15 np0005625204.localdomain ceph-mon[301857]: pgmap v582: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 938 B/s rd, 97 KiB/s wr, 10 op/s
Feb 20 10:02:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:02:15 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:02:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:02:16 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:02:16 np0005625204.localdomain podman[322690]: 2026-02-20 10:02:16.13766706 +0000 UTC m=+0.078406963 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:02:16 np0005625204.localdomain podman[322691]: 2026-02-20 10:02:16.184372856 +0000 UTC m=+0.123885942 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:02:16 np0005625204.localdomain podman[322690]: 2026-02-20 10:02:16.205146009 +0000 UTC m=+0.145885822 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 20 10:02:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:02:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "f80fe8d3-ac9e-4618-9a8e-1111bb6ccdac", "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:16 np0005625204.localdomain podman[322691]: 2026-02-20 10:02:16.215170515 +0000 UTC m=+0.154683591 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 10:02:16 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:02:16 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:02:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "format": "json"}]: dispatch
Feb 20 10:02:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5df751b6-6f21-4d96-be03-f1dd0a841066", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "format": "json"}]: dispatch
Feb 20 10:02:17 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e2d3b5dd-fcdf-415f-b600-a30d6ac0ed75", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:17 np0005625204.localdomain ceph-mon[301857]: pgmap v583: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 143 KiB/s wr, 16 op/s
Feb 20 10:02:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e256 e256: 6 total, 6 up, 6 in
Feb 20 10:02:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:02:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:02:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:02:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.720 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.721 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.722 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.723 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:17 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:17.724 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:02:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1"
Feb 20 10:02:18 np0005625204.localdomain ceph-mon[301857]: osdmap e256: 6 total, 6 up, 6 in
Feb 20 10:02:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:02:18 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:02:19 np0005625204.localdomain ceph-mon[301857]: pgmap v585: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 117 KiB/s wr, 13 op/s
Feb 20 10:02:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:19 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "f80fe8d3-ac9e-4618-9a8e-1111bb6ccdac", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:21 np0005625204.localdomain ceph-mon[301857]: pgmap v586: 177 pgs: 177 active+clean; 212 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 117 KiB/s wr, 13 op/s
Feb 20 10:02:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:22 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:02:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:02:22 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:02:22 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:22.726 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:22.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:22.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:02:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:22.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:22.746 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:22 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:22.747 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 e257: 6 total, 6 up, 6 in
Feb 20 10:02:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "af681d3a-0a9b-41df-acfc-a86a81e0ae12", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "af681d3a-0a9b-41df-acfc-a86a81e0ae12", "format": "json"}]: dispatch
Feb 20 10:02:23 np0005625204.localdomain ceph-mon[301857]: pgmap v587: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 99 KiB/s wr, 11 op/s
Feb 20 10:02:23 np0005625204.localdomain ceph-mon[301857]: osdmap e257: 6 total, 6 up, 6 in
Feb 20 10:02:24 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:02:24 np0005625204.localdomain systemd[1]: tmp-crun.na0Jjr.mount: Deactivated successfully.
Feb 20 10:02:24 np0005625204.localdomain podman[322732]: 2026-02-20 10:02:24.166999814 +0000 UTC m=+0.104518410 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 20 10:02:24 np0005625204.localdomain podman[322732]: 2026-02-20 10:02:24.204375934 +0000 UTC m=+0.141894510 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, 
org.label-schema.schema-version=1.0)
Feb 20 10:02:24 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:02:24 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:24.497 264355 INFO neutron.agent.linux.ip_lib [None req-ba4ff21f-2e79-43e0-b9d1-7b69c8ff3641 - - - - - -] Device tap01c64444-62 cannot be used as it has no MAC address
Feb 20 10:02:24 np0005625204.localdomain ceph-mon[301857]: pgmap v589: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 55 KiB/s wr, 5 op/s
Feb 20 10:02:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:24.571 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:24 np0005625204.localdomain kernel: device tap01c64444-62 entered promiscuous mode
Feb 20 10:02:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:24Z|00408|binding|INFO|Claiming lport 01c64444-6242-477e-b8a5-acc9b61d7649 for this chassis.
Feb 20 10:02:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:24Z|00409|binding|INFO|01c64444-6242-477e-b8a5-acc9b61d7649: Claiming unknown
Feb 20 10:02:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:24.582 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:24 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581744.5850] manager: (tap01c64444-62): new Generic device (/org/freedesktop/NetworkManager/Devices/63)
Feb 20 10:02:24 np0005625204.localdomain systemd-udevd[322761]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 10:02:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:24.596 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3d0b83eb9d040b2a1ee21f2d4ef3fce', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d227202d-dab9-4451-8037-c79279634b88, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=01c64444-6242-477e-b8a5-acc9b61d7649) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:02:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:24.599 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 01c64444-6242-477e-b8a5-acc9b61d7649 in datapath 0948d27a-4e54-4f2c-b484-b44317772f0a bound to our chassis
Feb 20 10:02:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:24.601 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port fd0005c6-52cb-4f7c-a677-21aa510edd4c IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 10:02:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:24.602 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0948d27a-4e54-4f2c-b484-b44317772f0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:02:24 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:24.603 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[8546ceba-4c20-4ad9-9bc9-020a6f6311f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:24Z|00410|binding|INFO|Setting lport 01c64444-6242-477e-b8a5-acc9b61d7649 ovn-installed in OVS
Feb 20 10:02:24 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:24Z|00411|binding|INFO|Setting lport 01c64444-6242-477e-b8a5-acc9b61d7649 up in Southbound
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:24.628 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap01c64444-62: No such device
Feb 20 10:02:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:24.668 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:24.704 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:24 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:24.769 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "af681d3a-0a9b-41df-acfc-a86a81e0ae12", "format": "json"}]: dispatch
Feb 20 10:02:25 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "af681d3a-0a9b-41df-acfc-a86a81e0ae12", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:25 np0005625204.localdomain podman[322832]: 
Feb 20 10:02:25 np0005625204.localdomain podman[322832]: 2026-02-20 10:02:25.727468705 +0000 UTC m=+0.094222155 container create 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:02:25 np0005625204.localdomain podman[322832]: 2026-02-20 10:02:25.684739792 +0000 UTC m=+0.051493282 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 10:02:25 np0005625204.localdomain systemd[1]: Started libpod-conmon-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7.scope.
Feb 20 10:02:25 np0005625204.localdomain systemd[1]: tmp-crun.JMgZx7.mount: Deactivated successfully.
Feb 20 10:02:25 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 10:02:25 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4e93c2b209248131595b33d1ed560347197deb6725ee1319908263122cb78ab7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 10:02:25 np0005625204.localdomain podman[322832]: 2026-02-20 10:02:25.830605892 +0000 UTC m=+0.197359342 container init 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 10:02:25 np0005625204.localdomain podman[322832]: 2026-02-20 10:02:25.840558286 +0000 UTC m=+0.207311726 container start 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:02:25 np0005625204.localdomain dnsmasq[322851]: started, version 2.85 cachesize 150
Feb 20 10:02:25 np0005625204.localdomain dnsmasq[322851]: DNS service limited to local subnets
Feb 20 10:02:25 np0005625204.localdomain dnsmasq[322851]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 10:02:25 np0005625204.localdomain dnsmasq[322851]: warning: no upstream servers configured
Feb 20 10:02:25 np0005625204.localdomain dnsmasq-dhcp[322851]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 10:02:25 np0005625204.localdomain dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 0 addresses
Feb 20 10:02:25 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host
Feb 20 10:02:25 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts
Feb 20 10:02:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:25.912 264355 INFO neutron.agent.dhcp.agent [None req-6592b2d0-af6e-446e-a0bc-e8e1c28584e1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:02:24Z, description=, device_id=33ec31b2-fecf-477f-8148-61437b8399e4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5861100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5861250>], id=ca945224-72bd-424a-80e3-db458bb34395, ip_allocation=immediate, mac_address=fa:16:3e:ce:45:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:02:21Z, description=, dns_domain=, id=0948d27a-4e54-4f2c-b484-b44317772f0a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1085427905-network, port_security_enabled=True, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8987, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3715, status=ACTIVE, subnets=['7bc845c7-cd31-4b75-b7fe-a9e74c7b267a'], tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:22Z, vlan_transparent=None, network_id=0948d27a-4e54-4f2c-b484-b44317772f0a, port_security_enabled=False, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3723, status=DOWN, tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:25Z on network 0948d27a-4e54-4f2c-b484-b44317772f0a
Feb 20 10:02:25 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:25.964 264355 INFO neutron.agent.dhcp.agent [None req-eea541f1-666d-447c-af2e-8cd526a60bd9 - - - - - -] DHCP configuration for ports {'acbf0195-d8e8-4b8e-88cb-5f48f4fbccad'} is completed
Feb 20 10:02:26 np0005625204.localdomain dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 1 addresses
Feb 20 10:02:26 np0005625204.localdomain podman[322869]: 2026-02-20 10:02:26.134306198 +0000 UTC m=+0.061482437 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 20 10:02:26 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host
Feb 20 10:02:26 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts
Feb 20 10:02:26 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:26.319 264355 INFO neutron.agent.dhcp.agent [None req-3e5402b6-cae1-4880-9ecd-45409f348abd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:02:24Z, description=, device_id=33ec31b2-fecf-477f-8148-61437b8399e4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df580a9d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df580a610>], id=ca945224-72bd-424a-80e3-db458bb34395, ip_allocation=immediate, mac_address=fa:16:3e:ce:45:e2, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:02:21Z, description=, dns_domain=, id=0948d27a-4e54-4f2c-b484-b44317772f0a, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingNegativeTest-1085427905-network, port_security_enabled=True, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=8987, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3715, status=ACTIVE, subnets=['7bc845c7-cd31-4b75-b7fe-a9e74c7b267a'], tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:22Z, vlan_transparent=None, network_id=0948d27a-4e54-4f2c-b484-b44317772f0a, port_security_enabled=False, project_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3723, status=DOWN, tags=[], tenant_id=b3d0b83eb9d040b2a1ee21f2d4ef3fce, updated_at=2026-02-20T10:02:25Z on network 0948d27a-4e54-4f2c-b484-b44317772f0a
Feb 20 10:02:26 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:26.397 264355 INFO neutron.agent.dhcp.agent [None req-c808fd63-e0d3-48ec-9c48-307b46aa9ff0 - - - - - -] DHCP configuration for ports {'ca945224-72bd-424a-80e3-db458bb34395'} is completed
Feb 20 10:02:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:26 np0005625204.localdomain dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 1 addresses
Feb 20 10:02:26 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host
Feb 20 10:02:26 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts
Feb 20 10:02:26 np0005625204.localdomain podman[322908]: 2026-02-20 10:02:26.546458464 +0000 UTC m=+0.065360716 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:02:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:02:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:02:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:02:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:02:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:02:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:02:26 np0005625204.localdomain ceph-mon[301857]: pgmap v590: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 726 B/s rd, 111 KiB/s wr, 11 op/s
Feb 20 10:02:26 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:26.842 264355 INFO neutron.agent.dhcp.agent [None req-77e057be-5cba-42f8-82f6-5b331cedcdb2 - - - - - -] DHCP configuration for ports {'ca945224-72bd-424a-80e3-db458bb34395'} is completed
Feb 20 10:02:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "format": "json"}]: dispatch
Feb 20 10:02:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:27.747 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:27.750 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: pgmap v591: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 93 KiB/s wr, 9 op/s
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "99dd1946-0a71-4d2e-9886-17bc70e2e74a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "99dd1946-0a71-4d2e-9886-17bc70e2e74a", "format": "json"}]: dispatch
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"} : dispatch
Feb 20 10:02:29 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-408485567", "caps": ["mds", "allow rw path=/volumes/_nogroup/cd0ea541-924c-41c6-95b4-11e7d85bd173/2edc2d31-21fe-4d10-b523-1775f0f273db", "osd", "allow rw pool=manila_data namespace=fsvolumens_cd0ea541-924c-41c6-95b4-11e7d85bd173", "mon", "allow r"], "format": "json"}]': finished
Feb 20 10:02:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "tenant_id": "ad1dd0d43ec7421b9a4a06c39e3657c6", "access_level": "rw", "format": "json"}]: dispatch
Feb 20 10:02:30 np0005625204.localdomain sshd[322928]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:02:31 np0005625204.localdomain sshd[322928]: error: kex_exchange_identification: Connection closed by remote host
Feb 20 10:02:31 np0005625204.localdomain sshd[322928]: Connection closed by 51.158.205.203 port 61000
Feb 20 10:02:31 np0005625204.localdomain ceph-mon[301857]: pgmap v592: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 94 KiB/s wr, 9 op/s
Feb 20 10:02:31 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "73ef4b22-cb69-44e4-9b94-352c732420be", "format": "json"}]: dispatch
Feb 20 10:02:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:32 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:02:32 np0005625204.localdomain podman[322929]: 2026-02-20 10:02:32.152376797 +0000 UTC m=+0.085804520 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 10:02:32 np0005625204.localdomain podman[322929]: 2026-02-20 10:02:32.165043472 +0000 UTC m=+0.098471226 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 10:02:32 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:02:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "99dd1946-0a71-4d2e-9886-17bc70e2e74a", "format": "json"}]: dispatch
Feb 20 10:02:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "99dd1946-0a71-4d2e-9886-17bc70e2e74a", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-408485567", "format": "json"} : dispatch
Feb 20 10:02:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"} : dispatch
Feb 20 10:02:32 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-408485567"}]': finished
Feb 20 10:02:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:32.751 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:32.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:32.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:02:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:32.754 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:32.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:32.790 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:33 np0005625204.localdomain ceph-mon[301857]: pgmap v593: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 108 KiB/s wr, 10 op/s
Feb 20 10:02:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:33 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "auth_id": "tempest-cephx-id-408485567", "format": "json"}]: dispatch
Feb 20 10:02:34 np0005625204.localdomain dnsmasq[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/addn_hosts - 0 addresses
Feb 20 10:02:34 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/host
Feb 20 10:02:34 np0005625204.localdomain dnsmasq-dhcp[322851]: read /var/lib/neutron/dhcp/0948d27a-4e54-4f2c-b484-b44317772f0a/opts
Feb 20 10:02:34 np0005625204.localdomain podman[322967]: 2026-02-20 10:02:34.145276862 +0000 UTC m=+0.064574961 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 20 10:02:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "1e66955e-3f1a-40d4-80db-e506894b4fe7", "format": "json"}]: dispatch
Feb 20 10:02:34 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:34 np0005625204.localdomain kernel: device tap01c64444-62 left promiscuous mode
Feb 20 10:02:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:34.371 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:34Z|00412|binding|INFO|Releasing lport 01c64444-6242-477e-b8a5-acc9b61d7649 from this chassis (sb_readonly=0)
Feb 20 10:02:34 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:34Z|00413|binding|INFO|Setting lport 01c64444-6242-477e-b8a5-acc9b61d7649 down in Southbound
Feb 20 10:02:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:34.380 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0948d27a-4e54-4f2c-b484-b44317772f0a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3d0b83eb9d040b2a1ee21f2d4ef3fce', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d227202d-dab9-4451-8037-c79279634b88, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=01c64444-6242-477e-b8a5-acc9b61d7649) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:02:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:34.381 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 01c64444-6242-477e-b8a5-acc9b61d7649 in datapath 0948d27a-4e54-4f2c-b484-b44317772f0a unbound from our chassis
Feb 20 10:02:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:34.384 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0948d27a-4e54-4f2c-b484-b44317772f0a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:02:34 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:02:34.385 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[59e0f160-6a0e-4f67-98c6-92f5ed29b994]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:02:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:34.393 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:34 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:34.395 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:35 np0005625204.localdomain ceph-mon[301857]: pgmap v594: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 388 B/s rd, 103 KiB/s wr, 9 op/s
Feb 20 10:02:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "format": "json"}]: dispatch
Feb 20 10:02:35 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:35Z|00414|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:02:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:35.471 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:36 np0005625204.localdomain dnsmasq[322851]: exiting on receipt of SIGTERM
Feb 20 10:02:36 np0005625204.localdomain podman[323007]: 2026-02-20 10:02:36.002410075 +0000 UTC m=+0.061172538 container kill 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 10:02:36 np0005625204.localdomain systemd[1]: libpod-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7.scope: Deactivated successfully.
Feb 20 10:02:36 np0005625204.localdomain podman[323020]: 2026-02-20 10:02:36.075256158 +0000 UTC m=+0.055573847 container died 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 10:02:36 np0005625204.localdomain systemd[1]: tmp-crun.YKA1tR.mount: Deactivated successfully.
Feb 20 10:02:36 np0005625204.localdomain podman[323020]: 2026-02-20 10:02:36.118792476 +0000 UTC m=+0.099110125 container cleanup 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 10:02:36 np0005625204.localdomain systemd[1]: libpod-conmon-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7.scope: Deactivated successfully.
Feb 20 10:02:36 np0005625204.localdomain podman[323021]: 2026-02-20 10:02:36.196175607 +0000 UTC m=+0.171781092 container remove 798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0948d27a-4e54-4f2c-b484-b44317772f0a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 10:02:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:36.228 264355 INFO neutron.agent.dhcp.agent [None req-892f4de4-d440-436e-8e71-26a59f2eae94 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:02:36 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:02:36.228 264355 INFO neutron.agent.dhcp.agent [None req-892f4de4-d440-436e-8e71-26a59f2eae94 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:02:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "format": "json"}]: dispatch
Feb 20 10:02:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cd0ea541-924c-41c6-95b4-11e7d85bd173", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-4e93c2b209248131595b33d1ed560347197deb6725ee1319908263122cb78ab7-merged.mount: Deactivated successfully.
Feb 20 10:02:37 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-798fb327be5422ce43bec4b339dd015800a053ac8c0dcb48df0f2ff8bfb5c2f7-userdata-shm.mount: Deactivated successfully.
Feb 20 10:02:37 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d0948d27a\x2d4e54\x2d4f2c\x2db484\x2db44317772f0a.mount: Deactivated successfully.
Feb 20 10:02:37 np0005625204.localdomain ceph-mon[301857]: pgmap v595: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 125 KiB/s wr, 12 op/s
Feb 20 10:02:37 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:37.824 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "snap_name": "8b9be1a2-e688-4f67-bcb5-d692465b7436", "format": "json"}]: dispatch
Feb 20 10:02:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "1e66955e-3f1a-40d4-80db-e506894b4fe7_5d64b720-4a43-4716-92a1-dde09c619e84", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "1e66955e-3f1a-40d4-80db-e506894b4fe7", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:02:39 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:02:39 np0005625204.localdomain podman[323051]: 2026-02-20 10:02:39.161515573 +0000 UTC m=+0.096650690 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.component=ubi9-minimal-container, version=9.7, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1770267347, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 10:02:39 np0005625204.localdomain podman[323051]: 2026-02-20 10:02:39.203280447 +0000 UTC m=+0.138415614 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.expose-services=, architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1770267347, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, version=9.7, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, org.opencontainers.image.created=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 10:02:39 np0005625204.localdomain systemd[1]: tmp-crun.eUASep.mount: Deactivated successfully.
Feb 20 10:02:39 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:02:39 np0005625204.localdomain podman[323052]: 2026-02-20 10:02:39.226689621 +0000 UTC m=+0.158742004 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 10:02:39 np0005625204.localdomain podman[323052]: 2026-02-20 10:02:39.26435275 +0000 UTC m=+0.196405193 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:02:39 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:02:39 np0005625204.localdomain ceph-mon[301857]: pgmap v596: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 84 KiB/s wr, 7 op/s
Feb 20 10:02:39 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e59c51a9-964d-4ac9-9a67-d46c3cec7b52", "format": "json"}]: dispatch
Feb 20 10:02:41 np0005625204.localdomain ceph-mon[301857]: pgmap v597: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 84 KiB/s wr, 8 op/s
Feb 20 10:02:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:42 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:42.826 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:42.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:42.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:02:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:42.827 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:42.856 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:42.857 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:02:43 np0005625204.localdomain ceph-mon[301857]: pgmap v598: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 110 KiB/s wr, 10 op/s
Feb 20 10:02:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e59c51a9-964d-4ac9-9a67-d46c3cec7b52", "format": "json"}]: dispatch
Feb 20 10:02:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "55bcf7f1-01f4-42f0-8f90-e7ef14438392", "format": "json"}]: dispatch
Feb 20 10:02:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "snap_name": "8b9be1a2-e688-4f67-bcb5-d692465b7436_5acfc6c8-db4e-4899-8b99-eb2dcdeecf08", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "snap_name": "8b9be1a2-e688-4f67-bcb5-d692465b7436", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:43.808 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:44 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e59c51a9-964d-4ac9-9a67-d46c3cec7b52", "format": "json"}]: dispatch
Feb 20 10:02:44 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e59c51a9-964d-4ac9-9a67-d46c3cec7b52", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:45 np0005625204.localdomain ceph-mon[301857]: pgmap v599: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 61 KiB/s wr, 6 op/s
Feb 20 10:02:45 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "format": "json"}]: dispatch
Feb 20 10:02:45 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "61e1c482-9c73-4bad-8fd8-0dc280a18a86", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:46 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "55bcf7f1-01f4-42f0-8f90-e7ef14438392_264ca94b-ea48-4597-a2c2-a0904eedca44", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:46 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "55bcf7f1-01f4-42f0-8f90-e7ef14438392", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:46.631 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:46 np0005625204.localdomain sshd[323094]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:02:46 np0005625204.localdomain sshd[323094]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:02:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:02:47 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:02:47 np0005625204.localdomain podman[323096]: 2026-02-20 10:02:47.097307882 +0000 UTC m=+0.087591914 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:02:47 np0005625204.localdomain podman[323097]: 2026-02-20 10:02:47.156018323 +0000 UTC m=+0.141783747 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 20 10:02:47 np0005625204.localdomain podman[323096]: 2026-02-20 10:02:47.172148045 +0000 UTC m=+0.162432107 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 20 10:02:47 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:02:47 np0005625204.localdomain podman[323097]: 2026-02-20 10:02:47.19230869 +0000 UTC m=+0.178074114 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:02:47 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: pgmap v600: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 97 KiB/s wr, 8 op/s
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4d276c40-22b5-4463-ba95-b271179ed697", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4d276c40-22b5-4463-ba95-b271179ed697", "format": "json"}]: dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "snap_name": "62a01af2-9c69-4d90-af43-120db3783b58_ffe8c3f4-7f8d-4e50-9ffc-1b3c43840e07", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "snap_name": "62a01af2-9c69-4d90-af43-120db3783b58", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "format": "json"}]: dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:02:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e258 e258: 6 total, 6 up, 6 in
Feb 20 10:02:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:02:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:02:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:02:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:02:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:02:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18353 "" "Go-http-client/1.1"
Feb 20 10:02:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:47.889 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:48 np0005625204.localdomain ceph-mon[301857]: osdmap e258: 6 total, 6 up, 6 in
Feb 20 10:02:48 np0005625204.localdomain ceph-mon[301857]: pgmap v602: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 74 KiB/s wr, 6 op/s
Feb 20 10:02:48 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "88e0b4cc-0c15-4363-8336-4c11c356e179", "format": "json"}]: dispatch
Feb 20 10:02:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e259 e259: 6 total, 6 up, 6 in
Feb 20 10:02:48 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 20 10:02:49 np0005625204.localdomain ceph-mon[301857]: osdmap e259: 6 total, 6 up, 6 in
Feb 20 10:02:50 np0005625204.localdomain sudo[323140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:02:50 np0005625204.localdomain sudo[323140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:02:50 np0005625204.localdomain sudo[323140]: pam_unix(sudo:session): session closed for user root
Feb 20 10:02:50 np0005625204.localdomain sudo[323158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:02:50 np0005625204.localdomain sudo[323158]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:02:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "4d276c40-22b5-4463-ba95-b271179ed697", "snap_name": "22d02fe9-b677-4d4d-ab93-02062be24947", "format": "json"}]: dispatch
Feb 20 10:02:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "format": "json"}]: dispatch
Feb 20 10:02:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5f3b4204-930b-4bc2-9cff-7e982286eac6", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:50 np0005625204.localdomain ceph-mon[301857]: pgmap v604: 177 pgs: 177 active+clean; 215 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 53 KiB/s wr, 4 op/s
Feb 20 10:02:50 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "snap_name": "f9be8701-f7e0-4195-b63f-b542fb246c4d", "format": "json"}]: dispatch
Feb 20 10:02:51 np0005625204.localdomain sudo[323158]: pam_unix(sudo:session): session closed for user root
Feb 20 10:02:51 np0005625204.localdomain sudo[323207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:02:51 np0005625204.localdomain sudo[323207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:02:51 np0005625204.localdomain sudo[323207]: pam_unix(sudo:session): session closed for user root
Feb 20 10:02:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:02:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:02:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:02:51 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:02:52 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:02:52Z|00415|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:02:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:52.226 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:52 np0005625204.localdomain ceph-mon[301857]: pgmap v605: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 143 KiB/s wr, 12 op/s
Feb 20 10:02:52 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "88e0b4cc-0c15-4363-8336-4c11c356e179_b7815798-7284-4b52-8029-4e2d096c3965", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:52 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "88e0b4cc-0c15-4363-8336-4c11c356e179", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:52 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:52.925 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:02:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e260 e260: 6 total, 6 up, 6 in
Feb 20 10:02:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e261 e261: 6 total, 6 up, 6 in
Feb 20 10:02:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4d276c40-22b5-4463-ba95-b271179ed697", "snap_name": "22d02fe9-b677-4d4d-ab93-02062be24947_54d3ff10-d9fa-4f52-83b0-fa0cfd9ed293", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:54 np0005625204.localdomain ceph-mon[301857]: osdmap e260: 6 total, 6 up, 6 in
Feb 20 10:02:54 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "4d276c40-22b5-4463-ba95-b271179ed697", "snap_name": "22d02fe9-b677-4d4d-ab93-02062be24947", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:02:55 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:02:55 np0005625204.localdomain podman[323225]: 2026-02-20 10:02:55.151846555 +0000 UTC m=+0.084702136 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:02:55 np0005625204.localdomain podman[323225]: 2026-02-20 10:02:55.164924954 +0000 UTC m=+0.097780485 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 20 10:02:55 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e262 e262: 6 total, 6 up, 6 in
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: pgmap v607: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 119 KiB/s wr, 11 op/s
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: osdmap e261: 6 total, 6 up, 6 in
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "snap_name": "f9be8701-f7e0-4195-b63f-b542fb246c4d_982c66f3-6550-45d5-a41b-e698844f2247", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: mgrmap e54: np0005625202.arwxwo(active, since 14m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 10:02:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:02:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:02:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:02:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:02:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:02:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:02:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e263 e263: 6 total, 6 up, 6 in
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "snap_name": "f9be8701-f7e0-4195-b63f-b542fb246c4d", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "faea6402-68f1-475f-9cea-5f24a4d2c2b9", "format": "json"}]: dispatch
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: pgmap v609: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.7 KiB/s rd, 177 KiB/s wr, 16 op/s
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: osdmap e262: 6 total, 6 up, 6 in
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4d276c40-22b5-4463-ba95-b271179ed697", "format": "json"}]: dispatch
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4d276c40-22b5-4463-ba95-b271179ed697", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:57 np0005625204.localdomain ceph-mon[301857]: osdmap e263: 6 total, 6 up, 6 in
Feb 20 10:02:57 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:02:57.927 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:02:58 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e264 e264: 6 total, 6 up, 6 in
Feb 20 10:02:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "format": "json"}]: dispatch
Feb 20 10:02:59 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "e03d5299-b683-42d5-861f-4ae99be1be37", "force": true, "format": "json"}]: dispatch
Feb 20 10:02:59 np0005625204.localdomain ceph-mon[301857]: pgmap v612: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 905 B/s rd, 115 KiB/s wr, 9 op/s
Feb 20 10:02:59 np0005625204.localdomain ceph-mon[301857]: osdmap e264: 6 total, 6 up, 6 in
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "faea6402-68f1-475f-9cea-5f24a4d2c2b9_25b67a37-0482-4bde-b478-a143700074f7", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "faea6402-68f1-475f-9cea-5f24a4d2c2b9", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.223325) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780223374, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 2576, "num_deletes": 259, "total_data_size": 4391198, "memory_usage": 4562016, "flush_reason": "Manual Compaction"}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780237126, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 2872263, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29270, "largest_seqno": 31841, "table_properties": {"data_size": 2861951, "index_size": 6369, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25539, "raw_average_key_size": 22, "raw_value_size": 2839917, "raw_average_value_size": 2478, "num_data_blocks": 273, "num_entries": 1146, "num_filter_entries": 1146, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581663, "oldest_key_time": 1771581663, "file_creation_time": 1771581780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 13860 microseconds, and 7323 cpu microseconds.
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.237180) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 2872263 bytes OK
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.237210) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.239469) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.239492) EVENT_LOG_v1 {"time_micros": 1771581780239485, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.239518) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 4378955, prev total WAL file size 4378955, number of live WAL files 2.
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.240725) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(2804KB)], [45(17MB)]
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780240775, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 20918796, "oldest_snapshot_seqno": -1}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 14308 keys, 19489804 bytes, temperature: kUnknown
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780326751, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 19489804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19406886, "index_size": 46081, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35781, "raw_key_size": 381426, "raw_average_key_size": 26, "raw_value_size": 19162839, "raw_average_value_size": 1339, "num_data_blocks": 1735, "num_entries": 14308, "num_filter_entries": 14308, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.327114) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 19489804 bytes
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.329374) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 243.0 rd, 226.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 17.2 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(14.1) write-amplify(6.8) OK, records in: 14849, records dropped: 541 output_compression: NoCompression
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.329409) EVENT_LOG_v1 {"time_micros": 1771581780329391, "job": 26, "event": "compaction_finished", "compaction_time_micros": 86093, "compaction_time_cpu_micros": 53580, "output_level": 6, "num_output_files": 1, "total_output_size": 19489804, "num_input_records": 14849, "num_output_records": 14308, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780330004, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581780332810, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.240596) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332924) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332930) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332934) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332937) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:00 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:00.332940) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:01 np0005625204.localdomain ceph-mon[301857]: pgmap v614: 177 pgs: 177 active+clean; 216 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 777 B/s rd, 100 KiB/s wr, 9 op/s
Feb 20 10:03:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:01.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:02 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/536207374' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:03:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/536207374' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:03:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e265 e265: 6 total, 6 up, 6 in
Feb 20 10:03:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:03:02 np0005625204.localdomain podman[323244]: 2026-02-20 10:03:02.65452737 +0000 UTC m=+0.089957156 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:03:02 np0005625204.localdomain podman[323244]: 2026-02-20 10:03:02.669214648 +0000 UTC m=+0.104644464 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:03:02 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:03:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:02.959 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:02.960 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:02.961 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5028 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:03:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:02.961 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:02.965 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:02.965 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e266 e266: 6 total, 6 up, 6 in
Feb 20 10:03:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ccc69125-8271-465f-a7cf-99b18598188c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:03 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ccc69125-8271-465f-a7cf-99b18598188c", "format": "json"}]: dispatch
Feb 20 10:03:03 np0005625204.localdomain ceph-mon[301857]: pgmap v615: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 100 KiB/s wr, 7 op/s
Feb 20 10:03:03 np0005625204.localdomain ceph-mon[301857]: osdmap e265: 6 total, 6 up, 6 in
Feb 20 10:03:03 np0005625204.localdomain ceph-mon[301857]: osdmap e266: 6 total, 6 up, 6 in
Feb 20 10:03:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:03.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:04 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "7729a724-86fe-4d43-a1c1-675788136bbb", "format": "json"}]: dispatch
Feb 20 10:03:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2069351798' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:05.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:05.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:03:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:05.745 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:03:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:05.746 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:03:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:05.747 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:03:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:05.747 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:03:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:06.026 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:03:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:03:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:06.028 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:03:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:03:06 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1846256840' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.184 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.249 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.250 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.490 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.493 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11205MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.493 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.494 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:03:06 np0005625204.localdomain ceph-mon[301857]: pgmap v618: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 100 KiB/s wr, 7 op/s
Feb 20 10:03:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/495270293' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.565 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.566 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.566 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:03:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:06.597 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:03:07 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:03:07 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2235935279' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.051 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.057 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.079 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.081 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.082 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:03:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:07.091 264355 INFO neutron.agent.linux.ip_lib [None req-313a2c9a-588b-4ef7-9f9b-edba28ce1bc5 - - - - - -] Device tap736aea51-80 cannot be used as it has no MAC address
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.114 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:07 np0005625204.localdomain kernel: device tap736aea51-80 entered promiscuous mode
Feb 20 10:03:07 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581787.1205] manager: (tap736aea51-80): new Generic device (/org/freedesktop/NetworkManager/Devices/64)
Feb 20 10:03:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:07Z|00416|binding|INFO|Claiming lport 736aea51-8061-4aa8-b593-31839f1f2534 for this chassis.
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.123 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:07Z|00417|binding|INFO|736aea51-8061-4aa8-b593-31839f1f2534: Claiming unknown
Feb 20 10:03:07 np0005625204.localdomain systemd-udevd[323322]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.131 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a47e12e114b4e778ff94aca4e5dad8b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88fdeb8d-ebba-4734-9a0b-e4eba3120811, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=736aea51-8061-4aa8-b593-31839f1f2534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.133 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 736aea51-8061-4aa8-b593-31839f1f2534 in datapath fd32aaea-98e9-4dfa-ad52-36d30939560e bound to our chassis
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.135 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 30127f4b-6d7e-49a7-8ad3-e1d1e43df164 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.135 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd32aaea-98e9-4dfa-ad52-36d30939560e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.136 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d9648fb4-ae99-438a-973e-97da56528624]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:07Z|00418|binding|INFO|Setting lport 736aea51-8061-4aa8-b593-31839f1f2534 ovn-installed in OVS
Feb 20 10:03:07 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:07Z|00419|binding|INFO|Setting lport 736aea51-8061-4aa8-b593-31839f1f2534 up in Southbound
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.170 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap736aea51-80: No such device
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.213 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.243 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:07 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "ccc69125-8271-465f-a7cf-99b18598188c", "snap_name": "7873ec63-b44a-47a7-8bbe-8f944c5b9a9d", "format": "json"}]: dispatch
Feb 20 10:03:07 np0005625204.localdomain ceph-mon[301857]: pgmap v619: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.3 KiB/s rd, 124 KiB/s wr, 9 op/s
Feb 20 10:03:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1846256840' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2235935279' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.738 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:03:07 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:07.740 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.774 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:07.971 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:08.002 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:08.082 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:08.082 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:08.083 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:08 np0005625204.localdomain podman[323394]: 
Feb 20 10:03:08 np0005625204.localdomain podman[323394]: 2026-02-20 10:03:08.137164131 +0000 UTC m=+0.087233233 container create 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:03:08 np0005625204.localdomain podman[323394]: 2026-02-20 10:03:08.087258278 +0000 UTC m=+0.037327410 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 10:03:08 np0005625204.localdomain systemd[1]: Started libpod-conmon-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf.scope.
Feb 20 10:03:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e267 e267: 6 total, 6 up, 6 in
Feb 20 10:03:08 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 10:03:08 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/febc23c981ad25c134ca1cbca59507084ea6219789b24f92013e13e04132bbc8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 10:03:08 np0005625204.localdomain podman[323394]: 2026-02-20 10:03:08.226379263 +0000 UTC m=+0.176448365 container init 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:03:08 np0005625204.localdomain podman[323394]: 2026-02-20 10:03:08.238015908 +0000 UTC m=+0.188085050 container start 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:03:08 np0005625204.localdomain systemd[1]: tmp-crun.Ne9YZP.mount: Deactivated successfully.
Feb 20 10:03:08 np0005625204.localdomain dnsmasq[323412]: started, version 2.85 cachesize 150
Feb 20 10:03:08 np0005625204.localdomain dnsmasq[323412]: DNS service limited to local subnets
Feb 20 10:03:08 np0005625204.localdomain dnsmasq[323412]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 10:03:08 np0005625204.localdomain dnsmasq[323412]: warning: no upstream servers configured
Feb 20 10:03:08 np0005625204.localdomain dnsmasq-dhcp[323412]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 10:03:08 np0005625204.localdomain dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 0 addresses
Feb 20 10:03:08 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host
Feb 20 10:03:08 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts
Feb 20 10:03:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:08.391 264355 INFO neutron.agent.dhcp.agent [None req-d1d35632-d433-4be7-a245-09c192536fe6 - - - - - -] DHCP configuration for ports {'1344df8e-cf17-4eaf-a761-77690035f079'} is completed
Feb 20 10:03:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:08.411 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:08Z, description=, device_id=91bb17b9-dbc2-4da3-ba37-b6215b1cc229, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df58999a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df57b6460>], id=96452e5e-5c94-4f20-aee2-527568a64031, ip_allocation=immediate, mac_address=fa:16:3e:5f:7d:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:05Z, description=, dns_domain=, id=fd32aaea-98e9-4dfa-ad52-36d30939560e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-664160433-network, port_security_enabled=True, project_id=0a47e12e114b4e778ff94aca4e5dad8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15575, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3833, status=ACTIVE, subnets=['17f91002-6186-43f7-bdd4-354cdf443dac'], tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:05Z, vlan_transparent=None, network_id=fd32aaea-98e9-4dfa-ad52-36d30939560e, port_security_enabled=False, project_id=0a47e12e114b4e778ff94aca4e5dad8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3841, status=DOWN, tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:08Z on network fd32aaea-98e9-4dfa-ad52-36d30939560e
Feb 20 10:03:08 np0005625204.localdomain dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 1 addresses
Feb 20 10:03:08 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host
Feb 20 10:03:08 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts
Feb 20 10:03:08 np0005625204.localdomain podman[323429]: 2026-02-20 10:03:08.649347238 +0000 UTC m=+0.070408529 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:03:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:08.717 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:08 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:08.945 264355 INFO neutron.agent.dhcp.agent [None req-227c1349-320a-4a90-897d-8015e4b38232 - - - - - -] DHCP configuration for ports {'96452e5e-5c94-4f20-aee2-527568a64031'} is completed
Feb 20 10:03:09 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:09.110 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:08Z, description=, device_id=91bb17b9-dbc2-4da3-ba37-b6215b1cc229, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a6f490>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a6fd60>], id=96452e5e-5c94-4f20-aee2-527568a64031, ip_allocation=immediate, mac_address=fa:16:3e:5f:7d:2f, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:05Z, description=, dns_domain=, id=fd32aaea-98e9-4dfa-ad52-36d30939560e, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIMysqlTest-664160433-network, port_security_enabled=True, project_id=0a47e12e114b4e778ff94aca4e5dad8b, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=15575, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3833, status=ACTIVE, subnets=['17f91002-6186-43f7-bdd4-354cdf443dac'], tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:05Z, vlan_transparent=None, network_id=fd32aaea-98e9-4dfa-ad52-36d30939560e, port_security_enabled=False, project_id=0a47e12e114b4e778ff94aca4e5dad8b, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3841, status=DOWN, tags=[], tenant_id=0a47e12e114b4e778ff94aca4e5dad8b, updated_at=2026-02-20T10:03:08Z on network fd32aaea-98e9-4dfa-ad52-36d30939560e
Feb 20 10:03:09 np0005625204.localdomain ceph-mon[301857]: pgmap v620: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 117 KiB/s wr, 8 op/s
Feb 20 10:03:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "7729a724-86fe-4d43-a1c1-675788136bbb_64b5f976-f108-4782-8a7f-3fdbde23df86", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "7729a724-86fe-4d43-a1c1-675788136bbb", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:09 np0005625204.localdomain ceph-mon[301857]: osdmap e267: 6 total, 6 up, 6 in
Feb 20 10:03:09 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:09 np0005625204.localdomain systemd[1]: tmp-crun.IeLgkq.mount: Deactivated successfully.
Feb 20 10:03:09 np0005625204.localdomain dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 1 addresses
Feb 20 10:03:09 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host
Feb 20 10:03:09 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts
Feb 20 10:03:09 np0005625204.localdomain podman[323466]: 2026-02-20 10:03:09.317110142 +0000 UTC m=+0.062107136 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 10:03:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:03:09 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:03:09 np0005625204.localdomain podman[323479]: 2026-02-20 10:03:09.447671796 +0000 UTC m=+0.093559506 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.openshift.expose-services=, version=9.7, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_id=openstack_network_exporter)
Feb 20 10:03:09 np0005625204.localdomain podman[323479]: 2026-02-20 10:03:09.48646511 +0000 UTC m=+0.132352790 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-02-05T04:57:10Z, managed_by=edpm_ansible, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, 
architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, version=9.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1770267347, io.buildah.version=1.33.7, distribution-scope=public, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 10:03:09 np0005625204.localdomain podman[323480]: 2026-02-20 10:03:09.503317674 +0000 UTC m=+0.147783560 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:03:09 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:03:09 np0005625204.localdomain podman[323480]: 2026-02-20 10:03:09.535739883 +0000 UTC m=+0.180205789 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 10:03:09 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:03:09 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:09.587 264355 INFO neutron.agent.dhcp.agent [None req-8697ac76-559c-4a0f-b2d5-736492b4e829 - - - - - -] DHCP configuration for ports {'96452e5e-5c94-4f20-aee2-527568a64031'} is completed
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.805 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.806 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.806 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:03:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:09.807 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:03:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c93903cf-3015-40b5-a970-9c042e7db919", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:10 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c93903cf-3015-40b5-a970-9c042e7db919", "format": "json"}]: dispatch
Feb 20 10:03:10 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2808071945' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:11 np0005625204.localdomain ceph-mon[301857]: pgmap v622: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 549 B/s rd, 45 KiB/s wr, 3 op/s
Feb 20 10:03:11 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3652150787' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:03:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:11.834 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:03:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:11.865 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:03:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:11.865 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:03:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:11.866 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:03:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:11.866 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:03:12 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e268 e268: 6 total, 6 up, 6 in
Feb 20 10:03:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:12.969 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:12.975 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:13 np0005625204.localdomain ceph-mon[301857]: pgmap v623: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 480 B/s rd, 80 KiB/s wr, 5 op/s
Feb 20 10:03:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c93903cf-3015-40b5-a970-9c042e7db919", "format": "json"}]: dispatch
Feb 20 10:03:13 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c93903cf-3015-40b5-a970-9c042e7db919", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:13 np0005625204.localdomain ceph-mon[301857]: osdmap e268: 6 total, 6 up, 6 in
Feb 20 10:03:13 np0005625204.localdomain dnsmasq[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/addn_hosts - 0 addresses
Feb 20 10:03:13 np0005625204.localdomain podman[323544]: 2026-02-20 10:03:13.452805586 +0000 UTC m=+0.051874233 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 20 10:03:13 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/host
Feb 20 10:03:13 np0005625204.localdomain dnsmasq-dhcp[323412]: read /var/lib/neutron/dhcp/fd32aaea-98e9-4dfa-ad52-36d30939560e/opts
Feb 20 10:03:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:13Z|00420|binding|INFO|Releasing lport 736aea51-8061-4aa8-b593-31839f1f2534 from this chassis (sb_readonly=0)
Feb 20 10:03:13 np0005625204.localdomain kernel: device tap736aea51-80 left promiscuous mode
Feb 20 10:03:13 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:13Z|00421|binding|INFO|Setting lport 736aea51-8061-4aa8-b593-31839f1f2534 down in Southbound
Feb 20 10:03:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:13.951 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:13.964 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd32aaea-98e9-4dfa-ad52-36d30939560e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a47e12e114b4e778ff94aca4e5dad8b', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=88fdeb8d-ebba-4734-9a0b-e4eba3120811, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=736aea51-8061-4aa8-b593-31839f1f2534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:03:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:13.966 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 736aea51-8061-4aa8-b593-31839f1f2534 in datapath fd32aaea-98e9-4dfa-ad52-36d30939560e unbound from our chassis
Feb 20 10:03:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:13.968 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd32aaea-98e9-4dfa-ad52-36d30939560e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:03:13 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:13.969 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[1cefd803-5a48-4c8a-a809-7f0518704e6f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:03:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:13.973 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "73ef4b22-cb69-44e4-9b94-352c732420be_2d5a5261-1879-492b-a5b5-9562795ecfa9", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:14 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "snap_name": "73ef4b22-cb69-44e4-9b94-352c732420be", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:15 np0005625204.localdomain ceph-mon[301857]: pgmap v625: 177 pgs: 177 active+clean; 217 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s wr, 2 op/s
Feb 20 10:03:15 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:15Z|00422|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:03:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:15.729 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "949fccd6-b67a-49fe-87fe-ecca6c52f167", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "949fccd6-b67a-49fe-87fe-ecca6c52f167", "format": "json"}]: dispatch
Feb 20 10:03:16 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:16 np0005625204.localdomain dnsmasq[323412]: exiting on receipt of SIGTERM
Feb 20 10:03:16 np0005625204.localdomain podman[323585]: 2026-02-20 10:03:16.332540671 +0000 UTC m=+0.050443990 container kill 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 10:03:16 np0005625204.localdomain systemd[1]: libpod-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf.scope: Deactivated successfully.
Feb 20 10:03:16 np0005625204.localdomain podman[323599]: 2026-02-20 10:03:16.403543428 +0000 UTC m=+0.051676008 container died 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 20 10:03:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf-userdata-shm.mount: Deactivated successfully.
Feb 20 10:03:16 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-febc23c981ad25c134ca1cbca59507084ea6219789b24f92013e13e04132bbc8-merged.mount: Deactivated successfully.
Feb 20 10:03:16 np0005625204.localdomain podman[323599]: 2026-02-20 10:03:16.440260618 +0000 UTC m=+0.088393188 container remove 31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd32aaea-98e9-4dfa-ad52-36d30939560e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 20 10:03:16 np0005625204.localdomain systemd[1]: libpod-conmon-31e701782881d7a92e5b7acf710ffb8b1cc6373f39fc54db64ae7ba98d945fbf.scope: Deactivated successfully.
Feb 20 10:03:16 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2dfd32aaea\x2d98e9\x2d4dfa\x2dad52\x2d36d30939560e.mount: Deactivated successfully.
Feb 20 10:03:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:16.473 264355 INFO neutron.agent.dhcp.agent [None req-fa505e1f-dc27-493d-91a7-862d9d01285a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:03:16 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:16.473 264355 INFO neutron.agent.dhcp.agent [None req-fa505e1f-dc27-493d-91a7-862d9d01285a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:03:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:17 np0005625204.localdomain ceph-mon[301857]: pgmap v626: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 97 KiB/s wr, 6 op/s
Feb 20 10:03:17 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e269 e269: 6 total, 6 up, 6 in
Feb 20 10:03:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:03:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:03:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:03:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:03:17 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:17.743 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 10:03:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:03:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18351 "" "Go-http-client/1.1"
Feb 20 10:03:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:18.010 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:03:18 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:03:18 np0005625204.localdomain podman[323623]: 2026-02-20 10:03:18.159328849 +0000 UTC m=+0.098158957 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 10:03:18 np0005625204.localdomain podman[323623]: 2026-02-20 10:03:18.212012536 +0000 UTC m=+0.150842594 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 20 10:03:18 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e270 e270: 6 total, 6 up, 6 in
Feb 20 10:03:18 np0005625204.localdomain systemd[1]: tmp-crun.7x7yGd.mount: Deactivated successfully.
Feb 20 10:03:18 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:03:18 np0005625204.localdomain podman[323624]: 2026-02-20 10:03:18.229802719 +0000 UTC m=+0.164656745 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 20 10:03:18 np0005625204.localdomain podman[323624]: 2026-02-20 10:03:18.265427506 +0000 UTC m=+0.200281522 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 10:03:18 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.322 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.323 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.324 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.329 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '702c5655-bec4-4378-82ee-bc0ccec7c4bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.324176', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '614e0c82-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '41b36c66105151bfab5a0726ed4b087b84dfa11439cdf84af9425ea6d6884788'}]}, 'timestamp': '2026-02-20 10:03:18.330272', '_unique_id': 'e595ff8bf216475aa1471b7e1a644149'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.332 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.333 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.355 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a567cfa8-5142-40d5-8400-c52cb6cb74bb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:03:18.334088', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': '61521502-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.594674855, 'message_signature': '05e3ba206579408ccc88daad9f27719333fa847ec0361cfb695cb23e1598e153'}]}, 'timestamp': '2026-02-20 10:03:18.356744', '_unique_id': 'ce6f8791c6964841992e5343f7d60a28'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.358 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "format": "json"}]: dispatch
Feb 20 10:03:18 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "59be1d8c-08e1-4dc1-89a5-72e5d1dcad7d", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:18 np0005625204.localdomain ceph-mon[301857]: osdmap e269: 6 total, 6 up, 6 in
Feb 20 10:03:18 np0005625204.localdomain ceph-mon[301857]: osdmap e270: 6 total, 6 up, 6 in
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.394 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.395 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e38618-2f01-4190-abe2-7c8e3fb63b95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.360016', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615800a2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '72a34631e2e96741f340407ab1ed7c7c73f83bbd7dc5ff5661df2d263d813f22'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.360016', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615817f4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '3955404c7f61e8b190069987391f9b35b0d2548145fbf3e731c04845bb099041'}]}, 'timestamp': '2026-02-20 10:03:18.396006', '_unique_id': 'f34a009c24724655a5890b9fe5867019'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.397 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.399 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.410 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.411 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59d7f5c5-30ad-42b0-9fef-4580968b5273', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.399342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615a713e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': '65424c96d37bfa61c6d761bb8c8fe9adf7326266c359ac85a28d1f1747af41ed'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.399342', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615a82aa-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': 'eede08b5d9574db5b0ee3e1eb945a20cf458d905213b69c0ab3e368ef78bbe78'}]}, 'timestamp': '2026-02-20 10:03:18.411788', '_unique_id': '869278872f6e4d3b96c3e5d7c778e4df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.413 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.414 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.415 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4972adcd-d0f7-43e5-9430-890eeee6d117', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.414499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615b052c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': '7e98f2203ab3082462e5216b4a2da961b9c59b1919c0d1db3d23a6de808ae606'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.414499', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615b1a12-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': 'cdc8fed0f3eb9860d36dc4dc04192aa3872ddeace97b83b826b5e816d576fb1e'}]}, 'timestamp': '2026-02-20 10:03:18.415667', '_unique_id': 'c72f9b606e0d4b79a56a2da6cce24c90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.418 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.418 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef75ff5a-0339-4c90-87bf-7a259070333a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.418315', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615b95aa-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '8acd775f90acc89a025e1b5b692c2cc532c0cb7b72da854d99287f479ca1738f'}]}, 'timestamp': '2026-02-20 10:03:18.418888', '_unique_id': '9bf4afeeaab647789981ac9d70df32de'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.420 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.421 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.421 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d73f51e-aaf8-4b8c-adf4-59cf1810d00f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.421869', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615c1fa2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': 'ca789ca7422d532a43a11ba9caa5d8d1e079398109a0d63ef452318a65dee0cd'}]}, 'timestamp': '2026-02-20 10:03:18.422350', '_unique_id': '9a875e2e41cd42d5a9419e5657822387'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.423 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.424 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.425 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daa3636c-bfd5-4791-9e31-4eff6923e5cc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.424748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615c91e4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': '5e40d349ffb559cd04394865bb739f5e2e9360d828ea7943bc453ba0dffa8de2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.424748', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615ca36e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.638597934, 'message_signature': 'eec9087cedf3e84fc57c888d52af1276ac3a0abad5b68d61b7eb9c62c72534ed'}]}, 'timestamp': '2026-02-20 10:03:18.425724', '_unique_id': '6340bf86769c4192bd77fd12e67c663b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.426 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.428 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a3b5c40-0fc5-4c4a-be72-47b4ac9257e5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.428266', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615d19a2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '0abab6b30c8d8713e7f430691754f0f1641d502fff37d7f0ddaf05654390b248'}]}, 'timestamp': '2026-02-20 10:03:18.428789', '_unique_id': 'b85d4a2d678b474a91d5aa92017b84be'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.429 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.430 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.430 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd34adab-3c89-4c14-9cd1-3a448df0c562', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.430929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615d811c-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'e6b4898baec9da6696b5d87ecb2243b4c29a512290d643563650b8bcd95dd3a1'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.430929', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615d922e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '21a5478d68dc783ab09914790578d79297cc555fe97f92c0cf1fab4486c0f440'}]}, 'timestamp': '2026-02-20 10:03:18.431854', '_unique_id': '3b879691680b4d059e0e599e40b9bc9e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.432 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.433 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.434 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88249f59-6a2e-4d8a-9947-eba10480b2d9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.434258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615e0470-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'd4733aec1d9eba5a5b1450cb82202b5fadae798b65d77e3aa19e75fbf60254f3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.434258', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615e18b6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '7cc443dfc06e2b791e219613d718b336619e666d9da6567acf9a2dd49ab79624'}]}, 'timestamp': '2026-02-20 10:03:18.435249', '_unique_id': '11a8f9afe7894172af94b7eb86d65701'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.437 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b757f67-f1f5-47fc-8cfb-e672d35e1e95', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.437509', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615e83d2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': 'a7279612f286b56a4d2170cc72e38f41440c03c270c0809cc7bec0da036dd183'}]}, 'timestamp': '2026-02-20 10:03:18.438066', '_unique_id': '5e22831bb09045d4940386d1242c0c6a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.439 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ede0b88c-d828-49f6-ae99-6c8c8c2bf588', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.440474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '615ef7ea-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'c088c47c56bb13117e2173d9e45b09470c314b97aeac049370ccb66b384b3d76'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.440474', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '615f08d4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'b99741234e6f2c01a08b6b10b802c805f9c90b7924b8e7a9952a281c1968542c'}]}, 'timestamp': '2026-02-20 10:03:18.441391', '_unique_id': '4bc8984b05474964ac9331876a158d87'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.442 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.443 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.443 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '198c6523-bc7c-4517-9680-a60ffe97530b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.443583', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615f7120-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '01366cae0aaa048058f5562f339f37a16843bb768295e38e5c6cec237992048f'}]}, 'timestamp': '2026-02-20 10:03:18.444127', '_unique_id': 'c7588b7aa632416ca538e961aba42d35'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.445 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.446 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.446 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3c8ef41b-6add-489c-a35d-354528167a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.446486', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '615fe240-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '05ee99959dc8695a83d4bbe7d0a2d96bf2a6f70aad6ba15d052cc46312927152'}]}, 'timestamp': '2026-02-20 10:03:18.446996', '_unique_id': '235f8704b1b843dca50430940faeebac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.447 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.449 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e6090d9a-d4e9-40fe-aee5-e6d5e662deeb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.449275', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '61604c94-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '1b3c822981a5ed36f23b2070c668f3e8f23676dbdb27657dac1bfcc5ca784ece'}]}, 'timestamp': '2026-02-20 10:03:18.449716', '_unique_id': 'de3ecf2cdb2d4fffb485efdf0f0e3c6c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.450 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.451 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.451 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.451 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1381914-1d71-4164-aaac-b35650146146', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.451185', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6160951e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': 'a7c68bf53eccadd0c70cbe24afd2638f55f39d06434faa8f1ab7e057a17f4bad'}]}, 'timestamp': '2026-02-20 10:03:18.451480', '_unique_id': '63d10638e7ec4ef09533a1d4df1e67c6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.452 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '777dfd6c-7b7f-4821-b8d4-cb155d321a24', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:03:18.453044', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': '6160ddbc-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.56339943, 'message_signature': '60f8059c3a265aa2686bd5183708015f83abcaf4ed7ddd350acebbe9b2d0fc58'}]}, 'timestamp': '2026-02-20 10:03:18.453338', '_unique_id': '771b01dd4a174a0fbee8f217de9062cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.453 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.454 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.454 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6e73b26-29c4-4943-ad09-9413b4645149', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.454745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6161202e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': 'a72c4eb45fc1b94136b9928659a1ed9a56aed8ee1b16fa2b607db1cb07335aa4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.454745', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '61612af6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '47cbb9c15d58381e666561742218fdc9102f1c11d8d2428ac61780dd8de6bb36'}]}, 'timestamp': '2026-02-20 10:03:18.455296', '_unique_id': '1d68fba5c4294c409d87825d2543f653'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.455 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.456 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.456 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 20130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b048b44-6563-4583-8e2e-57174a74924c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20130000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:03:18.456733', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': '61616e8a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.594674855, 'message_signature': '1ac0fca5e1b8d7768cdd047fae873ebea9d91e4a4c79e801671e0f32985dd274'}]}, 'timestamp': '2026-02-20 10:03:18.457049', '_unique_id': '40001bbf432543428ab60b137cc5d1bc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.457 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.458 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.458 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.458 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7431d514-41eb-4d0d-b976-a0f1c2a4bc0e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:03:18.458423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': '6161b1a6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '84da3c2c7d49d75016ffdd4ac4b16960e5186e88e06998043d335f991445379e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:03:18.458423', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': '6161bfde-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12357.599257184, 'message_signature': '125735c0c6e700cba44d13ddbc960d1e7586cfb74b6ce936114c8bb0ffe604fc'}]}, 'timestamp': '2026-02-20 10:03:18.459112', '_unique_id': '5e196f9b3b464ed69bb52384863c934b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.459 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:03:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:03:18.460 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:03:19 np0005625204.localdomain ceph-mon[301857]: pgmap v628: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 97 KiB/s wr, 6 op/s
Feb 20 10:03:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "949fccd6-b67a-49fe-87fe-ecca6c52f167", "format": "json"}]: dispatch
Feb 20 10:03:20 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "949fccd6-b67a-49fe-87fe-ecca6c52f167", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:21 np0005625204.localdomain ceph-mon[301857]: pgmap v630: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 549 B/s rd, 58 KiB/s wr, 4 op/s
Feb 20 10:03:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e270 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:22 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:23.012 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:23.014 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:23.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:03:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:23.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:23.046 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:23.047 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:23 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 e271: 6 total, 6 up, 6 in
Feb 20 10:03:23 np0005625204.localdomain ceph-mon[301857]: pgmap v631: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 131 KiB/s wr, 7 op/s
Feb 20 10:03:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d3ef7044-f1cf-4b2a-bbcd-13fc0eea0006", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:23 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d3ef7044-f1cf-4b2a-bbcd-13fc0eea0006", "format": "json"}]: dispatch
Feb 20 10:03:23 np0005625204.localdomain ceph-mon[301857]: osdmap e271: 6 total, 6 up, 6 in
Feb 20 10:03:25 np0005625204.localdomain ceph-mon[301857]: pgmap v633: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 642 B/s rd, 97 KiB/s wr, 5 op/s
Feb 20 10:03:25 np0005625204.localdomain ceph-mon[301857]: mgrmap e55: np0005625202.arwxwo(active, since 15m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 10:03:26 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:03:26 np0005625204.localdomain systemd[1]: tmp-crun.3jxZ8N.mount: Deactivated successfully.
Feb 20 10:03:26 np0005625204.localdomain podman[323666]: 2026-02-20 10:03:26.15878203 +0000 UTC m=+0.098164866 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 10:03:26 np0005625204.localdomain podman[323666]: 2026-02-20 10:03:26.172151609 +0000 UTC m=+0.111534435 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:03:26 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:03:26 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d3ef7044-f1cf-4b2a-bbcd-13fc0eea0006", "format": "json"}]: dispatch
Feb 20 10:03:26 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d3ef7044-f1cf-4b2a-bbcd-13fc0eea0006", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:03:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:03:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:03:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:03:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:03:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:03:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:27 np0005625204.localdomain ceph-mon[301857]: pgmap v634: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 97 KiB/s wr, 6 op/s
Feb 20 10:03:27 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:28.048 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:28.049 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:28.049 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:03:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:28.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:28.050 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:28.055 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:28 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:28 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "format": "json"}]: dispatch
Feb 20 10:03:28 np0005625204.localdomain ceph-mon[301857]: pgmap v635: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 645 B/s rd, 81 KiB/s wr, 5 op/s
Feb 20 10:03:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "29715001-dc5b-4019-b356-72b85ec77e38", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "29715001-dc5b-4019-b356-72b85ec77e38", "format": "json"}]: dispatch
Feb 20 10:03:29 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:30 np0005625204.localdomain ceph-mon[301857]: pgmap v636: 177 pgs: 177 active+clean; 218 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 78 KiB/s wr, 5 op/s
Feb 20 10:03:30 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "snap_name": "16d51dc9-4db6-4a28-af31-2025dc25f7ce", "format": "json"}]: dispatch
Feb 20 10:03:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:32 np0005625204.localdomain ceph-mon[301857]: pgmap v637: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 67 KiB/s wr, 4 op/s
Feb 20 10:03:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "29715001-dc5b-4019-b356-72b85ec77e38", "format": "json"}]: dispatch
Feb 20 10:03:32 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "29715001-dc5b-4019-b356-72b85ec77e38", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:33.054 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:33.055 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:33.056 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:03:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:33.056 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:33 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:03:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:33.074 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:33.074 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:33 np0005625204.localdomain podman[323685]: 2026-02-20 10:03:33.166786412 +0000 UTC m=+0.080828527 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:03:33 np0005625204.localdomain podman[323685]: 2026-02-20 10:03:33.202873353 +0000 UTC m=+0.116915428 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 10:03:33 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:03:33 np0005625204.localdomain sshd[323708]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:03:34 np0005625204.localdomain auditd[725]: Audit daemon rotating log files
Feb 20 10:03:34 np0005625204.localdomain sshd[323708]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:03:35 np0005625204.localdomain ceph-mon[301857]: pgmap v638: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 389 B/s rd, 64 KiB/s wr, 4 op/s
Feb 20 10:03:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "snap_name": "16d51dc9-4db6-4a28-af31-2025dc25f7ce_1e999bb9-0d1d-43f6-a0e7-cfabc45a8c84", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:35 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "snap_name": "16d51dc9-4db6-4a28-af31-2025dc25f7ce", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4cb8de8b-c2e5-47d8-bfa6-1319d78b7e6c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 20 10:03:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4cb8de8b-c2e5-47d8-bfa6-1319d78b7e6c", "format": "json"}]: dispatch
Feb 20 10:03:36 np0005625204.localdomain ceph-mon[301857]: from='client.25631 172.18.0.34:0/3503108328' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 20 10:03:36 np0005625204.localdomain sshd[323711]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:03:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:37 np0005625204.localdomain ceph-mon[301857]: pgmap v639: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 96 KiB/s wr, 5 op/s
Feb 20 10:03:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e272 e272: 6 total, 6 up, 6 in
Feb 20 10:03:37 np0005625204.localdomain sshd[323711]: Invalid user sol from 45.148.10.240 port 57676
Feb 20 10:03:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:38.075 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:38.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:38.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:03:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:38.077 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:38.124 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:38.125 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:03:38 np0005625204.localdomain sshd[323711]: Connection closed by invalid user sol 45.148.10.240 port 57676 [preauth]
Feb 20 10:03:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "format": "json"}]: dispatch
Feb 20 10:03:38 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7e87e16b-7877-4c1f-b090-d18ae7e41b0b", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:38 np0005625204.localdomain ceph-mon[301857]: osdmap e272: 6 total, 6 up, 6 in
Feb 20 10:03:39 np0005625204.localdomain ceph-mon[301857]: pgmap v641: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 99 KiB/s wr, 5 op/s
Feb 20 10:03:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:03:40 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:03:40 np0005625204.localdomain podman[323713]: 2026-02-20 10:03:40.1484618 +0000 UTC m=+0.086887952 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1770267347, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=9.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.33.7, vendor=Red Hat, Inc., config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, build-date=2026-02-05T04:57:10Z, org.opencontainers.image.created=2026-02-05T04:57:10Z)
Feb 20 10:03:40 np0005625204.localdomain podman[323713]: 2026-02-20 10:03:40.161692014 +0000 UTC m=+0.100118186 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, managed_by=edpm_ansible, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, vendor=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vcs-type=git, architecture=x86_64, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 10:03:40 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:03:40 np0005625204.localdomain systemd[1]: tmp-crun.67b3Ka.mount: Deactivated successfully.
Feb 20 10:03:40 np0005625204.localdomain podman[323714]: 2026-02-20 10:03:40.209234835 +0000 UTC m=+0.145706457 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 10:03:40 np0005625204.localdomain podman[323714]: 2026-02-20 10:03:40.241479838 +0000 UTC m=+0.177951460 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:03:40 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:03:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4cb8de8b-c2e5-47d8-bfa6-1319d78b7e6c", "format": "json"}]: dispatch
Feb 20 10:03:40 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4cb8de8b-c2e5-47d8-bfa6-1319d78b7e6c", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:41 np0005625204.localdomain ceph-mon[301857]: pgmap v642: 177 pgs: 177 active+clean; 219 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 99 KiB/s wr, 5 op/s
Feb 20 10:03:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:43.127 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:43 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e273 e273: 6 total, 6 up, 6 in
Feb 20 10:03:43 np0005625204.localdomain ceph-mon[301857]: pgmap v643: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 108 KiB/s wr, 5 op/s
Feb 20 10:03:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ccc69125-8271-465f-a7cf-99b18598188c", "snap_name": "7873ec63-b44a-47a7-8bbe-8f944c5b9a9d_f5381e37-8883-4e7e-9251-e9f21c220e6c", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:43 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "ccc69125-8271-465f-a7cf-99b18598188c", "snap_name": "7873ec63-b44a-47a7-8bbe-8f944c5b9a9d", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:43 np0005625204.localdomain ceph-mon[301857]: osdmap e273: 6 total, 6 up, 6 in
Feb 20 10:03:45 np0005625204.localdomain ceph-mon[301857]: pgmap v645: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 75 KiB/s wr, 4 op/s
Feb 20 10:03:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ccc69125-8271-465f-a7cf-99b18598188c", "format": "json"}]: dispatch
Feb 20 10:03:47 np0005625204.localdomain ceph-mon[301857]: from='client.25631 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ccc69125-8271-465f-a7cf-99b18598188c", "force": true, "format": "json"}]: dispatch
Feb 20 10:03:47 np0005625204.localdomain ceph-mon[301857]: pgmap v646: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 969 B/s rd, 100 KiB/s wr, 7 op/s
Feb 20 10:03:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e274 e274: 6 total, 6 up, 6 in
Feb 20 10:03:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:47.633 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': 'c2:c2:14', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '46:d0:69:88:97:47'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:03:47 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:47.634 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 20 10:03:47 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:47.673 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:03:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:03:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:03:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:03:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:03:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18357 "" "Go-http-client/1.1"
Feb 20 10:03:48 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:48.045 264355 INFO neutron.agent.linux.ip_lib [None req-cbf568c4-6016-453f-a7ca-c6d359eb2604 - - - - - -] Device tap579a4ffd-e1 cannot be used as it has no MAC address
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.076 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:48 np0005625204.localdomain kernel: device tap579a4ffd-e1 entered promiscuous mode
Feb 20 10:03:48 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581828.0864] manager: (tap579a4ffd-e1): new Generic device (/org/freedesktop/NetworkManager/Devices/65)
Feb 20 10:03:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:48Z|00423|binding|INFO|Claiming lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 for this chassis.
Feb 20 10:03:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:48Z|00424|binding|INFO|579a4ffd-e114-452c-9e7e-536d8c8914c1: Claiming unknown
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.089 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:48 np0005625204.localdomain systemd-udevd[323766]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 10:03:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:48.099 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb651850cad14d76bf9ffb2d11fd8747', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9249ec89-4893-4da0-9067-aa2693e86932, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=579a4ffd-e114-452c-9e7e-536d8c8914c1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:03:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:48.101 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 579a4ffd-e114-452c-9e7e-536d8c8914c1 in datapath 7e429db3-d8ae-4504-93c9-b3bbe8cdc007 bound to our chassis
Feb 20 10:03:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:48.103 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port b6450798-e8b9-4b7d-a648-348e62ed609f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 10:03:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:48.104 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:03:48 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:48.105 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[d0dedf49-2b5d-4579-9812-70bf0ac86481]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:03:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:48Z|00425|binding|INFO|Setting lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 ovn-installed in OVS
Feb 20 10:03:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:48Z|00426|binding|INFO|Setting lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 up in Southbound
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.132 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain virtnodedevd[229984]: ethtool ioctl error on tap579a4ffd-e1: No such device
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.174 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.207 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:48 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:03:48Z|00427|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:03:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:48.261 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:48 np0005625204.localdomain ceph-mon[301857]: osdmap e274: 6 total, 6 up, 6 in
Feb 20 10:03:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:03:49 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:03:49 np0005625204.localdomain podman[323831]: 2026-02-20 10:03:49.168585175 +0000 UTC m=+0.097503536 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 10:03:49 np0005625204.localdomain podman[323848]: 
Feb 20 10:03:49 np0005625204.localdomain podman[323831]: 2026-02-20 10:03:49.205288075 +0000 UTC m=+0.134206436 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 20 10:03:49 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:03:49 np0005625204.localdomain podman[323830]: 2026-02-20 10:03:49.217692153 +0000 UTC m=+0.150340258 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 10:03:49 np0005625204.localdomain podman[323848]: 2026-02-20 10:03:49.24349936 +0000 UTC m=+0.141440376 container create c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 20 10:03:49 np0005625204.localdomain podman[323848]: 2026-02-20 10:03:49.153200225 +0000 UTC m=+0.051141231 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 10:03:49 np0005625204.localdomain systemd[1]: Started libpod-conmon-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023.scope.
Feb 20 10:03:49 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 10:03:49 np0005625204.localdomain podman[323830]: 2026-02-20 10:03:49.330918978 +0000 UTC m=+0.263567153 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 10:03:49 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/77fa0a05beec35518c693185a2da6b468c236c000516b316ebb93a07ea0c1c2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 10:03:49 np0005625204.localdomain podman[323848]: 2026-02-20 10:03:49.342222193 +0000 UTC m=+0.240163209 container init c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:03:49 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:03:49 np0005625204.localdomain podman[323848]: 2026-02-20 10:03:49.352609659 +0000 UTC m=+0.250550675 container start c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:03:49 np0005625204.localdomain dnsmasq[323894]: started, version 2.85 cachesize 150
Feb 20 10:03:49 np0005625204.localdomain dnsmasq[323894]: DNS service limited to local subnets
Feb 20 10:03:49 np0005625204.localdomain dnsmasq[323894]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 10:03:49 np0005625204.localdomain dnsmasq[323894]: warning: no upstream servers configured
Feb 20 10:03:49 np0005625204.localdomain dnsmasq-dhcp[323894]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 10:03:49 np0005625204.localdomain dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 0 addresses
Feb 20 10:03:49 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host
Feb 20 10:03:49 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts
Feb 20 10:03:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:49.413 264355 INFO neutron.agent.dhcp.agent [None req-b4eb6337-31e1-438e-8f4b-281f20c8fb2e - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:48Z, description=, device_id=589aa915-f12b-4442-ae0f-97795f57950f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df57d4970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df57d4340>], id=16cd30e7-a2a9-4140-b0a6-49e3b9576f4a, ip_allocation=immediate, mac_address=fa:16:3e:b6:56:5b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:45Z, description=, dns_domain=, id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-194178913-network, port_security_enabled=True, project_id=bb651850cad14d76bf9ffb2d11fd8747, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35672, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3892, status=ACTIVE, subnets=['ded50358-efae-40fd-8857-2510da215a44'], tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:46Z, vlan_transparent=None, network_id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, port_security_enabled=False, project_id=bb651850cad14d76bf9ffb2d11fd8747, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3900, status=DOWN, tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:48Z on network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007
Feb 20 10:03:49 np0005625204.localdomain ceph-mon[301857]: pgmap v648: 177 pgs: 177 active+clean; 220 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 105 KiB/s wr, 6 op/s
Feb 20 10:03:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:49.521 264355 INFO neutron.agent.dhcp.agent [None req-384d23a2-7f09-4bd3-af9f-73ce37256037 - - - - - -] DHCP configuration for ports {'ce127046-9f8a-4325-9a56-0ecbb6db009d'} is completed
Feb 20 10:03:49 np0005625204.localdomain dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 1 addresses
Feb 20 10:03:49 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host
Feb 20 10:03:49 np0005625204.localdomain podman[323911]: 2026-02-20 10:03:49.640517554 +0000 UTC m=+0.060591800 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 20 10:03:49 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts
Feb 20 10:03:49 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:49.882 264355 INFO neutron.agent.dhcp.agent [None req-cd70430f-aa50-4275-9258-a481487fc887 - - - - - -] DHCP configuration for ports {'16cd30e7-a2a9-4140-b0a6-49e3b9576f4a'} is completed
Feb 20 10:03:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:50.127 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:03:48Z, description=, device_id=589aa915-f12b-4442-ae0f-97795f57950f, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df626c0d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df5a9afd0>], id=16cd30e7-a2a9-4140-b0a6-49e3b9576f4a, ip_allocation=immediate, mac_address=fa:16:3e:b6:56:5b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:03:45Z, description=, dns_domain=, id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-194178913-network, port_security_enabled=True, project_id=bb651850cad14d76bf9ffb2d11fd8747, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=35672, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3892, status=ACTIVE, subnets=['ded50358-efae-40fd-8857-2510da215a44'], tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:46Z, vlan_transparent=None, network_id=7e429db3-d8ae-4504-93c9-b3bbe8cdc007, port_security_enabled=False, project_id=bb651850cad14d76bf9ffb2d11fd8747, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3900, status=DOWN, tags=[], tenant_id=bb651850cad14d76bf9ffb2d11fd8747, updated_at=2026-02-20T10:03:48Z on network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007
Feb 20 10:03:50 np0005625204.localdomain dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 1 addresses
Feb 20 10:03:50 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host
Feb 20 10:03:50 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts
Feb 20 10:03:50 np0005625204.localdomain podman[323947]: 2026-02-20 10:03:50.349903608 +0000 UTC m=+0.064551951 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:03:50 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:03:50.645 264355 INFO neutron.agent.dhcp.agent [None req-65891de8-6f13-46bf-ac3a-0dfd19d5bb62 - - - - - -] DHCP configuration for ports {'16cd30e7-a2a9-4140-b0a6-49e3b9576f4a'} is completed
Feb 20 10:03:51 np0005625204.localdomain ceph-mon[301857]: pgmap v649: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 31 KiB/s wr, 4 op/s
Feb 20 10:03:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:51 np0005625204.localdomain sudo[323969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:03:51 np0005625204.localdomain sudo[323969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:03:51 np0005625204.localdomain sudo[323969]: pam_unix(sudo:session): session closed for user root
Feb 20 10:03:51 np0005625204.localdomain sudo[323987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:03:51 np0005625204.localdomain sudo[323987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:03:52 np0005625204.localdomain sudo[323987]: pam_unix(sudo:session): session closed for user root
Feb 20 10:03:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:03:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:03:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:03:52 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:03:52 np0005625204.localdomain sudo[324037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:03:52 np0005625204.localdomain sudo[324037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:03:52 np0005625204.localdomain sudo[324037]: pam_unix(sudo:session): session closed for user root
Feb 20 10:03:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:53.135 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 e275: 6 total, 6 up, 6 in
Feb 20 10:03:53 np0005625204.localdomain ceph-mon[301857]: pgmap v650: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 722 B/s rd, 60 KiB/s wr, 5 op/s
Feb 20 10:03:53 np0005625204.localdomain ceph-mon[301857]: osdmap e275: 6 total, 6 up, 6 in
Feb 20 10:03:54 np0005625204.localdomain ceph-mon[301857]: pgmap v652: 177 pgs: 177 active+clean; 220 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 32 KiB/s wr, 1 op/s
Feb 20 10:03:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:03:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:03:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:03:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:03:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:03:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:03:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:03:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:03:57 np0005625204.localdomain ceph-mon[301857]: pgmap v653: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 246 B/s rd, 47 KiB/s wr, 2 op/s
Feb 20 10:03:57 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:03:57 np0005625204.localdomain systemd[1]: tmp-crun.qiMFqC.mount: Deactivated successfully.
Feb 20 10:03:57 np0005625204.localdomain podman[324055]: 2026-02-20 10:03:57.175717811 +0000 UTC m=+0.101424185 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 10:03:57 np0005625204.localdomain podman[324055]: 2026-02-20 10:03:57.215369881 +0000 UTC m=+0.141076215 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 20 10:03:57 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:03:57 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:03:57.636 162652 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=e6b84e4d-7dff-4c2c-96db-c41e3ef520c6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 20 10:03:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:58.138 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.253854) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838253916, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1193, "num_deletes": 265, "total_data_size": 1782068, "memory_usage": 1804352, "flush_reason": "Manual Compaction"}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838262677, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1169299, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31846, "largest_seqno": 33034, "table_properties": {"data_size": 1164271, "index_size": 2435, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12322, "raw_average_key_size": 20, "raw_value_size": 1153574, "raw_average_value_size": 1935, "num_data_blocks": 106, "num_entries": 596, "num_filter_entries": 596, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581781, "oldest_key_time": 1771581781, "file_creation_time": 1771581838, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 8865 microseconds, and 4209 cpu microseconds.
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.262728) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1169299 bytes OK
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.262754) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.264726) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.264746) EVENT_LOG_v1 {"time_micros": 1771581838264740, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.264770) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 1776068, prev total WAL file size 1776392, number of live WAL files 2.
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.265487) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323732' seq:72057594037927935, type:22 .. '6C6F676D0034353234' seq:0, type:0; will stop at (end)
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1141KB)], [48(18MB)]
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838265535, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 20659103, "oldest_snapshot_seqno": -1}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 14356 keys, 20442829 bytes, temperature: kUnknown
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838360503, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 20442829, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20358151, "index_size": 47713, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35909, "raw_key_size": 383773, "raw_average_key_size": 26, "raw_value_size": 20111830, "raw_average_value_size": 1400, "num_data_blocks": 1798, "num_entries": 14356, "num_filter_entries": 14356, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581838, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.360916) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 20442829 bytes
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.362735) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.2 rd, 214.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 18.6 +0.0 blob) out(19.5 +0.0 blob), read-write-amplify(35.2) write-amplify(17.5) OK, records in: 14904, records dropped: 548 output_compression: NoCompression
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.362764) EVENT_LOG_v1 {"time_micros": 1771581838362751, "job": 28, "event": "compaction_finished", "compaction_time_micros": 95133, "compaction_time_cpu_micros": 53506, "output_level": 6, "num_output_files": 1, "total_output_size": 20442829, "num_input_records": 14904, "num_output_records": 14356, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838363069, "job": 28, "event": "table_file_deletion", "file_number": 50}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581838365956, "job": 28, "event": "table_file_deletion", "file_number": 48}
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.265363) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366043) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:03:58.366050) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:03:59 np0005625204.localdomain ceph-mon[301857]: pgmap v654: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 39 KiB/s wr, 2 op/s
Feb 20 10:03:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:03:59.619 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:01 np0005625204.localdomain ceph-mon[301857]: pgmap v655: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 39 KiB/s wr, 1 op/s
Feb 20 10:04:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:01.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:02.240 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1304243398' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:04:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/1304243398' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:04:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:03.142 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:03 np0005625204.localdomain ceph-mon[301857]: pgmap v656: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Feb 20 10:04:04 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:04:04 np0005625204.localdomain podman[324074]: 2026-02-20 10:04:04.173839102 +0000 UTC m=+0.101684283 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 20 10:04:04 np0005625204.localdomain podman[324074]: 2026-02-20 10:04:04.213315456 +0000 UTC m=+0.141160577 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:04:04 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:04:04 np0005625204.localdomain dnsmasq[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/addn_hosts - 0 addresses
Feb 20 10:04:04 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/host
Feb 20 10:04:04 np0005625204.localdomain dnsmasq-dhcp[323894]: read /var/lib/neutron/dhcp/7e429db3-d8ae-4504-93c9-b3bbe8cdc007/opts
Feb 20 10:04:04 np0005625204.localdomain podman[324112]: 2026-02-20 10:04:04.296799564 +0000 UTC m=+0.063623913 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 10:04:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/244285166' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:04Z|00428|binding|INFO|Releasing lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 from this chassis (sb_readonly=0)
Feb 20 10:04:04 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:04Z|00429|binding|INFO|Setting lport 579a4ffd-e114-452c-9e7e-536d8c8914c1 down in Southbound
Feb 20 10:04:04 np0005625204.localdomain kernel: device tap579a4ffd-e1 left promiscuous mode
Feb 20 10:04:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:04.553 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:04.561 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7e429db3-d8ae-4504-93c9-b3bbe8cdc007', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb651850cad14d76bf9ffb2d11fd8747', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9249ec89-4893-4da0-9067-aa2693e86932, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=579a4ffd-e114-452c-9e7e-536d8c8914c1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:04:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:04.563 162652 INFO neutron.agent.ovn.metadata.agent [-] Port 579a4ffd-e114-452c-9e7e-536d8c8914c1 in datapath 7e429db3-d8ae-4504-93c9-b3bbe8cdc007 unbound from our chassis
Feb 20 10:04:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:04.566 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7e429db3-d8ae-4504-93c9-b3bbe8cdc007, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:04:04 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:04.567 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[45b8b7a7-3151-47f3-9df4-a4d89b144dbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:04:04 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:04.580 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:05.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:05 np0005625204.localdomain ceph-mon[301857]: pgmap v657: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Feb 20 10:04:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2811233635' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:04:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:04:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:06.029 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:04:06 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:06Z|00430|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:04:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:06.250 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:06.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:06 np0005625204.localdomain ceph-mon[301857]: pgmap v658: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 13 KiB/s wr, 0 op/s
Feb 20 10:04:07 np0005625204.localdomain dnsmasq[323894]: exiting on receipt of SIGTERM
Feb 20 10:04:07 np0005625204.localdomain systemd[1]: libpod-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023.scope: Deactivated successfully.
Feb 20 10:04:07 np0005625204.localdomain podman[324153]: 2026-02-20 10:04:07.487382192 +0000 UTC m=+0.067869332 container kill c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 10:04:07 np0005625204.localdomain podman[324167]: 2026-02-20 10:04:07.564907918 +0000 UTC m=+0.067735099 container died c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 20 10:04:07 np0005625204.localdomain podman[324167]: 2026-02-20 10:04:07.609018113 +0000 UTC m=+0.111845254 container cleanup c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:07 np0005625204.localdomain systemd[1]: libpod-conmon-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023.scope: Deactivated successfully.
Feb 20 10:04:07 np0005625204.localdomain podman[324174]: 2026-02-20 10:04:07.646220649 +0000 UTC m=+0.135217118 container remove c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7e429db3-d8ae-4504-93c9-b3bbe8cdc007, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 20 10:04:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:07.677 264355 INFO neutron.agent.dhcp.agent [None req-4bbc60d1-999e-4405-8002-325e8e505b6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:04:07 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:07.678 264355 INFO neutron.agent.dhcp.agent [None req-4bbc60d1-999e-4405-8002-325e8e505b6c - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.723 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.744 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.745 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:04:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:07.745 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:04:07 np0005625204.localdomain sshd[324198]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.177 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:08 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:04:08 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3556237270' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.199 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.272 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.273 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:04:08 np0005625204.localdomain systemd[1]: tmp-crun.j9DUt1.mount: Deactivated successfully.
Feb 20 10:04:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-77fa0a05beec35518c693185a2da6b468c236c000516b316ebb93a07ea0c1c2b-merged.mount: Deactivated successfully.
Feb 20 10:04:08 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c449b4851067df9a4e71db3b6c8280393430766bbd068d432daed81716696023-userdata-shm.mount: Deactivated successfully.
Feb 20 10:04:08 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d7e429db3\x2dd8ae\x2d4504\x2d93c9\x2db3bbe8cdc007.mount: Deactivated successfully.
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.508 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.511 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11204MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.511 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.512 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:04:08 np0005625204.localdomain sshd[324198]: Invalid user oracle from 196.189.116.182 port 48166
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.588 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.589 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.589 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:04:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:08.640 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:04:08 np0005625204.localdomain sshd[324198]: Received disconnect from 196.189.116.182 port 48166:11: Bye Bye [preauth]
Feb 20 10:04:08 np0005625204.localdomain sshd[324198]: Disconnected from invalid user oracle 196.189.116.182 port 48166 [preauth]
Feb 20 10:04:08 np0005625204.localdomain ceph-mon[301857]: pgmap v659: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s wr, 0 op/s
Feb 20 10:04:08 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3556237270' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:09 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:04:09 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3887188390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:09.067 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:04:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:09.074 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:04:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:09.095 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:04:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:09.097 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:04:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:09.098 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.586s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:04:09 np0005625204.localdomain sshd[324243]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:09 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3887188390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:09 np0005625204.localdomain sshd[324243]: Invalid user ubuntu from 57.128.218.144 port 60410
Feb 20 10:04:10 np0005625204.localdomain sshd[324243]: Received disconnect from 57.128.218.144 port 60410:11: Bye Bye [preauth]
Feb 20 10:04:10 np0005625204.localdomain sshd[324243]: Disconnected from invalid user ubuntu 57.128.218.144 port 60410 [preauth]
Feb 20 10:04:10 np0005625204.localdomain ceph-mon[301857]: pgmap v660: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s wr, 0 op/s
Feb 20 10:04:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:04:11 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:04:11 np0005625204.localdomain systemd[1]: tmp-crun.YIv1uB.mount: Deactivated successfully.
Feb 20 10:04:11 np0005625204.localdomain podman[324245]: 2026-02-20 10:04:11.165913568 +0000 UTC m=+0.103325104 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.buildah.version=1.33.7, container_name=openstack_network_exporter, build-date=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, version=9.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1770267347, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.expose-services=)
Feb 20 10:04:11 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:11Z|00431|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:04:11 np0005625204.localdomain podman[324246]: 2026-02-20 10:04:11.208905159 +0000 UTC m=+0.142694905 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 10:04:11 np0005625204.localdomain podman[324246]: 2026-02-20 10:04:11.225782934 +0000 UTC m=+0.159572630 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 10:04:11 np0005625204.localdomain podman[324245]: 2026-02-20 10:04:11.238045099 +0000 UTC m=+0.175456605 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, release=1770267347, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, org.opencontainers.image.created=2026-02-05T04:57:10Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 20 10:04:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:11.239 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:11 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:04:11 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:04:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:11 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/974015829' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:11 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3485150094' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.094 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.095 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.095 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.095 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.185 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.186 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.186 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.187 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.796 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.815 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.816 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.817 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:12.818 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:04:12 np0005625204.localdomain ceph-mon[301857]: pgmap v661: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s wr, 0 op/s
Feb 20 10:04:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:13.181 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:14.440 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:04:15 np0005625204.localdomain ceph-mon[301857]: pgmap v662: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s wr, 0 op/s
Feb 20 10:04:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:17 np0005625204.localdomain ceph-mon[301857]: pgmap v663: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 2.0 KiB/s wr, 0 op/s
Feb 20 10:04:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:04:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:04:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:04:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:04:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:04:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1"
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.188 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.191 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.192 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.214 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.214 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:04:18 np0005625204.localdomain sshd[324287]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:18 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:18.539 264355 INFO neutron.agent.linux.ip_lib [None req-a31bd127-1bf3-4a93-afef-73b69b90945b - - - - - -] Device tapd32edf6e-af cannot be used as it has no MAC address
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.566 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain kernel: device tapd32edf6e-af entered promiscuous mode
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.575 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:18Z|00432|binding|INFO|Claiming lport d32edf6e-af3c-4889-8b25-6677d97e68ca for this chassis.
Feb 20 10:04:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:18Z|00433|binding|INFO|d32edf6e-af3c-4889-8b25-6677d97e68ca: Claiming unknown
Feb 20 10:04:18 np0005625204.localdomain NetworkManager[5988]: <info>  [1771581858.5783] manager: (tapd32edf6e-af): new Generic device (/org/freedesktop/NetworkManager/Devices/66)
Feb 20 10:04:18 np0005625204.localdomain systemd-udevd[324299]: Network interface NamePolicy= disabled on kernel command line.
Feb 20 10:04:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:18.585 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf2a5acf56b14171a5a2864e56a6776f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bfdb76f-0f89-4a13-995e-55a12ac2e6c3, chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=d32edf6e-af3c-4889-8b25-6677d97e68ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:04:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:18.586 162652 INFO neutron.agent.ovn.metadata.agent [-] Port d32edf6e-af3c-4889-8b25-6677d97e68ca in datapath 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa bound to our chassis
Feb 20 10:04:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:18.589 162652 DEBUG neutron.agent.ovn.metadata.agent [-] Port 5aa4082c-2d79-46f5-890f-3fe4407bf387 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 20 10:04:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:18.589 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:04:18 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:18.591 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[7b41d26b-adda-46c5-af12-b179a9c6550b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:04:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:18Z|00434|binding|INFO|Setting lport d32edf6e-af3c-4889-8b25-6677d97e68ca ovn-installed in OVS
Feb 20 10:04:18 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:18Z|00435|binding|INFO|Setting lport d32edf6e-af3c-4889-8b25-6677d97e68ca up in Southbound
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain sshd[324307]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:18 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:18.667 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:18 np0005625204.localdomain sshd[324307]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:04:19 np0005625204.localdomain ceph-mon[301857]: pgmap v664: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:19 np0005625204.localdomain podman[324354]: 
Feb 20 10:04:19 np0005625204.localdomain podman[324354]: 2026-02-20 10:04:19.633893326 +0000 UTC m=+0.099550579 container create d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 20 10:04:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:04:19 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:04:19 np0005625204.localdomain podman[324354]: 2026-02-20 10:04:19.584304652 +0000 UTC m=+0.049961946 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 20 10:04:19 np0005625204.localdomain systemd[1]: Started libpod-conmon-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e.scope.
Feb 20 10:04:19 np0005625204.localdomain systemd[1]: Started libcrun container.
Feb 20 10:04:19 np0005625204.localdomain sshd[324287]: Received disconnect from 188.166.218.64 port 41798:11: Bye Bye [preauth]
Feb 20 10:04:19 np0005625204.localdomain sshd[324287]: Disconnected from authenticating user root 188.166.218.64 port 41798 [preauth]
Feb 20 10:04:19 np0005625204.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/49096323e1fb7225bf70181bb6d0806fa8b81d0cddf9541ad891a72b23a8852e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 20 10:04:19 np0005625204.localdomain podman[324354]: 2026-02-20 10:04:19.729844983 +0000 UTC m=+0.195502226 container init d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:19 np0005625204.localdomain podman[324354]: 2026-02-20 10:04:19.738717464 +0000 UTC m=+0.204374707 container start d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:19 np0005625204.localdomain dnsmasq[324392]: started, version 2.85 cachesize 150
Feb 20 10:04:19 np0005625204.localdomain dnsmasq[324392]: DNS service limited to local subnets
Feb 20 10:04:19 np0005625204.localdomain dnsmasq[324392]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 20 10:04:19 np0005625204.localdomain dnsmasq[324392]: warning: no upstream servers configured
Feb 20 10:04:19 np0005625204.localdomain dnsmasq-dhcp[324392]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 20 10:04:19 np0005625204.localdomain dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 0 addresses
Feb 20 10:04:19 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host
Feb 20 10:04:19 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts
Feb 20 10:04:19 np0005625204.localdomain podman[324369]: 2026-02-20 10:04:19.801119237 +0000 UTC m=+0.110179782 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 10:04:19 np0005625204.localdomain podman[324369]: 2026-02-20 10:04:19.836113816 +0000 UTC m=+0.145174411 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 20 10:04:19 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:04:19 np0005625204.localdomain podman[324368]: 2026-02-20 10:04:19.851958649 +0000 UTC m=+0.167200892 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Feb 20 10:04:19 np0005625204.localdomain podman[324368]: 2026-02-20 10:04:19.916096746 +0000 UTC m=+0.231338979 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 20 10:04:19 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:04:19 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:19.929 264355 INFO neutron.agent.dhcp.agent [None req-080ebf1a-31de-4eee-8d19-78a05a4476e4 - - - - - -] DHCP configuration for ports {'2c135f3f-0ca4-4c6b-9c27-1389cfad3242'} is completed
Feb 20 10:04:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:20.135 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:20 np0005625204.localdomain systemd[1]: tmp-crun.Mgdatz.mount: Deactivated successfully.
Feb 20 10:04:20 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:20.645 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:04:20Z, description=, device_id=e57870c6-94a1-474d-9937-954c4e871cf2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62ba760>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62baaf0>], id=f7a85e65-2443-4667-95de-2ffc5f519eac, ip_allocation=immediate, mac_address=fa:16:3e:d6:73:42, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:04:16Z, description=, dns_domain=, id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1219578190-network, port_security_enabled=True, project_id=cf2a5acf56b14171a5a2864e56a6776f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28209, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3934, status=ACTIVE, subnets=['257e5086-6db1-4f5c-9f51-784fc979c3a4'], tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:17Z, vlan_transparent=None, network_id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, port_security_enabled=False, project_id=cf2a5acf56b14171a5a2864e56a6776f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3942, status=DOWN, tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:20Z on network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa
Feb 20 10:04:20 np0005625204.localdomain podman[324433]: 2026-02-20 10:04:20.901215923 +0000 UTC m=+0.093617877 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 20 10:04:20 np0005625204.localdomain dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 1 addresses
Feb 20 10:04:20 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host
Feb 20 10:04:20 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts
Feb 20 10:04:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:21.181 264355 INFO neutron.agent.dhcp.agent [None req-1ea73423-3f5c-426f-8d76-7b1747981187 - - - - - -] DHCP configuration for ports {'f7a85e65-2443-4667-95de-2ffc5f519eac'} is completed
Feb 20 10:04:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:21.360 264355 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-20T10:04:20Z, description=, device_id=e57870c6-94a1-474d-9937-954c4e871cf2, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62baa60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f3df62baac0>], id=f7a85e65-2443-4667-95de-2ffc5f519eac, ip_allocation=immediate, mac_address=fa:16:3e:d6:73:42, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-20T10:04:16Z, description=, dns_domain=, id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PrometheusGabbiTest-1219578190-network, port_security_enabled=True, project_id=cf2a5acf56b14171a5a2864e56a6776f, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=28209, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3934, status=ACTIVE, subnets=['257e5086-6db1-4f5c-9f51-784fc979c3a4'], tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:17Z, vlan_transparent=None, network_id=2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, port_security_enabled=False, project_id=cf2a5acf56b14171a5a2864e56a6776f, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3942, status=DOWN, tags=[], tenant_id=cf2a5acf56b14171a5a2864e56a6776f, updated_at=2026-02-20T10:04:20Z on network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa
Feb 20 10:04:21 np0005625204.localdomain ceph-mon[301857]: pgmap v665: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:21 np0005625204.localdomain podman[324473]: 2026-02-20 10:04:21.564579053 +0000 UTC m=+0.060724574 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 20 10:04:21 np0005625204.localdomain dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 1 addresses
Feb 20 10:04:21 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host
Feb 20 10:04:21 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts
Feb 20 10:04:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:21 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:21.936 264355 INFO neutron.agent.dhcp.agent [None req-8fc83402-ddf5-466b-a0e5-3d5fe3ddf4b9 - - - - - -] DHCP configuration for ports {'f7a85e65-2443-4667-95de-2ffc5f519eac'} is completed
Feb 20 10:04:22 np0005625204.localdomain sshd[324495]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:23 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:23.249 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:23 np0005625204.localdomain ceph-mon[301857]: pgmap v666: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:23 np0005625204.localdomain sshd[324495]: Invalid user mailuser from 86.99.116.54 port 38460
Feb 20 10:04:24 np0005625204.localdomain sshd[324495]: Received disconnect from 86.99.116.54 port 38460:11: Bye Bye [preauth]
Feb 20 10:04:24 np0005625204.localdomain sshd[324495]: Disconnected from invalid user mailuser 86.99.116.54 port 38460 [preauth]
Feb 20 10:04:25 np0005625204.localdomain ceph-mon[301857]: pgmap v667: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:26 np0005625204.localdomain ceph-mon[301857]: pgmap v668: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:04:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:04:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:04:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:04:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:04:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:04:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:27 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:27.173 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:28 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:04:28 np0005625204.localdomain systemd[1]: tmp-crun.m03X9m.mount: Deactivated successfully.
Feb 20 10:04:28 np0005625204.localdomain podman[324497]: 2026-02-20 10:04:28.169938261 +0000 UTC m=+0.110728350 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:28 np0005625204.localdomain podman[324497]: 2026-02-20 10:04:28.180458292 +0000 UTC m=+0.121248391 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 20 10:04:28 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:04:28 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:28.282 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:29 np0005625204.localdomain ceph-mon[301857]: pgmap v669: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:31 np0005625204.localdomain ceph-mon[301857]: pgmap v670: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:32 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:32.037 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:33 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:33.286 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:34 np0005625204.localdomain ceph-mon[301857]: pgmap v671: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:34 np0005625204.localdomain sshd[324517]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:35 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:04:35 np0005625204.localdomain podman[324519]: 2026-02-20 10:04:35.16118649 +0000 UTC m=+0.086470878 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 10:04:35 np0005625204.localdomain ceph-mon[301857]: pgmap v672: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:35 np0005625204.localdomain podman[324519]: 2026-02-20 10:04:35.198962623 +0000 UTC m=+0.124247001 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:04:35 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:04:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e276 e276: 6 total, 6 up, 6 in
Feb 20 10:04:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:37 np0005625204.localdomain ceph-mon[301857]: pgmap v673: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 4.2 KiB/s rd, 682 B/s wr, 6 op/s
Feb 20 10:04:37 np0005625204.localdomain ceph-mon[301857]: osdmap e276: 6 total, 6 up, 6 in
Feb 20 10:04:37 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e277 e277: 6 total, 6 up, 6 in
Feb 20 10:04:37 np0005625204.localdomain sshd[324517]: Invalid user sysadmin from 182.93.7.194 port 34980
Feb 20 10:04:37 np0005625204.localdomain sshd[324517]: Received disconnect from 182.93.7.194 port 34980:11: Bye Bye [preauth]
Feb 20 10:04:37 np0005625204.localdomain sshd[324517]: Disconnected from invalid user sysadmin 182.93.7.194 port 34980 [preauth]
Feb 20 10:04:37 np0005625204.localdomain sshd[324541]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:38 np0005625204.localdomain ceph-mon[301857]: osdmap e277: 6 total, 6 up, 6 in
Feb 20 10:04:38 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:38.288 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:38 np0005625204.localdomain sshd[324541]: Invalid user oracle from 103.191.14.210 port 36958
Feb 20 10:04:39 np0005625204.localdomain sshd[324541]: Received disconnect from 103.191.14.210 port 36958:11: Bye Bye [preauth]
Feb 20 10:04:39 np0005625204.localdomain sshd[324541]: Disconnected from invalid user oracle 103.191.14.210 port 36958 [preauth]
Feb 20 10:04:39 np0005625204.localdomain ceph-mon[301857]: pgmap v676: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 6.4 KiB/s rd, 1023 B/s wr, 9 op/s
Feb 20 10:04:40 np0005625204.localdomain dnsmasq[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/addn_hosts - 0 addresses
Feb 20 10:04:40 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/host
Feb 20 10:04:40 np0005625204.localdomain dnsmasq-dhcp[324392]: read /var/lib/neutron/dhcp/2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa/opts
Feb 20 10:04:40 np0005625204.localdomain podman[324560]: 2026-02-20 10:04:40.460422796 +0000 UTC m=+0.057880947 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:40Z|00436|binding|INFO|Releasing lport d32edf6e-af3c-4889-8b25-6677d97e68ca from this chassis (sb_readonly=0)
Feb 20 10:04:40 np0005625204.localdomain kernel: device tapd32edf6e-af left promiscuous mode
Feb 20 10:04:40 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:40Z|00437|binding|INFO|Setting lport d32edf6e-af3c-4889-8b25-6677d97e68ca down in Southbound
Feb 20 10:04:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:40.689 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:40.698 162652 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005625204.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpddea5d95-4823-5483-a49b-f0c62a87e99b-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'cf2a5acf56b14171a5a2864e56a6776f', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005625204.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5bfdb76f-0f89-4a13-995e-55a12ac2e6c3, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>], logical_port=d32edf6e-af3c-4889-8b25-6677d97e68ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbba6197400>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 20 10:04:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:40.702 162652 INFO neutron.agent.ovn.metadata.agent [-] Port d32edf6e-af3c-4889-8b25-6677d97e68ca in datapath 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa unbound from our chassis
Feb 20 10:04:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:40.704 162652 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 20 10:04:40 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:04:40.709 162782 DEBUG oslo.privsep.daemon [-] privsep: reply[32d4053b-7f00-41a3-9e5b-13d54db54837]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 20 10:04:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:40.713 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:40 np0005625204.localdomain ceph-mon[301857]: pgmap v677: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 8.0 KiB/s rd, 1.6 MiB/s wr, 12 op/s
Feb 20 10:04:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:04:42 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:04:42 np0005625204.localdomain podman[324583]: 2026-02-20 10:04:42.145063736 +0000 UTC m=+0.079567038 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, release=1770267347, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-02-05T04:57:10Z, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, name=ubi9/ubi-minimal)
Feb 20 10:04:42 np0005625204.localdomain podman[324583]: 2026-02-20 10:04:42.161063614 +0000 UTC m=+0.095566846 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1770267347, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_id=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 20 10:04:42 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:04:42 np0005625204.localdomain podman[324584]: 2026-02-20 10:04:42.254544227 +0000 UTC m=+0.186947655 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:04:42 np0005625204.localdomain podman[324584]: 2026-02-20 10:04:42.262441737 +0000 UTC m=+0.194845165 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 20 10:04:42 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:04:42 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:42Z|00438|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:04:42 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:42.811 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:42 np0005625204.localdomain ceph-mon[301857]: pgmap v678: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 55 op/s
Feb 20 10:04:43 np0005625204.localdomain dnsmasq[324392]: exiting on receipt of SIGTERM
Feb 20 10:04:43 np0005625204.localdomain podman[324642]: 2026-02-20 10:04:43.257488948 +0000 UTC m=+0.060649462 container kill d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 20 10:04:43 np0005625204.localdomain systemd[1]: libpod-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e.scope: Deactivated successfully.
Feb 20 10:04:43 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 e278: 6 total, 6 up, 6 in
Feb 20 10:04:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:43.291 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:43 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:43.293 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:43 np0005625204.localdomain podman[324658]: 2026-02-20 10:04:43.337680394 +0000 UTC m=+0.058704802 container died d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 20 10:04:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e-userdata-shm.mount: Deactivated successfully.
Feb 20 10:04:43 np0005625204.localdomain systemd[1]: var-lib-containers-storage-overlay-49096323e1fb7225bf70181bb6d0806fa8b81d0cddf9541ad891a72b23a8852e-merged.mount: Deactivated successfully.
Feb 20 10:04:43 np0005625204.localdomain podman[324658]: 2026-02-20 10:04:43.38373387 +0000 UTC m=+0.104758208 container remove d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-2e2eb23b-f1a5-4d6e-92da-7b51adfa2daa, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 10:04:43 np0005625204.localdomain systemd[1]: libpod-conmon-d549f209759fb14c53898cf162d8c43ee87418169ee71f1f44bf7f24a2d0f72e.scope: Deactivated successfully.
Feb 20 10:04:43 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:43.407 264355 INFO neutron.agent.dhcp.agent [None req-bfcfd522-9ba2-4fa8-a8bc-49ae408e8ef3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:04:43 np0005625204.localdomain neutron_dhcp_agent[264351]: 2026-02-20 10:04:43.408 264355 INFO neutron.agent.dhcp.agent [None req-bfcfd522-9ba2-4fa8-a8bc-49ae408e8ef3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 20 10:04:43 np0005625204.localdomain systemd[1]: run-netns-qdhcp\x2d2e2eb23b\x2df1a5\x2d4d6e\x2d92da\x2d7b51adfa2daa.mount: Deactivated successfully.
Feb 20 10:04:44 np0005625204.localdomain ceph-mon[301857]: osdmap e278: 6 total, 6 up, 6 in
Feb 20 10:04:45 np0005625204.localdomain ceph-mon[301857]: pgmap v680: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 35 KiB/s rd, 2.7 MiB/s wr, 48 op/s
Feb 20 10:04:46 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:04:46Z|00439|binding|INFO|Releasing lport 3323e11d-576a-42f3-bcca-e10425268e61 from this chassis (sb_readonly=0)
Feb 20 10:04:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:46.389 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:47 np0005625204.localdomain ceph-mon[301857]: pgmap v681: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.4 MiB/s wr, 43 op/s
Feb 20 10:04:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:04:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:04:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:04:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:04:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:04:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18356 "" "Go-http-client/1.1"
Feb 20 10:04:48 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:48.343 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:49 np0005625204.localdomain ceph-mon[301857]: pgmap v682: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 2.0 MiB/s wr, 36 op/s
Feb 20 10:04:49 np0005625204.localdomain sshd[324685]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:04:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:04:50 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:04:50 np0005625204.localdomain podman[324688]: 2026-02-20 10:04:50.156657398 +0000 UTC m=+0.088437659 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 10:04:50 np0005625204.localdomain podman[324688]: 2026-02-20 10:04:50.190953465 +0000 UTC m=+0.122733696 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true)
Feb 20 10:04:50 np0005625204.localdomain sshd[324685]: Invalid user jcarlos from 154.91.170.41 port 60238
Feb 20 10:04:50 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:04:50 np0005625204.localdomain systemd[1]: tmp-crun.ozpwH3.mount: Deactivated successfully.
Feb 20 10:04:50 np0005625204.localdomain podman[324687]: 2026-02-20 10:04:50.267361656 +0000 UTC m=+0.201123947 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:04:50 np0005625204.localdomain sshd[324685]: Received disconnect from 154.91.170.41 port 60238:11: Bye Bye [preauth]
Feb 20 10:04:50 np0005625204.localdomain sshd[324685]: Disconnected from invalid user jcarlos 154.91.170.41 port 60238 [preauth]
Feb 20 10:04:50 np0005625204.localdomain podman[324687]: 2026-02-20 10:04:50.334170305 +0000 UTC m=+0.267932606 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 20 10:04:50 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:04:51 np0005625204.localdomain ceph-mon[301857]: pgmap v683: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 821 KiB/s wr, 33 op/s
Feb 20 10:04:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:52 np0005625204.localdomain sudo[324725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:04:52 np0005625204.localdomain sudo[324725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:04:52 np0005625204.localdomain sudo[324725]: pam_unix(sudo:session): session closed for user root
Feb 20 10:04:52 np0005625204.localdomain sudo[324743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 check-host
Feb 20 10:04:52 np0005625204.localdomain sudo[324743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:04:53 np0005625204.localdomain sudo[324743]: pam_unix(sudo:session): session closed for user root
Feb 20 10:04:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:53.348 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:04:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:53.351 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:04:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:53.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5007 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:04:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:53.352 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:04:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:53.381 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:53 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:53.382 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:04:53 np0005625204.localdomain ceph-mon[301857]: pgmap v684: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:53 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:53 np0005625204.localdomain sudo[324782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:04:53 np0005625204.localdomain sudo[324782]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:04:53 np0005625204.localdomain sudo[324782]: pam_unix(sudo:session): session closed for user root
Feb 20 10:04:53 np0005625204.localdomain sudo[324800]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:04:53 np0005625204.localdomain sudo[324800]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:04:54 np0005625204.localdomain sudo[324800]: pam_unix(sudo:session): session closed for user root
Feb 20 10:04:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:54 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:54 np0005625204.localdomain sudo[324850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:04:54 np0005625204.localdomain sudo[324850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:04:54 np0005625204.localdomain sudo[324850]: pam_unix(sudo:session): session closed for user root
Feb 20 10:04:55 np0005625204.localdomain ceph-mon[301857]: pgmap v685: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:04:55 np0005625204.localdomain ceph-mon[301857]: mgrmap e56: np0005625202.arwxwo(active, since 16m), standbys: np0005625203.lonygy, np0005625204.exgrzx
Feb 20 10:04:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:04:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:04:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:04:55 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:04:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:04:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:04:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:04:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:04:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:04:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:04:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:04:57 np0005625204.localdomain ceph-mon[301857]: pgmap v686: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.305722) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898305843, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1036, "num_deletes": 251, "total_data_size": 1541594, "memory_usage": 1563584, "flush_reason": "Manual Compaction"}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898315977, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1012543, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33039, "largest_seqno": 34070, "table_properties": {"data_size": 1008141, "index_size": 2065, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9542, "raw_average_key_size": 18, "raw_value_size": 999022, "raw_average_value_size": 1982, "num_data_blocks": 86, "num_entries": 504, "num_filter_entries": 504, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581838, "oldest_key_time": 1771581838, "file_creation_time": 1771581898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 10325 microseconds, and 5245 cpu microseconds.
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.316061) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1012543 bytes OK
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.316092) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.318333) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.318396) EVENT_LOG_v1 {"time_micros": 1771581898318384, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.318437) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1536419, prev total WAL file size 1536419, number of live WAL files 2.
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.319519) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353238' seq:72057594037927935, type:22 .. '6B760031373739' seq:0, type:0; will stop at (end)
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(988KB)], [51(19MB)]
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898319605, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 21455372, "oldest_snapshot_seqno": -1}
Feb 20 10:04:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:58.383 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:04:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:58.417 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:04:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:58.417 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:04:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:58.418 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 14329 keys, 20410290 bytes, temperature: kUnknown
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898419759, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 20410290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20325987, "index_size": 47402, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35845, "raw_key_size": 384697, "raw_average_key_size": 26, "raw_value_size": 20080133, "raw_average_value_size": 1401, "num_data_blocks": 1768, "num_entries": 14329, "num_filter_entries": 14329, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581898, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:04:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:58.419 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:04:58 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:04:58.420 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.420095) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 20410290 bytes
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.422101) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 214.0 rd, 203.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 19.5 +0.0 blob) out(19.5 +0.0 blob), read-write-amplify(41.3) write-amplify(20.2) OK, records in: 14860, records dropped: 531 output_compression: NoCompression
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.422132) EVENT_LOG_v1 {"time_micros": 1771581898422117, "job": 30, "event": "compaction_finished", "compaction_time_micros": 100239, "compaction_time_cpu_micros": 54165, "output_level": 6, "num_output_files": 1, "total_output_size": 20410290, "num_input_records": 14860, "num_output_records": 14329, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898422397, "job": 30, "event": "table_file_deletion", "file_number": 53}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581898425187, "job": 30, "event": "table_file_deletion", "file_number": 51}
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.319364) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425277) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425284) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:04:58 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:04:58.425286) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:04:59 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:04:59 np0005625204.localdomain podman[324868]: 2026-02-20 10:04:59.143627492 +0000 UTC m=+0.081334183 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:59 np0005625204.localdomain podman[324868]: 2026-02-20 10:04:59.157071432 +0000 UTC m=+0.094778103 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:04:59 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:04:59 np0005625204.localdomain ceph-mon[301857]: pgmap v687: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 20 10:04:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:05:00 np0005625204.localdomain ceph-mon[301857]: pgmap v688: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 20 10:05:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:01.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3731268602' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3731268602' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: pgmap v689: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3731268602' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:05:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/3731268602' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:05:03 np0005625204.localdomain sshd[324888]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:03.422 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:03.424 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:03.424 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:03.425 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:03.454 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:03 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:03.455 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:04 np0005625204.localdomain sshd[324888]: Invalid user ssh-user from 203.228.30.198 port 50226
Feb 20 10:05:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/309175783' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:04 np0005625204.localdomain sshd[324888]: Received disconnect from 203.228.30.198 port 50226:11: Bye Bye [preauth]
Feb 20 10:05:04 np0005625204.localdomain sshd[324888]: Disconnected from invalid user ssh-user 203.228.30.198 port 50226 [preauth]
Feb 20 10:05:04 np0005625204.localdomain sshd[324890]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:05 np0005625204.localdomain sshd[324890]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:05:05 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:05:05 np0005625204.localdomain podman[324892]: 2026-02-20 10:05:05.366139398 +0000 UTC m=+0.082119007 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 10:05:05 np0005625204.localdomain podman[324892]: 2026-02-20 10:05:05.404108146 +0000 UTC m=+0.120087715 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:05:05 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:05:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:05.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:05 np0005625204.localdomain ceph-mon[301857]: pgmap v690: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 20 10:05:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/206249604' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:05:06.027 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:05:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:05:06.028 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:05:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:05:06.029 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:05:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:06 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:06.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:06 np0005625204.localdomain ceph-mon[301857]: pgmap v691: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 20 10:05:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:07.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.455 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.457 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:05:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:08.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:05:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:10.523 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:10.525 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:10 np0005625204.localdomain ceph-mon[301857]: pgmap v692: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 20 10:05:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:05:10 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2880488553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:10.819 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.047 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.048 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.250 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.252 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11196MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.252 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.253 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.337 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.337 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.338 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.383 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:05:11 np0005625204.localdomain ceph-mon[301857]: pgmap v693: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 20 10:05:11 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2880488553' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:11 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/226919907' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:05:11 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1236691951' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.875 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.882 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.899 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.902 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:05:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:11.902 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.649s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:05:12 np0005625204.localdomain ceph-mon[301857]: pgmap v694: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 20 10:05:12 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1236691951' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:12 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1551072278' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:05:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:12.903 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:12.903 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:12.904 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:05:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:12.904 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:05:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:05:13 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:05:13 np0005625204.localdomain podman[324960]: 2026-02-20 10:05:13.161903725 +0000 UTC m=+0.085010884 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9/ubi-minimal, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1770267347, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Feb 20 10:05:13 np0005625204.localdomain podman[324960]: 2026-02-20 10:05:13.203101693 +0000 UTC m=+0.126208802 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1770267347, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, config_id=openstack_network_exporter, managed_by=edpm_ansible, version=9.7, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, architecture=x86_64, org.opencontainers.image.created=2026-02-05T04:57:10Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.213 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.213 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.214 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.214 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:05:13 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:05:13 np0005625204.localdomain podman[324961]: 2026-02-20 10:05:13.223325699 +0000 UTC m=+0.140447336 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:05:13 np0005625204.localdomain podman[324961]: 2026-02-20 10:05:13.236085749 +0000 UTC m=+0.153207356 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 10:05:13 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.656 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.677 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.678 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.678 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.720 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:13.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:05:13 np0005625204.localdomain sshd[325005]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:13 np0005625204.localdomain sshd[325005]: Accepted publickey for zuul from 38.102.83.114 port 43892 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:13 np0005625204.localdomain systemd-logind[759]: New session 74 of user zuul.
Feb 20 10:05:13 np0005625204.localdomain systemd[1]: Started Session 74 of User zuul.
Feb 20 10:05:13 np0005625204.localdomain sshd[325005]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:14 np0005625204.localdomain sudo[325025]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enfpwhzhfhiihjokphwknvngxxkeemkb ; /usr/bin/python3
Feb 20 10:05:14 np0005625204.localdomain sudo[325025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:14 np0005625204.localdomain python3[325027]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=fa163ef9-e89a-9e25-3a25-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 20 10:05:14 np0005625204.localdomain sudo[325025]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:15 np0005625204.localdomain ceph-mon[301857]: pgmap v695: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:15.526 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:15.529 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:15.529 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:15.530 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:15.556 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:15.556 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:17 np0005625204.localdomain ovn_controller[156798]: 2026-02-20T10:05:17Z|00440|memory_trim|INFO|Detected inactivity (last active 30013 ms ago): trimming memory
Feb 20 10:05:17 np0005625204.localdomain ceph-mon[301857]: pgmap v696: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:05:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:05:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:05:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:05:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:05:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18351 "" "Go-http-client/1.1"
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.321 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'name': 'test', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'np0005625204.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '91bce661d685472eb3e7cacab17bf52a', 'user_id': '141ec720081546bb92f7e9338deb8445', 'hostId': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.322 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.351 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 389120 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.352 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.bytes volume: 512 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc1a5acc-16a3-415a-bb22-47430dd09ed1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 389120, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.323181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8d80af8-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '4bb8d7aa4b08c5ac2e1ac52748e6235eb9b6198176fc65ea951d9d2781487051'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 512, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.323181', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8d826b4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '008853a379d2a3000938877f00bc3c42349e69496e8aefe858e487af276fac05'}]}, 'timestamp': '2026-02-20 10:05:18.353462', '_unique_id': 'c20d32d7b628424b9c14b61690a92125'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.355 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.356 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.356 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 1324972840 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.357 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.latency volume: 28227071 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cf5db138-1fa6-494c-bef4-aa57204ebc65', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1324972840, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.356597', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8d8bc82-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '89dc85777c359a01bda9440cd6a8dec9f1106da60b21563fd962bca3b2cfb817'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28227071, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.356597', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8d8d6e0-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'cb1c3b897595eca11b9fdce0c736509e744e2b0f43e44a6391e06a06d2cc7b4a'}]}, 'timestamp': '2026-02-20 10:05:18.358012', '_unique_id': '36b5a04ce672485baea05792b41e2470'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.359 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.372 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.373 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ebc6d929-3679-438c-a5d8-380ed70a9fe2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.361222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8db2b7a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '91076383257015028fdd11cb1114204af5efdf2c5901fe0ef25186eec90d3ec6'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.361222', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8db4d26-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': 'd0d9832ce268bb96148dcfdd8c351638a1088ad467b5a473fa9e7a53e3038842'}]}, 'timestamp': '2026-02-20 10:05:18.374177', '_unique_id': '7463d041ae7b469eac68a3bb7180433d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.375 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.376 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 1272 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.377 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.requests volume: 124 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '969cdeb7-6c56-49c5-afd9-49f75a81aa38', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1272, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.376702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8dbcc10-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'fa6a7d3e1b59b6ea93970ea6008a70e456ecd084834d7f821732b8819cf7f3cf'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 124, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.376702', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8dbe66e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'ec71536623297488afb638bec8e59d01a21d6b6730e7bda2a38926d1e119ea55'}]}, 'timestamp': '2026-02-20 10:05:18.378069', '_unique_id': '2964b6d46b2e4221b614308b502fa7f9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.379 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.381 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.398 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/memory.usage volume: 51.60546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '90f79615-914a-4846-829f-2d6e93b22978', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 51.60546875, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:05:18.381593', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0}, 'message_id': 'a8df2720-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.637858066, 'message_signature': '7a338690fbb3b9e5764a1ac6a989c653db9d5f8068dbcc688eecf6904ecdabec'}]}, 'timestamp': '2026-02-20 10:05:18.399386', '_unique_id': 'aa4981ad14ce49cba38c82d43390a7ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.400 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.401 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.401 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 4362901801 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.402 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.latency volume: 101633057 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3bf7cd13-c7d0-48e0-8695-529c4ba142c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4362901801, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.401883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8dfa358-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '4e3b8ac6482649ac454a84bb69cfdbc37ea14c79483035ac779bd7b8da3158cb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 101633057, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.401883', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8dfbd34-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '31c1433eab048cc41d14c89737311e5b312907dcf86cf197a79ade01a3fe18cc'}]}, 'timestamp': '2026-02-20 10:05:18.403222', '_unique_id': '8f71d81c636248c28d85aabc917001e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.404 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.406 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.410 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '80d0f8fe-e992-4c01-b94a-da3dfe2c8670', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.406863', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e0f3a2-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '41565f429267b62a3b9031a01f269fef77fef1a41e595502a022c1ebfecfa019'}]}, 'timestamp': '2026-02-20 10:05:18.411213', '_unique_id': 'fa98f04052ef4f618f70a93712b69231'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.412 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.414 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.414 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '67b8143b-b1ad-453c-8f0e-e2da452dc3c9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.414230', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e185b0-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '0a5d7cfeb21164a31c6af2f9f06ed4cad1fc2add8a04f8604548840b38fdbb18'}]}, 'timestamp': '2026-02-20 10:05:18.414972', '_unique_id': '1b33f1ef955744dcabc8a8d229cfccc4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.416 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.417 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.418 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.419 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4708b033-a101-48dd-9713-676335401b1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.418332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e226b4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '42a36472751d663c9daebf95878ff780d9efedd298658bfbb846e32ffcaca6bf'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.418332', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e23f50-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '25791f15f5b445e4a42593240d50b39e7606e7dde9c20b6dd5c1ab7c6bccb63e'}]}, 'timestamp': '2026-02-20 10:05:18.419692', '_unique_id': 'f7dffba734f0400ca9caa0830841f7ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.421 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.423 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.423 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5444f33d-003c-4c96-8c81-73b87fedff52', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.423242', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e2e32e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'b4f5df9e9334420d8844b50feea9c872792b515980331d66ac1585a89615f8f1'}]}, 'timestamp': '2026-02-20 10:05:18.423860', '_unique_id': '58b800d1f8f545efabfd3b2f7e013438'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.425 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.426 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.426 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes volume: 9770 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f7cecbe-c7a4-4def-977e-9a7988666f62', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 9770, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.426287', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e357e6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '0c0e76bc42114c348100d8bd6ae3183df509c25f04ead9df8429ea3f779ec9ac'}]}, 'timestamp': '2026-02-20 10:05:18.426789', '_unique_id': '5f512283d9a548b58886e12e7c7dfa29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.427 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.428 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.429 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '76c71744-a304-466f-992b-d03a61cfaaa7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.429101', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e3c5e6-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '4f44b40701ebd6f011e899e2acca4065d746065e55d1414cf15a5cd4da050d10'}]}, 'timestamp': '2026-02-20 10:05:18.429577', '_unique_id': 'a111999803d34727890553e0ba73c296'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.430 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.431 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.431 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1979d68-ed82-4ff4-81b6-5fde694b0996', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.431930', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e4345e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'c914b34863b47bfe046b67097a8c5dbc4ba34c6a3f39cce42f6b80d36ef58b09'}]}, 'timestamp': '2026-02-20 10:05:18.432403', '_unique_id': '9ae79b11c49b423596ff87b89d5c84d3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.433 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.434 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.434 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/cpu volume: 20770000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8092272-0ab0-4399-8549-a0a8665914c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20770000000, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'timestamp': '2026-02-20T10:05:18.434566', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'cpu_number': 1}, 'message_id': 'a8e49cbe-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.637858066, 'message_signature': '245df19d32f1d90a0f76872581db63aefa9a6078d2c001730266c4bf7b0bda63'}]}, 'timestamp': '2026-02-20 10:05:18.435060', '_unique_id': '2cd05016d8b44c728a43597837f73d6e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.436 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.437 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no new  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.437 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.packets volume: 114 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '98bc014b-6869-4ecb-8493-48bd4f0e41b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 114, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.437528', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e50d7a-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '54bac9ee305e7a5c73748d6b3711b6f6e59c1271f3d0577b618236ed72cb7d70'}]}, 'timestamp': '2026-02-20 10:05:18.437872', '_unique_id': 'f710c58cb57e4ab78efed8a8230f3c49'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.439 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.439 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.bytes volume: 6809 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea0e7861-12e8-423f-9dc1-85fe3ae32332', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6809, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.439178', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e54c0e-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': '8dd6fd81ea4a1db988bbf52ca02ee3ae7f084540ba96abbde2c58b9d9ede4de5'}]}, 'timestamp': '2026-02-20 10:05:18.439480', '_unique_id': 'ae6f20c1022249bfb54fc9d95c5d28f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.440 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8bcb4361-e7ad-42e5-973c-7a254ada144f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.440847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e58d54-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '150e852923e574f8d606bad49d4661ea50a1f59f719e119accfba579b88f5ab1'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.440847', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e597ae-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.600483485, 'message_signature': '19ab0dcfc09885260c4733d44641d80646b99bdac05e6629ff4f9f177ac4aaed'}]}, 'timestamp': '2026-02-20 10:05:18.441389', '_unique_id': '862297e7703f45a38db1d10f234677b0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.441 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.442 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.442 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.incoming.packets volume: 60 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28700196-8abf-4e4d-a600-ba19af2b27cf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 60, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.442750', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e5d796-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'fac41ed69908adcf957d16a9bbe832e6876ae56141295f01e49e5c3e45ead572'}]}, 'timestamp': '2026-02-20 10:05:18.443042', '_unique_id': '687f2940994841b58cdab035e9fd7eb8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.443 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.444 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.444 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 35560448 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.444 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.read.bytes volume: 2154496 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dceb67b0-4812-45c4-be82-5df840c94ff2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 35560448, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.444461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e61ae4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '7d4b0d468b8192614bbd8249ffba97f02b9e59d214f8824357be9fbb68d28afa'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2154496, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.444461', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e62782-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'e4bf3769f8086bf3e3ed58068c7dd73fe719733fc55323f54f8f0ecacdf47a54'}]}, 'timestamp': '2026-02-20 10:05:18.445074', '_unique_id': '2ef5e6837613465eb397c6f6e6f026df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.445 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.446 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.446 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 47 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.446 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/disk.device.write.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37587125-7a23-466e-b97a-7b4140166098', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 47, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'f9924957-6cff-426e-9f03-c739820f4ff3-vda', 'timestamp': '2026-02-20T10:05:18.446411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vda'}, 'message_id': 'a8e66684-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': '4e4550ec6e7159d38cfdae9b4b106436d272c1ee6789c1b161554f3e9aa08523'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 
'f9924957-6cff-426e-9f03-c739820f4ff3-vdb', 'timestamp': '2026-02-20T10:05:18.446411', 'resource_metadata': {'display_name': 'test', 'name': 'instance-00000002', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'disk_name': 'vdb'}, 'message_id': 'a8e671c4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.562403573, 'message_signature': 'bd34cb9a3eff0ada9945b1a0c2587dfaa6a71794e0890fffe75c55eb3159926a'}]}, 'timestamp': '2026-02-20 10:05:18.446971', '_unique_id': '09c0c220b4254014814cefc79cae15c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.447 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.448 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.448 12 DEBUG ceilometer.compute.pollsters [-] f9924957-6cff-426e-9f03-c739820f4ff3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c021b471-e23f-4af9-9607-945193da0be5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '141ec720081546bb92f7e9338deb8445', 'user_name': None, 'project_id': '91bce661d685472eb3e7cacab17bf52a', 'project_name': None, 'resource_id': 'instance-00000002-f9924957-6cff-426e-9f03-c739820f4ff3-tape7aa8e2a-27', 'timestamp': '2026-02-20T10:05:18.448296', 'resource_metadata': {'display_name': 'test', 'name': 'tape7aa8e2a-27', 'instance_id': 'f9924957-6cff-426e-9f03-c739820f4ff3', 'instance_type': 'm1.small', 'host': '3a725fa94c9918dbeadc86da9628cec1b9430be7b22c52f02a8fd296', 'instance_host': 'np0005625204.localdomain', 'flavor': {'id': '739ef37c-e459-414b-b65a-355581d54c7c', 'name': 'm1.small', 'vcpus': 1, 'ram': 512, 'disk': 1, 'ephemeral': 1, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '43eca6d8-1b99-4300-a417-76015fcc59e1'}, 'image_ref': '43eca6d8-1b99-4300-a417-76015fcc59e1', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 512, 'disk_gb': 1, 'ephemeral_gb': 1, 'root_gb': 0, 'mac': 'fa:16:3e:b0:ed:d2', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape7aa8e2a-27'}, 'message_id': 'a8e6b0e4-0e43-11f1-9294-fa163ef029e2', 'monotonic_time': 12477.646096077, 'message_signature': 'e9c0259a5c8f41acc1ea78dbc7438efc460c8b047924f2f14bfc8b24d98b3398'}]}, 'timestamp': '2026-02-20 10:05:18.448606', '_unique_id': '92ed0a6cb54641e782de61f60e4c80f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     yield
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 20 10:05:18 np0005625204.localdomain ceilometer_agent_compute[237325]: 2026-02-20 10:05:18.449 12 ERROR oslo_messaging.notify.messaging 
Feb 20 10:05:18 np0005625204.localdomain sshd[325005]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:18 np0005625204.localdomain systemd[1]: session-74.scope: Deactivated successfully.
Feb 20 10:05:18 np0005625204.localdomain systemd-logind[759]: Session 74 logged out. Waiting for processes to exit.
Feb 20 10:05:18 np0005625204.localdomain systemd-logind[759]: Removed session 74.
Feb 20 10:05:19 np0005625204.localdomain ceph-mon[301857]: pgmap v697: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:20.557 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:05:21 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:05:21 np0005625204.localdomain podman[325031]: 2026-02-20 10:05:21.153215058 +0000 UTC m=+0.084216640 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 20 10:05:21 np0005625204.localdomain podman[325031]: 2026-02-20 10:05:21.187092292 +0000 UTC m=+0.118093894 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:05:21 np0005625204.localdomain podman[325030]: 2026-02-20 10:05:21.199300274 +0000 UTC m=+0.134552866 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:05:21 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:05:21 np0005625204.localdomain podman[325030]: 2026-02-20 10:05:21.267695801 +0000 UTC m=+0.202948413 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 10:05:21 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:05:21 np0005625204.localdomain ceph-mon[301857]: pgmap v698: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:23 np0005625204.localdomain ceph-mon[301857]: pgmap v699: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:25 np0005625204.localdomain ceph-mon[301857]: pgmap v700: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:25.560 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:25.562 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:25.562 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:25.563 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:25.604 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:25.605 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:05:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:05:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:05:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:05:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:05:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:05:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:27 np0005625204.localdomain ceph-mon[301857]: pgmap v701: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:29 np0005625204.localdomain ceph-mon[301857]: pgmap v702: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:30 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:05:30 np0005625204.localdomain podman[325074]: 2026-02-20 10:05:30.150127304 +0000 UTC m=+0.087713928 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute)
Feb 20 10:05:30 np0005625204.localdomain podman[325074]: 2026-02-20 10:05:30.163290465 +0000 UTC m=+0.100877079 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 20 10:05:30 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:05:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:30.606 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:30.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:30.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:30.608 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:30.644 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:30.645 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:31 np0005625204.localdomain ceph-mon[301857]: pgmap v703: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:33 np0005625204.localdomain ceph-mon[301857]: pgmap v704: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:35 np0005625204.localdomain ceph-mon[301857]: pgmap v705: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:35.646 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:35.647 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:35.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:35.648 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:35.650 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:35.650 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:36 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:05:36 np0005625204.localdomain podman[325094]: 2026-02-20 10:05:36.156854606 +0000 UTC m=+0.093774063 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 10:05:36 np0005625204.localdomain podman[325094]: 2026-02-20 10:05:36.16518323 +0000 UTC m=+0.102102707 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 20 10:05:36 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:05:36 np0005625204.localdomain ceph-mon[301857]: pgmap v706: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:37 np0005625204.localdomain sshd[325117]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:37 np0005625204.localdomain sshd[325117]: Accepted publickey for zuul from 38.102.83.114 port 34026 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:37 np0005625204.localdomain systemd-logind[759]: New session 75 of user zuul.
Feb 20 10:05:37 np0005625204.localdomain systemd[1]: Started Session 75 of User zuul.
Feb 20 10:05:37 np0005625204.localdomain sshd[325117]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:38 np0005625204.localdomain sudo[325121]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Feb 20 10:05:38 np0005625204.localdomain sudo[325121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:39 np0005625204.localdomain sudo[325121]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:39 np0005625204.localdomain sshd[325120]: Received disconnect from 38.102.83.114 port 34026:11: disconnected by user
Feb 20 10:05:39 np0005625204.localdomain sshd[325120]: Disconnected from user zuul 38.102.83.114 port 34026
Feb 20 10:05:39 np0005625204.localdomain sshd[325117]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:39 np0005625204.localdomain systemd[1]: session-75.scope: Deactivated successfully.
Feb 20 10:05:39 np0005625204.localdomain systemd-logind[759]: Session 75 logged out. Waiting for processes to exit.
Feb 20 10:05:39 np0005625204.localdomain systemd-logind[759]: Removed session 75.
Feb 20 10:05:39 np0005625204.localdomain ceph-mon[301857]: pgmap v707: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:39 np0005625204.localdomain sshd[325139]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:39 np0005625204.localdomain sshd[325139]: Accepted publickey for zuul from 38.102.83.114 port 34034 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:39 np0005625204.localdomain systemd-logind[759]: New session 76 of user zuul.
Feb 20 10:05:39 np0005625204.localdomain systemd[1]: Started Session 76 of User zuul.
Feb 20 10:05:39 np0005625204.localdomain sshd[325139]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:39 np0005625204.localdomain sudo[325143]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Feb 20 10:05:39 np0005625204.localdomain sudo[325143]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:39 np0005625204.localdomain sudo[325143]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:39 np0005625204.localdomain sshd[325142]: Received disconnect from 38.102.83.114 port 34034:11: disconnected by user
Feb 20 10:05:39 np0005625204.localdomain sshd[325142]: Disconnected from user zuul 38.102.83.114 port 34034
Feb 20 10:05:39 np0005625204.localdomain sshd[325139]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:39 np0005625204.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Feb 20 10:05:39 np0005625204.localdomain systemd-logind[759]: Session 76 logged out. Waiting for processes to exit.
Feb 20 10:05:39 np0005625204.localdomain systemd-logind[759]: Removed session 76.
Feb 20 10:05:39 np0005625204.localdomain sshd[325161]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:40 np0005625204.localdomain sshd[325161]: Accepted publickey for zuul from 38.102.83.114 port 34048 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:40 np0005625204.localdomain systemd-logind[759]: New session 77 of user zuul.
Feb 20 10:05:40 np0005625204.localdomain systemd[1]: Started Session 77 of User zuul.
Feb 20 10:05:40 np0005625204.localdomain sshd[325161]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:40 np0005625204.localdomain sudo[325165]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Feb 20 10:05:40 np0005625204.localdomain sudo[325165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:40 np0005625204.localdomain sudo[325165]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:40 np0005625204.localdomain sshd[325164]: Received disconnect from 38.102.83.114 port 34048:11: disconnected by user
Feb 20 10:05:40 np0005625204.localdomain sshd[325164]: Disconnected from user zuul 38.102.83.114 port 34048
Feb 20 10:05:40 np0005625204.localdomain sshd[325161]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:40 np0005625204.localdomain systemd[1]: session-77.scope: Deactivated successfully.
Feb 20 10:05:40 np0005625204.localdomain systemd-logind[759]: Session 77 logged out. Waiting for processes to exit.
Feb 20 10:05:40 np0005625204.localdomain systemd-logind[759]: Removed session 77.
Feb 20 10:05:40 np0005625204.localdomain sshd[325183]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:40.652 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:40 np0005625204.localdomain sshd[325183]: Accepted publickey for zuul from 38.102.83.114 port 34064 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:40 np0005625204.localdomain systemd-logind[759]: New session 78 of user zuul.
Feb 20 10:05:40 np0005625204.localdomain systemd[1]: Started Session 78 of User zuul.
Feb 20 10:05:40 np0005625204.localdomain sshd[325183]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:40 np0005625204.localdomain sudo[325187]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Feb 20 10:05:40 np0005625204.localdomain sudo[325187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:40 np0005625204.localdomain sudo[325187]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:40 np0005625204.localdomain sshd[325186]: Received disconnect from 38.102.83.114 port 34064:11: disconnected by user
Feb 20 10:05:40 np0005625204.localdomain sshd[325186]: Disconnected from user zuul 38.102.83.114 port 34064
Feb 20 10:05:40 np0005625204.localdomain sshd[325183]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:40 np0005625204.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Feb 20 10:05:40 np0005625204.localdomain systemd-logind[759]: Session 78 logged out. Waiting for processes to exit.
Feb 20 10:05:40 np0005625204.localdomain systemd-logind[759]: Removed session 78.
Feb 20 10:05:41 np0005625204.localdomain sshd[325205]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:41 np0005625204.localdomain sshd[325205]: Accepted publickey for zuul from 38.102.83.114 port 34066 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:41 np0005625204.localdomain systemd-logind[759]: New session 79 of user zuul.
Feb 20 10:05:41 np0005625204.localdomain systemd[1]: Started Session 79 of User zuul.
Feb 20 10:05:41 np0005625204.localdomain sshd[325205]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:41 np0005625204.localdomain ceph-mon[301857]: pgmap v708: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:41 np0005625204.localdomain sudo[325209]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Feb 20 10:05:41 np0005625204.localdomain sudo[325209]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:41 np0005625204.localdomain sudo[325209]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:41 np0005625204.localdomain sshd[325208]: Received disconnect from 38.102.83.114 port 34066:11: disconnected by user
Feb 20 10:05:41 np0005625204.localdomain sshd[325208]: Disconnected from user zuul 38.102.83.114 port 34066
Feb 20 10:05:41 np0005625204.localdomain sshd[325205]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:41 np0005625204.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Feb 20 10:05:41 np0005625204.localdomain systemd-logind[759]: Session 79 logged out. Waiting for processes to exit.
Feb 20 10:05:41 np0005625204.localdomain systemd-logind[759]: Removed session 79.
Feb 20 10:05:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:41 np0005625204.localdomain sshd[325227]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:41 np0005625204.localdomain sshd[325227]: Accepted publickey for zuul from 38.102.83.114 port 34072 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:41 np0005625204.localdomain systemd-logind[759]: New session 80 of user zuul.
Feb 20 10:05:41 np0005625204.localdomain systemd[1]: Started Session 80 of User zuul.
Feb 20 10:05:41 np0005625204.localdomain sshd[325227]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:41 np0005625204.localdomain sudo[325231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Feb 20 10:05:41 np0005625204.localdomain sudo[325231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:41 np0005625204.localdomain sudo[325231]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:41 np0005625204.localdomain sshd[325230]: Received disconnect from 38.102.83.114 port 34072:11: disconnected by user
Feb 20 10:05:41 np0005625204.localdomain sshd[325230]: Disconnected from user zuul 38.102.83.114 port 34072
Feb 20 10:05:41 np0005625204.localdomain sshd[325227]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:41 np0005625204.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Feb 20 10:05:41 np0005625204.localdomain systemd-logind[759]: Session 80 logged out. Waiting for processes to exit.
Feb 20 10:05:41 np0005625204.localdomain systemd-logind[759]: Removed session 80.
Feb 20 10:05:42 np0005625204.localdomain sshd[325249]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:42 np0005625204.localdomain sshd[325249]: Accepted publickey for zuul from 38.102.83.114 port 34088 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:42 np0005625204.localdomain systemd-logind[759]: New session 81 of user zuul.
Feb 20 10:05:42 np0005625204.localdomain systemd[1]: Started Session 81 of User zuul.
Feb 20 10:05:42 np0005625204.localdomain sshd[325249]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:42 np0005625204.localdomain sudo[325253]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Feb 20 10:05:42 np0005625204.localdomain sudo[325253]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:42 np0005625204.localdomain sudo[325253]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:42 np0005625204.localdomain sshd[325252]: Received disconnect from 38.102.83.114 port 34088:11: disconnected by user
Feb 20 10:05:42 np0005625204.localdomain sshd[325252]: Disconnected from user zuul 38.102.83.114 port 34088
Feb 20 10:05:42 np0005625204.localdomain sshd[325249]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:42 np0005625204.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Feb 20 10:05:42 np0005625204.localdomain systemd-logind[759]: Session 81 logged out. Waiting for processes to exit.
Feb 20 10:05:42 np0005625204.localdomain systemd-logind[759]: Removed session 81.
Feb 20 10:05:42 np0005625204.localdomain sshd[325271]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:42 np0005625204.localdomain sshd[325271]: Accepted publickey for zuul from 38.102.83.114 port 34090 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:42 np0005625204.localdomain systemd-logind[759]: New session 82 of user zuul.
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: Started Session 82 of User zuul.
Feb 20 10:05:43 np0005625204.localdomain sshd[325271]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:43 np0005625204.localdomain sudo[325275]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Feb 20 10:05:43 np0005625204.localdomain sudo[325275]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:43 np0005625204.localdomain sudo[325275]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:43 np0005625204.localdomain sshd[325274]: Received disconnect from 38.102.83.114 port 34090:11: disconnected by user
Feb 20 10:05:43 np0005625204.localdomain sshd[325274]: Disconnected from user zuul 38.102.83.114 port 34090
Feb 20 10:05:43 np0005625204.localdomain sshd[325271]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Feb 20 10:05:43 np0005625204.localdomain systemd-logind[759]: Session 82 logged out. Waiting for processes to exit.
Feb 20 10:05:43 np0005625204.localdomain systemd-logind[759]: Removed session 82.
Feb 20 10:05:43 np0005625204.localdomain ceph-mon[301857]: pgmap v709: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:43 np0005625204.localdomain sshd[325293]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:05:43 np0005625204.localdomain sshd[325293]: Accepted publickey for zuul from 38.102.83.114 port 34096 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:05:43 np0005625204.localdomain systemd-logind[759]: New session 83 of user zuul.
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: Started Session 83 of User zuul.
Feb 20 10:05:43 np0005625204.localdomain sshd[325293]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:05:43 np0005625204.localdomain podman[325296]: 2026-02-20 10:05:43.64245406 +0000 UTC m=+0.088059648 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 20 10:05:43 np0005625204.localdomain podman[325295]: 2026-02-20 10:05:43.707790163 +0000 UTC m=+0.153566117 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1770267347, build-date=2026-02-05T04:57:10Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, container_name=openstack_network_exporter, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-02-05T04:57:10Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter)
Feb 20 10:05:43 np0005625204.localdomain sudo[325328]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Feb 20 10:05:43 np0005625204.localdomain sudo[325328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:05:43 np0005625204.localdomain podman[325296]: 2026-02-20 10:05:43.731752105 +0000 UTC m=+0.177357633 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 20 10:05:43 np0005625204.localdomain sudo[325328]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:43 np0005625204.localdomain sshd[325317]: Received disconnect from 38.102.83.114 port 34096:11: disconnected by user
Feb 20 10:05:43 np0005625204.localdomain sshd[325317]: Disconnected from user zuul 38.102.83.114 port 34096
Feb 20 10:05:43 np0005625204.localdomain sshd[325293]: pam_unix(sshd:session): session closed for user zuul
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Feb 20 10:05:43 np0005625204.localdomain podman[325295]: 2026-02-20 10:05:43.747466583 +0000 UTC m=+0.193242517 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, io.buildah.version=1.33.7, managed_by=edpm_ansible, version=9.7, architecture=x86_64, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_id=openstack_network_exporter, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-02-05T04:57:10Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 20 10:05:43 np0005625204.localdomain systemd-logind[759]: Session 83 logged out. Waiting for processes to exit.
Feb 20 10:05:43 np0005625204.localdomain systemd-logind[759]: Removed session 83.
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:05:43 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:05:45 np0005625204.localdomain ceph-mon[301857]: pgmap v710: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:45.655 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4996-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:45.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:45.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:45.657 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:45.692 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:45.693 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:47 np0005625204.localdomain ceph-mon[301857]: pgmap v711: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:05:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:05:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:05:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:05:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:05:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18352 "" "Go-http-client/1.1"
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: pgmap v712: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.455814) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581949455866, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 802, "num_deletes": 251, "total_data_size": 824370, "memory_usage": 840008, "flush_reason": "Manual Compaction"}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581949466704, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 537771, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34075, "largest_seqno": 34872, "table_properties": {"data_size": 534261, "index_size": 1365, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8376, "raw_average_key_size": 19, "raw_value_size": 527142, "raw_average_value_size": 1258, "num_data_blocks": 61, "num_entries": 419, "num_filter_entries": 419, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771581898, "oldest_key_time": 1771581898, "file_creation_time": 1771581949, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 10950 microseconds, and 2839 cpu microseconds.
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.466762) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 537771 bytes OK
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.466791) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.471446) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.471469) EVENT_LOG_v1 {"time_micros": 1771581949471462, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.471491) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 820161, prev total WAL file size 820161, number of live WAL files 2.
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.472319) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(525KB)], [54(19MB)]
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581949472370, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 20948061, "oldest_snapshot_seqno": -1}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 14231 keys, 19536779 bytes, temperature: kUnknown
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581949563100, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 19536779, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19454496, "index_size": 45638, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35589, "raw_key_size": 383135, "raw_average_key_size": 26, "raw_value_size": 19211690, "raw_average_value_size": 1349, "num_data_blocks": 1689, "num_entries": 14231, "num_filter_entries": 14231, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1771580799, "oldest_key_time": 0, "file_creation_time": 1771581949, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "316f2b4e-6103-43ad-8119-3359f94ef991", "db_session_id": "OMQD63SADIG5WJVO9ZZI", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.563448) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 19536779 bytes
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.567340) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 230.7 rd, 215.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 19.5 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(75.3) write-amplify(36.3) OK, records in: 14748, records dropped: 517 output_compression: NoCompression
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.567371) EVENT_LOG_v1 {"time_micros": 1771581949567358, "job": 32, "event": "compaction_finished", "compaction_time_micros": 90813, "compaction_time_cpu_micros": 54728, "output_level": 6, "num_output_files": 1, "total_output_size": 19536779, "num_input_records": 14748, "num_output_records": 14231, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581949567600, "job": 32, "event": "table_file_deletion", "file_number": 56}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005625204/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: EVENT_LOG_v1 {"time_micros": 1771581949571591, "job": 32, "event": "table_file_deletion", "file_number": 54}
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.472196) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.571703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.571713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.571717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.571720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:05:49 np0005625204.localdomain ceph-mon[301857]: rocksdb: (Original Log Time 2026/02/20-10:05:49.571724) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 20 10:05:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:50.694 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:50.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:50.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:50.696 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:50.727 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:50 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:50.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:51 np0005625204.localdomain ceph-mon[301857]: pgmap v713: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:05:52 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:05:52 np0005625204.localdomain podman[325359]: 2026-02-20 10:05:52.145418274 +0000 UTC m=+0.079970440 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 20 10:05:52 np0005625204.localdomain podman[325360]: 2026-02-20 10:05:52.205993043 +0000 UTC m=+0.134965819 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 20 10:05:52 np0005625204.localdomain podman[325360]: 2026-02-20 10:05:52.215152172 +0000 UTC m=+0.144124948 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 20 10:05:52 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:05:52 np0005625204.localdomain podman[325359]: 2026-02-20 10:05:52.233553334 +0000 UTC m=+0.168105510 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 20 10:05:52 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:05:52 np0005625204.localdomain sshd[325400]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:05:52 np0005625204.localdomain sshd[325400]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:05:53 np0005625204.localdomain ceph-mon[301857]: pgmap v714: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:54 np0005625204.localdomain sudo[325402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:05:54 np0005625204.localdomain sudo[325402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:05:54 np0005625204.localdomain sudo[325402]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:54 np0005625204.localdomain sudo[325420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:05:54 np0005625204.localdomain sudo[325420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:05:55 np0005625204.localdomain ceph-mon[301857]: pgmap v715: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:55 np0005625204.localdomain sudo[325420]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:55.728 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:05:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:55.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:55.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:05:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:55.731 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:55.732 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:05:55 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:55.735 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:05:55 np0005625204.localdomain sudo[325470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:05:55 np0005625204.localdomain sudo[325470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:05:55 np0005625204.localdomain sudo[325470]: pam_unix(sudo:session): session closed for user root
Feb 20 10:05:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:05:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:05:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:05:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:05:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:05:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:05:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:05:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:05:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:05:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:05:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:05:57 np0005625204.localdomain ceph-mon[301857]: pgmap v716: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:59 np0005625204.localdomain ceph-mon[301857]: pgmap v717: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:05:59 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:05:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:59.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:05:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:05:59.722 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 20 10:06:00 np0005625204.localdomain ceph-mon[301857]: pgmap v718: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:00 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:00.734 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:00 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:06:00 np0005625204.localdomain systemd[1]: tmp-crun.xozSkW.mount: Deactivated successfully.
Feb 20 10:06:00 np0005625204.localdomain podman[325488]: 2026-02-20 10:06:00.882725089 +0000 UTC m=+0.097066933 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 20 10:06:00 np0005625204.localdomain podman[325488]: 2026-02-20 10:06:00.923188255 +0000 UTC m=+0.137530159 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Feb 20 10:06:00 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:06:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:02 np0005625204.localdomain ceph-mon[301857]: pgmap v719: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2438891474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:06:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2438891474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:06:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:02.985 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:05.738 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:05 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:05.767 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:05 np0005625204.localdomain ceph-mon[301857]: pgmap v720: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:06:06.028 162652 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:06:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:06:06.029 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:06:06 np0005625204.localdomain ovn_metadata_agent[162647]: 2026-02-20 10:06:06.030 162652 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:06:06 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:06 np0005625204.localdomain ceph-mon[301857]: pgmap v721: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:06 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3751615180' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:07 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:06:07 np0005625204.localdomain podman[325505]: 2026-02-20 10:06:07.148986311 +0000 UTC m=+0.081530199 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:06:07 np0005625204.localdomain podman[325505]: 2026-02-20 10:06:07.161016358 +0000 UTC m=+0.093560286 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 20 10:06:07 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:06:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:07.750 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:07 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:07.750 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:07 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3492603920' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:08 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:08.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:09 np0005625204.localdomain ceph-mon[301857]: pgmap v722: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:09.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:09.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:06:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:09.741 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:06:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:09.742 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:06:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:09.742 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Auditing locally available compute resources for np0005625204.localdomain (node: np0005625204.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 20 10:06:09 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:09.743 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:06:10 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:06:10 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2553218350' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.192 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.255 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.256 281292 DEBUG nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] skipping disk for instance-00000002 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Feb 20 10:06:10 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2553218350' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.442 281292 WARNING nova.virt.libvirt.driver [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.445 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Hypervisor/Node resource view: name=np0005625204.localdomain free_ram=11192MB free_disk=41.836978912353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.446 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.446 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.525 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Instance f9924957-6cff-426e-9f03-c739820f4ff3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 2, 'MEMORY_MB': 512, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.526 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.526 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Final resource view: name=np0005625204.localdomain phys_ram=15738MB used_ram=1024MB phys_disk=41GB used_disk=2GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.740 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.743 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.743 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.743 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.787 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:10 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:10.788 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:11.150 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 20 10:06:11 np0005625204.localdomain ceph-mon[301857]: pgmap v723: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 20 10:06:11 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/599090075' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:11.589 281292 DEBUG oslo_concurrency.processutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 20 10:06:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:11.594 281292 DEBUG nova.compute.provider_tree [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed in ProviderTree for provider: 41976f9f-3656-482f-8ad0-c81e454a3952 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 20 10:06:11 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:11.737 281292 DEBUG nova.scheduler.client.report [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Inventory has not changed for provider 41976f9f-3656-482f-8ad0-c81e454a3952 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 20 10:06:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:11.739 281292 DEBUG nova.compute.resource_tracker [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Compute_service record updated for np0005625204.localdomain:np0005625204.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 20 10:06:11 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:11.740 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.294s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:06:12 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/599090075' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.741 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.741 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.742 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.742 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.808 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.809 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquired lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.809 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Feb 20 10:06:12 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:12.810 281292 DEBUG nova.objects.instance [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f9924957-6cff-426e-9f03-c739820f4ff3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.211 281292 DEBUG nova.network.neutron [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updating instance_info_cache with network_info: [{"id": "e7aa8e2a-27a6-452b-906c-21cea166b882", "address": "fa:16:3e:b0:ed:d2", "network": {"id": "de929a91-c460-4398-96e0-15a80685a485", "bridge": "br-int", "label": "private", "subnets": [{"cidr": "192.168.0.0/24", "dns": [], "gateway": {"address": "192.168.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.0.140", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.20", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.0.3"}}], "meta": {"injected": false, "tenant_id": "91bce661d685472eb3e7cacab17bf52a", "mtu": 1292, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape7aa8e2a-27", "ovs_interfaceid": "e7aa8e2a-27a6-452b-906c-21cea166b882", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.229 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Releasing lock "refresh_cache-f9924957-6cff-426e-9f03-c739820f4ff3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.230 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] [instance: f9924957-6cff-426e-9f03-c739820f4ff3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.231 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:13 np0005625204.localdomain ceph-mon[301857]: pgmap v724: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:13 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2627620429' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 20 10:06:13 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:13.745 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 20 10:06:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:06:14 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:06:14 np0005625204.localdomain systemd[1]: tmp-crun.9isltC.mount: Deactivated successfully.
Feb 20 10:06:14 np0005625204.localdomain podman[325573]: 2026-02-20 10:06:14.153053093 +0000 UTC m=+0.083638063 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, build-date=2026-02-05T04:57:10Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1770267347, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, managed_by=edpm_ansible, version=9.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=openstack_network_exporter)
Feb 20 10:06:14 np0005625204.localdomain podman[325573]: 2026-02-20 10:06:14.165123021 +0000 UTC m=+0.095707991 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-02-05T04:57:10Z, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1770267347, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-02-05T04:57:10Z, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 20 10:06:14 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:06:14 np0005625204.localdomain podman[325574]: 2026-02-20 10:06:14.212391603 +0000 UTC m=+0.139243879 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 20 10:06:14 np0005625204.localdomain podman[325574]: 2026-02-20 10:06:14.251120125 +0000 UTC m=+0.177972381 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 10:06:14 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:06:14 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1462323746' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 20 10:06:14 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:14.741 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:15 np0005625204.localdomain ceph-mon[301857]: pgmap v725: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.721 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.721 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.789 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.791 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.792 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.834 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:15 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:15.835 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:16 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:17 np0005625204.localdomain sshd[325615]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:06:17 np0005625204.localdomain ceph-mon[301857]: pgmap v726: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:17 np0005625204.localdomain sshd[325615]: Invalid user sol from 45.148.10.240 port 35794
Feb 20 10:06:17 np0005625204.localdomain sshd[325615]: Connection closed by invalid user sol 45.148.10.240 port 35794 [preauth]
Feb 20 10:06:17 np0005625204.localdomain podman[241968]: time="2026-02-20T10:06:17Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:06:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:06:17 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:06:17 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:06:17 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18352 "" "Go-http-client/1.1"
Feb 20 10:06:19 np0005625204.localdomain ceph-mon[301857]: pgmap v727: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:20.836 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:20.841 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:20.841 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5006 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:20.841 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:20.863 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:20 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:20.865 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:21 np0005625204.localdomain ceph-mon[301857]: pgmap v728: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:21 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:21 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:21.722 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:06:23 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:06:23 np0005625204.localdomain podman[325617]: 2026-02-20 10:06:23.159568141 +0000 UTC m=+0.096894787 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 20 10:06:23 np0005625204.localdomain systemd[1]: tmp-crun.exVZ13.mount: Deactivated successfully.
Feb 20 10:06:23 np0005625204.localdomain podman[325617]: 2026-02-20 10:06:23.238374676 +0000 UTC m=+0.175701372 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:06:23 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:06:23 np0005625204.localdomain podman[325618]: 2026-02-20 10:06:23.240388267 +0000 UTC m=+0.174152794 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 20 10:06:23 np0005625204.localdomain podman[325618]: 2026-02-20 10:06:23.321741869 +0000 UTC m=+0.255506396 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 20 10:06:23 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:06:23 np0005625204.localdomain ceph-mon[301857]: pgmap v729: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:25 np0005625204.localdomain ceph-mon[301857]: pgmap v730: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:25 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:25.866 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:06:26 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:06:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:06:26 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:06:26 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:06:26 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:06:26 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:26 np0005625204.localdomain ceph-mon[301857]: pgmap v731: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:29 np0005625204.localdomain ceph-mon[301857]: pgmap v732: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:30.903 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:30.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:30.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5035 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:30.905 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:30.906 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:30 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:30.906 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:31 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:06:31 np0005625204.localdomain podman[325660]: 2026-02-20 10:06:31.148202743 +0000 UTC m=+0.083730616 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 20 10:06:31 np0005625204.localdomain podman[325660]: 2026-02-20 10:06:31.184398058 +0000 UTC m=+0.119925881 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 20 10:06:31 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:06:31 np0005625204.localdomain ceph-mon[301857]: pgmap v733: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:31 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:33 np0005625204.localdomain ceph-mon[301857]: pgmap v734: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:35 np0005625204.localdomain ceph-mon[301857]: pgmap v735: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:35.907 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:35.908 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:35.908 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:35.908 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:35.932 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:35 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:35.933 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:36 np0005625204.localdomain sshd[325679]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:06:36 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:36 np0005625204.localdomain sshd[325679]: Accepted publickey for zuul from 192.168.122.10 port 46190 ssh2: RSA SHA256:uaxCfdsB9dLyL4znjsLCPDTT6O+6xZzjHRjczJFOA6k
Feb 20 10:06:36 np0005625204.localdomain systemd-logind[759]: New session 84 of user zuul.
Feb 20 10:06:36 np0005625204.localdomain systemd[1]: Started Session 84 of User zuul.
Feb 20 10:06:36 np0005625204.localdomain sshd[325679]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 20 10:06:36 np0005625204.localdomain sudo[325683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Feb 20 10:06:36 np0005625204.localdomain sudo[325683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 20 10:06:37 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.
Feb 20 10:06:37 np0005625204.localdomain podman[325710]: 2026-02-20 10:06:37.306050685 +0000 UTC m=+0.085145258 container health_status 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 20 10:06:37 np0005625204.localdomain podman[325710]: 2026-02-20 10:06:37.344311423 +0000 UTC m=+0.123405946 container exec_died 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 20 10:06:37 np0005625204.localdomain ceph-mon[301857]: pgmap v736: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:37 np0005625204.localdomain sshd[325735]: main: sshd: ssh-rsa algorithm is disabled
Feb 20 10:06:37 np0005625204.localdomain systemd[1]: 010f5ab1ce0480bd3e8db137b28d4e02d7b3d2de0979c704d45e2356a2f476cb.service: Deactivated successfully.
Feb 20 10:06:37 np0005625204.localdomain sshd[325735]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 20 10:06:39 np0005625204.localdomain ceph-mon[301857]: pgmap v737: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 10:06:39 np0005625204.localdomain ceph-mon[301857]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 4646 writes, 35K keys, 4646 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 4646 writes, 4646 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2440 writes, 13K keys, 2440 commit groups, 1.0 writes per commit group, ingest: 17.69 MB, 0.03 MB/s
                                                           Interval WAL: 2440 writes, 2440 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    170.4      0.23              0.10        16    0.014       0      0       0.0       0.0
                                                             L6      1/0   18.63 MB   0.0      0.3     0.0      0.2       0.3      0.0       0.0   6.9    228.2    212.2      1.26              0.72        15    0.084    203K   7716       0.0       0.0
                                                            Sum      1/0   18.63 MB   0.0      0.3     0.0      0.2       0.3      0.1       0.0   7.9    193.2    205.8      1.48              0.82        31    0.048    203K   7716       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0  12.9    206.0    210.1      0.86              0.48        18    0.048    128K   4836       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.2       0.3      0.0       0.0   0.0    228.2    212.2      1.26              0.72        15    0.084    203K   7716       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    171.9      0.23              0.10        15    0.015       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.9      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.038, interval 0.014
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.30 GB write, 0.25 MB/s write, 0.28 GB read, 0.24 MB/s read, 1.5 seconds
                                                           Interval compaction: 0.18 GB write, 0.30 MB/s write, 0.17 GB read, 0.29 MB/s read, 0.9 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x559a1eac51f0#2 capacity: 304.00 MB usage: 22.11 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000263 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1344,20.77 MB,6.83375%) FilterBlock(31,605.05 KB,0.194364%) IndexBlock(31,763.42 KB,0.24524%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 20 10:06:40 np0005625204.localdomain ceph-mon[301857]: from='client.58939 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:40 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "status"} v 0)
Feb 20 10:06:40 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2111544624' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 20 10:06:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:40.934 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:40.936 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:40.936 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:40.936 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:40.986 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:40 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:40.987 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.49767 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.98600 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: pgmap v738: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.58945 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.49773 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.98606 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3194755779' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/391173645' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2111544624' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 20 10:06:41 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:43 np0005625204.localdomain ovs-vsctl[325957]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 20 10:06:43 np0005625204.localdomain ceph-mon[301857]: pgmap v739: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:44 np0005625204.localdomain virtqemud[206495]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 20 10:06:44 np0005625204.localdomain virtqemud[206495]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 20 10:06:44 np0005625204.localdomain virtqemud[206495]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 20 10:06:44 np0005625204.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 326109 (lsinitrd)
Feb 20 10:06:44 np0005625204.localdomain systemd[1]: Mounting EFI System Partition Automount...
Feb 20 10:06:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.
Feb 20 10:06:44 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.
Feb 20 10:06:44 np0005625204.localdomain systemd[1]: Mounted EFI System Partition Automount.
Feb 20 10:06:44 np0005625204.localdomain ceph-mon[301857]: pgmap v740: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:44 np0005625204.localdomain ceph-mon[301857]: from='client.58963 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:44 np0005625204.localdomain podman[326115]: 2026-02-20 10:06:44.704334535 +0000 UTC m=+0.108248104 container health_status 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.component=ubi9-minimal-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, version=9.7, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1770267347, org.opencontainers.image.created=2026-02-05T04:57:10Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-02-05T04:57:10Z, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c)
Feb 20 10:06:44 np0005625204.localdomain podman[326116]: 2026-02-20 10:06:44.769313948 +0000 UTC m=+0.174855676 container health_status f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 10:06:44 np0005625204.localdomain podman[326115]: 2026-02-20 10:06:44.791394551 +0000 UTC m=+0.195308110 container exec_died 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c, name=openstack_network_exporter, managed_by=edpm_ansible, io.buildah.version=1.33.7, release=1770267347, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:156f95f51d0a91422548c574e96ee37f07a200c948e173b22523982f24f1e79c', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=21849199b7179dc3074812b8e24698ec609d6a5c, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.created=2026-02-05T04:57:10Z, distribution-scope=public, build-date=2026-02-05T04:57:10Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=21849199b7179dc3074812b8e24698ec609d6a5c, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 20 10:06:44 np0005625204.localdomain podman[326116]: 2026-02-20 10:06:44.803922953 +0000 UTC m=+0.209464691 container exec_died f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl', '--path.rootfs=/rootfs'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/:/rootfs:ro', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 20 10:06:44 np0005625204.localdomain systemd[1]: 7bd1902a3d6b0fbd8c333b8a0d8fb0cd03128f85bfb72ed019b009d1495077da.service: Deactivated successfully.
Feb 20 10:06:44 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: cache status {prefix=cache status} (starting...)
Feb 20 10:06:44 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:44 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: client ls {prefix=client ls} (starting...)
Feb 20 10:06:44 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:45 np0005625204.localdomain systemd[1]: f8fb2f5d8bad6453e0d369fb0fe26814df7c66717dc1bd19db6cee09d4c60079.service: Deactivated successfully.
Feb 20 10:06:45 np0005625204.localdomain lvm[326263]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 20 10:06:45 np0005625204.localdomain lvm[326263]: VG ceph_vg1 finished
Feb 20 10:06:45 np0005625204.localdomain lvm[326267]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 20 10:06:45 np0005625204.localdomain lvm[326267]: VG ceph_vg0 finished
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: damage ls {prefix=damage ls} (starting...)
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.49785 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.58969 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.49791 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.98618 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/198603473' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1730588979' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: dump loads {prefix=dump loads} (starting...)
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb 20 10:06:45 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:45.988 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:45.990 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:45.990 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:45 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:45.991 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:46.015 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:46 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:46.016 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "report"} v 0)
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2593553930' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1502005148' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.98627 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.58990 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/252892574' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: pgmap v741: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2484470455' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.49812 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/639021198' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/932176662' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2593553930' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2366589297' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.98651 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/26432637' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3139561945' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2483575500' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1502005148' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Feb 20 10:06:46 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3334651353' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: ops {prefix=ops} (starting...)
Feb 20 10:06:46 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config log"} v 0)
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2447655434' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4212865981' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3458121244' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: session ls {prefix=session ls} (starting...)
Feb 20 10:06:47 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl Can't run that command on an inactive MDS!
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2884655271' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2372081725' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3334651353' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/224606860' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.59044 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2447655434' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.49845 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2724243597' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.59050 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4212865981' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3226058210' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.49857 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3458121244' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 20 10:06:47 np0005625204.localdomain podman[241968]: time="2026-02-20T10:06:47Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 20 10:06:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:06:47 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155257 "" "Go-http-client/1.1"
Feb 20 10:06:47 np0005625204.localdomain podman[241968]: @ - - [20/Feb/2026:10:06:47 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18360 "" "Go-http-client/1.1"
Feb 20 10:06:47 np0005625204.localdomain ceph-mds[284061]: mds.mds.np0005625204.wnsphl asok_command: status {prefix=status} (starting...)
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 20 10:06:47 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2314617770' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2564032777' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "features"} v 0)
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2273315391' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2868000341' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1605299969' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1261710750' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: pgmap v742: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.98705 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2314617770' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3018402630' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/269488103' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.98711 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1510216666' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/93294090' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1012312030' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2564032777' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2032604306' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1688900044' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2273315391' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 20 10:06:48 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1261218979' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/419782457' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3888246860' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1296629889' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.59095 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.49893 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1261218979' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3830416334' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.59113 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2544276023' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/419782457' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3888246860' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2343932821' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.49911 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.59122 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.98762 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1296629889' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2029848996' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2992728241' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb 20 10:06:49 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3859343773' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/576497619' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.49932 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: pgmap v743: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.59137 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3859343773' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.98777 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.49938 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2780626791' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/474225806' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.59152 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.98783 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/576497619' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.49950 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1742345117' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4092547990' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 20 10:06:50 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3386122060' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:03.177595+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:04.177789+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:05.177979+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:06.178127+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:07.178343+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:08.178540+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:09.178747+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:10.179004+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:11.179220+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:12.179382+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:13.179600+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 11 from mon.np0005625201 (according to old e11)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 11
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:44:43.337910+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:14.179763+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:15.179962+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:16.180155+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:17.180304+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:18.180499+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:19.180678+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:20.180797+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:21.180951+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:22.181097+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:23.181293+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:24.181486+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:25.181614+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:26.181770+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:27.181932+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:28.182072+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:29.182240+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:30.182419+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:31.182549+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:32.182739+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:33.182942+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:34.183121+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:35.183279+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:36.183436+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:37.183600+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:38.183705+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:39.183860+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:40.184046+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:41.184171+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:42.184466+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:43.184691+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 944884 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:44.185330+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 heartbeat osd_stat(store_statfs(0x1b8fbf000/0x0/0x1bfc00000, data 0x2a4e4e1/0x2ace000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:45.185520+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88375296 unmapped: 696320 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:46.185947+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 88 handle_osd_map epochs [88,89], i have 88, src has [1,89]
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 82.219421387s of 82.275581360s, submitted: 12
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 31
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/1027089384
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect No active mgr available yet
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 89 ms_handle_reset con 0x55cbaae93c00 session 0x55cbaae99680
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7f92000
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88465408 unmapped: 606208 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:47.186084+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 32
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: get_auth_request con 0x55cbaae91000 auth_method 0
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88653824 unmapped: 417792 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:48.186261+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 33
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88653824 unmapped: 417792 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:49.186481+0000)
Feb 20 10:06:50 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbb000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88801280 unmapped: 270336 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 34
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:50.186747+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88793088 unmapped: 278528 heap: 89071616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 35
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:51.186878+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:52.187014+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88137728 unmapped: 1982464 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 36
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:53.187185+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:54.187438+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:55.187768+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:56.188083+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:57.188305+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:58.188513+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:59.189113+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:00.189273+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:01.189402+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:02.189597+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:03.189892+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:04.190065+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:05.190250+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:06.190397+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:07.190551+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:08.190766+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:09.190959+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 12 from mon.np0005625201 (according to old e12)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 12
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:39.346453+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:10.191149+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:11.191296+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:12.191415+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:13.191696+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:14.191840+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:15.191989+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:16.192360+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 13 from mon.np0005625201 (according to old e13)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 13
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:46.327222+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: mon.np0005625201 at [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] went away
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _reopen_session rank -1
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _add_conns ranks=[0,1,2]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625204 con 0x55cbabca3800 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625202 con 0x55cbaae03000 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625203 con 0x55cbabcc8400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _finish_auth 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request con 0x55cbabca3800 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth method 2
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request con 0x55cbaae03000 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth method 2
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_done global_id 24241 payload 293
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_hunting 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: found mon.np0005625204
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_auth 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:16.344041+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: ms_handle_reset current mon [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _reopen_session rank -1
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _add_conns ranks=[0,2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625204 con 0x55cbabca1c00 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625203 con 0x55cbaae8e400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625202 con 0x55cbaa279400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 ms_handle_reset con 0x55cbabca3800 session 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request con 0x55cbabca1c00 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth method 2
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request con 0x55cbaa279400 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth method 2
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_done global_id 24241 payload 293
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_hunting 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: found mon.np0005625202
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_auth 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:16.352577+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 13 from mon.np0005625202 (according to old e13)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 13
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:46.327222+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_config config(7 keys)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: set_mon_vals no callback set
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 36
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:17.192502+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:18.192603+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:19.193171+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:20.193468+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:21.194135+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:22.194360+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:23.467320+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:24.467436+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:25.467731+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:26.467875+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:27.468129+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 14 from mon.np0005625202 (according to old e14)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 14
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:57.556107+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
                                                          3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:28.468266+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:29.468719+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:30.468819+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:31.469088+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:32.469227+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:33.469405+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:34.469562+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:35.469699+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:36.469854+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:37.470122+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 15 from mon.np0005625202 (according to old e15)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 15
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:08.177805+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:38.470249+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:39.470451+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:40.470767+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:41.470989+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:42.471216+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:43.471497+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:44.471718+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:45.471852+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:46.472004+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:47.472142+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:48.472324+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:49.472731+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:50.472861+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:51.473098+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:52.473297+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:53.473466+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88350720 unmapped: 1769472 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 16 from mon.np0005625202 (according to old e16)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 16
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:24.360760+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: ms_handle_reset current mon [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _reopen_session rank -1
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _add_conns ranks=[0,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625202 con 0x55cbaae94400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): picked mon.np0005625203 con 0x55cbaae8e400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): start opening mon connection
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 ms_handle_reset con 0x55cbaa279400 session 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request con 0x55cbaae8e400 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth method 2
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request con 0x55cbaae94400 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth method 2
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient(hunting): handle_auth_done global_id 24241 payload 293
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_hunting 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: found mon.np0005625202
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_auth 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:54.382017+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 16 from mon.np0005625202 (according to old e16)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 16
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:24.360760+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_config config(7 keys)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: set_mon_vals no callback set
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 36
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:54.473592+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:55.473731+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:56.473990+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:57.474197+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:58.474336+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:59.474565+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:00.474748+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:01.474904+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:02.475026+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:03.475226+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:04.475419+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:05.475615+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:06.475768+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:07.475974+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:08.476097+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:09.476268+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:10.476488+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:11.476716+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:12.477030+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:13.477230+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:14.477416+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:15.477582+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:16.477776+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient:  got monmap 17 from mon.np0005625202 (according to old e17)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: dump:
                                                          epoch 17
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:46.606881+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625204
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:17.478108+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:18.478247+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:19.478402+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:20.478594+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:21.478748+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88358912 unmapped: 1761280 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 37
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:22.478938+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:23.479156+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 946976 data_alloc: 285212672 data_used: 3522560
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:24.479297+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:25.479470+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:26.479673+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 heartbeat osd_stat(store_statfs(0x1b8fbd000/0x0/0x1bfc00000, data 0x2a505f5/0x2ad1000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:27.479850+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88162304 unmapped: 1957888 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 38
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2084071713
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect No active mgr available yet
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 ms_handle_reset con 0x55cba7f92000 session 0x55cbaae99e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca0800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 101.827842712s of 101.886848450s, submitted: 12
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:28.479967+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88334336 unmapped: 1785856 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 39
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: get_auth_request con 0x55cbaa499800 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:29.480129+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88317952 unmapped: 1802240 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 handle_osd_map epochs [90,90], i have 90, src has [1,90]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:30.480272+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:31.480415+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:32.480600+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:33.480820+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:34.480952+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 41
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:35.481087+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:36.481219+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:37.481361+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:38.481502+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:39.481678+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:40.481874+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:41.482022+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:42.482130+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:43.482311+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 1867776 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:44.482602+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:45.482968+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:46.483156+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:47.483331+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:48.483502+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:49.483737+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:50.483922+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:51.484076+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88104960 unmapped: 2015232 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 42
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:52.484283+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:53.484517+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:54.484747+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:55.484901+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:56.485076+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:57.485216+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:58.485305+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:59.485405+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:00.485664+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:01.485927+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:02.486105+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:03.486352+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:04.486504+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:05.486681+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:06.486860+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:07.487008+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:08.487152+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:09.487338+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:10.487550+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:11.487736+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:12.487947+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:13.488176+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:14.488363+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:15.488528+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:16.488697+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:17.488886+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:18.489256+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:19.489490+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:20.489731+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:21.489968+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:22.490186+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:23.490698+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:24.490824+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:25.491024+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:26.491276+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:27.491565+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:28.491960+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:29.492203+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:30.492556+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:31.492793+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:32.493059+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:51.017 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4995-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:51.019 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:51.019 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:51.019 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:33.493293+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:34.493514+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:35.493766+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:36.494020+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:37.494278+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:38.494496+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:39.494762+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:40.494975+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:41.495132+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:42.495304+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:43.495482+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:44.495665+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:45.495829+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:46.496045+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:47.496272+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:48.496503+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:49.496702+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 951148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:50.496885+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:51.497074+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 heartbeat osd_stat(store_statfs(0x1b8fb9000/0x0/0x1bfc00000, data 0x2a527b5/0x2ad4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:52.497234+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88096768 unmapped: 2023424 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 84.782577515s of 84.841644287s, submitted: 14
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 43
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect Terminating session with v2:172.18.0.108:6810/689946273
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect No active mgr available yet
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 ms_handle_reset con 0x55cbabca0800 session 0x55cba9050960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbab53f400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:53.497408+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88195072 unmapped: 1925120 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 44
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: get_auth_request con 0x55cbabca2000 auth_method 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:54.497569+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 954148 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb5000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 1826816 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:55.497757+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 1826816 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:56.497910+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 1826816 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:57.498080+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 1826816 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:58.498235+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 45
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:59.498398+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:00.498606+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:01.498827+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:02.499069+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:03.499324+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:04.499520+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:05.499717+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:06.499869+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:07.500048+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:08.500197+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:09.500376+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:10.500584+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:11.500781+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:12.500971+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:13.501183+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:14.501366+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:15.501565+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:16.501735+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:17.501866+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:18.502018+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:19.502239+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:20.502392+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:21.502589+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:22.502829+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:23.503055+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:24.503296+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:25.503501+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:26.503690+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:27.503855+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:28.504028+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:29.504161+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:30.504303+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:31.504479+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:32.504675+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:33.504870+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:34.505067+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:35.505222+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:36.505421+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:37.505573+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:38.505695+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:39.505870+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:40.506012+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:41.506166+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:42.506340+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:43.506526+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:44.506693+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:45.506864+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:46.507045+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:47.507199+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:48.507338+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:49.507494+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:50.507711+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:51.507879+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:52.508054+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:53.508254+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:54.508482+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:55.676238+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:56.676569+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:57.676702+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:58.676858+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:59.677031+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:00.682693+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:01.682947+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:02.683513+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:03.683684+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:04.683822+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:05.683998+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:06.684169+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:07.684359+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:08.684491+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:09.684621+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:10.684778+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:11.684924+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:12.685235+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:13.685416+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:14.685552+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:15.685699+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:16.685855+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:17.685993+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:18.686111+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:19.686358+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:20.686486+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5997 writes, 25K keys, 5997 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5997 writes, 930 syncs, 6.45 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 177 writes, 364 keys, 177 commit groups, 1.0 writes per commit group, ingest: 0.34 MB, 0.00 MB/s
                                                          Interval WAL: 177 writes, 85 syncs, 2.08 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:21.686620+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:22.686795+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:23.686964+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 46
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:24.687092+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 953268 data_alloc: 285212672 data_used: 3530752
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:25.687218+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:26.687366+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:27.688178+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b8fb6000/0x0/0x1bfc00000, data 0x2a54b23/0x2ad8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 87875584 unmapped: 2244608 heap: 90120192 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa8c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 94.648574829s of 94.729278564s, submitted: 20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:28.688318+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 heartbeat osd_stat(store_statfs(0x1b87b5000/0x0/0x1bfc00000, data 0x3254b33/0x32d9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 89014272 unmapped: 16842752 heap: 105857024 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:29.688471+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 89014272 unmapped: 16842752 heap: 105857024 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 92 ms_handle_reset con 0x55cbaafa8c00 session 0x55cbab5fde00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 92 heartbeat osd_stat(store_statfs(0x1b7b45000/0x0/0x1bfc00000, data 0x3ec4b33/0x3f49000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1103091 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:30.688614+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:31.688740+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88236032 unmapped: 26017792 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 92 ms_handle_reset con 0x55cbaae91400 session 0x55cbab562000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:32.688913+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88236032 unmapped: 26017792 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:33.689125+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88236032 unmapped: 26017792 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:34.689321+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88236032 unmapped: 26017792 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:35.689491+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88244224 unmapped: 26009600 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:36.689701+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88244224 unmapped: 26009600 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:37.689874+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88244224 unmapped: 26009600 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:38.690014+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88244224 unmapped: 26009600 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:39.690170+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88244224 unmapped: 26009600 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:40.690338+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 26001408 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:41.690508+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 26001408 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:42.690681+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 26001408 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:43.690899+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88252416 unmapped: 26001408 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:44.691129+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88268800 unmapped: 25985024 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:45.691323+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88268800 unmapped: 25985024 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:46.691475+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88268800 unmapped: 25985024 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:47.691691+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88268800 unmapped: 25985024 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:48.691848+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:49.691992+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:50.692139+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:51.692265+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:52.692423+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:53.692668+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:54.692841+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88276992 unmapped: 25976832 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:55.693002+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:56.693185+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:57.693333+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:58.693528+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:59.693698+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:00.693867+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:01.694014+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:02.694136+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88285184 unmapped: 25968640 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:03.694304+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 25960448 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:04.694545+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 25960448 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:05.694773+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 25960448 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:06.694954+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88293376 unmapped: 25960448 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:07.695097+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:08.695293+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:09.695460+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:10.695619+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:11.695817+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:12.696283+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:13.696696+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:14.696848+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:15.697009+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:16.697148+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:17.697307+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88301568 unmapped: 25952256 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:18.697595+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:19.697718+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:20.697901+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:21.698116+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:22.698308+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:23.698556+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:24.698752+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248225 data_alloc: 285212672 data_used: 3543040
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:25.698940+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:26.699077+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88309760 unmapped: 25944064 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:27.699262+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88317952 unmapped: 25935872 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b66ca000/0x0/0x1bfc00000, data 0x5339236/0x53c3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:28.699429+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 60.608001709s of 60.817741394s, submitted: 32
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 88317952 unmapped: 25935872 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8f400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 ms_handle_reset con 0x55cbabc8f400 session 0x55cbab5621e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:29.699602+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae97000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 94126080 unmapped: 20127744 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1327689 data_alloc: 301989888 data_used: 8196096
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 ms_handle_reset con 0x55cbaae97000 session 0x55cbab5623c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:30.699842+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 ms_handle_reset con 0x55cbaae91400 session 0x55cbab5625a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 93290496 unmapped: 20963328 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:31.700056+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 93290496 unmapped: 20963328 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b5fd8000/0x0/0x1bfc00000, data 0x5a2a2a8/0x5ab6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa8c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 ms_handle_reset con 0x55cbaafa8c00 session 0x55cbab562f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:32.700232+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8f400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 93700096 unmapped: 20553728 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca0800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:33.700430+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 95387648 unmapped: 18866176 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:34.700584+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96878592 unmapped: 17375232 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1362813 data_alloc: 301989888 data_used: 12001280
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:35.700746+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b5fb3000/0x0/0x1bfc00000, data 0x5a4e2b8/0x5adb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96878592 unmapped: 17375232 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:36.700918+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96878592 unmapped: 17375232 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:37.701097+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96878592 unmapped: 17375232 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:38.701289+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96878592 unmapped: 17375232 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:39.701481+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96878592 unmapped: 17375232 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1362813 data_alloc: 301989888 data_used: 12001280
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b5fb3000/0x0/0x1bfc00000, data 0x5a4e2b8/0x5adb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:40.701622+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96886784 unmapped: 17367040 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 heartbeat osd_stat(store_statfs(0x1b5fb3000/0x0/0x1bfc00000, data 0x5a4e2b8/0x5adb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:41.701792+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 96870400 unmapped: 17383424 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:42.701936+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 13.893432617s of 14.137339592s, submitted: 61
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 101793792 unmapped: 12460032 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:43.702102+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 107331584 unmapped: 6922240 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8ec00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:44.702226+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 104988672 unmapped: 9265152 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1490216 data_alloc: 301989888 data_used: 12922880
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:45.702380+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 94 ms_handle_reset con 0x55cbaae8ec00 session 0x55cbab8ac960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 104988672 unmapped: 9265152 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:46.702539+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 104988672 unmapped: 9265152 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 94 ms_handle_reset con 0x55cbabca0800 session 0x55cba9051c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 94 ms_handle_reset con 0x55cbabc8f400 session 0x55cbab5630e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 94 heartbeat osd_stat(store_statfs(0x1b5176000/0x0/0x1bfc00000, data 0x6883676/0x6915000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:47.702671+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8ec00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 105037824 unmapped: 9216000 heap: 114253824 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 95 ms_handle_reset con 0x55cbaae8ec00 session 0x55cba83a6b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:48.702815+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 95 heartbeat osd_stat(store_statfs(0x1b5174000/0x0/0x1bfc00000, data 0x68859de/0x6919000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 113328128 unmapped: 5128192 heap: 118456320 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:49.702971+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 95 ms_handle_reset con 0x55cbaae91400 session 0x55cbab9883c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca0c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 115990528 unmapped: 9789440 heap: 125779968 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1666685 data_alloc: 301989888 data_used: 18071552
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:50.703102+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 96 ms_handle_reset con 0x55cbaae95400 session 0x55cbab66ab40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 96 heartbeat osd_stat(store_statfs(0x1b402e000/0x0/0x1bfc00000, data 0x79cc9de/0x7a60000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 96 ms_handle_reset con 0x55cbabca0c00 session 0x55cbab8ad0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 116088832 unmapped: 9691136 heap: 125779968 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:51.703298+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbabca1800 session 0x55cbab5fcd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbabc8d400 session 0x55cbaae8d680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8ec00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae8ec00 session 0x55cbaada7c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 116121600 unmapped: 9658368 heap: 125779968 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae91400 session 0x55cbaa02d2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:52.703427+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.790157318s of 10.005820274s, submitted: 365
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110116864 unmapped: 15663104 heap: 125779968 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae95400 session 0x55cbaa268960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:53.703600+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca0c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbabca0c00 session 0x55cbab99ef00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 106635264 unmapped: 19144704 heap: 125779968 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae93400 session 0x55cbab5c0f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbab5c0d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae06000 session 0x55cbab5c0b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:54.703749+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae04c00 session 0x55cbab5c0780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 106151936 unmapped: 23306240 heap: 129458176 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae07400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1545743 data_alloc: 301989888 data_used: 8220672
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 ms_handle_reset con 0x55cbaae07400 session 0x55cbaa55dc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:55.704691+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 97 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 98 ms_handle_reset con 0x55cbaae04c00 session 0x55cba83a6780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 98 heartbeat osd_stat(store_statfs(0x1b496a000/0x0/0x1bfc00000, data 0x708aac2/0x7123000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 107479040 unmapped: 21979136 heap: 129458176 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 98 ms_handle_reset con 0x55cbaae06000 session 0x55cba83a6b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:56.704844+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 106004480 unmapped: 27656192 heap: 133660672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:57.704978+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae93400 session 0x55cbaa02b4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbab5fcb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabcc8400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbabcc8400 session 0x55cbab5fcd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae04c00 session 0x55cbab9883c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 106045440 unmapped: 27615232 heap: 133660672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae93400 session 0x55cbab99fe00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae06000 session 0x55cbaa357e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabcc8400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbaa5aad20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8ec00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae8ec00 session 0x55cbaadf74a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae04c00 session 0x55cbaaded2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:58.705108+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 ms_handle_reset con 0x55cbaae06000 session 0x55cbaafa2960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 ms_handle_reset con 0x55cbabc8dc00 session 0x55cba83a74a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125247488 unmapped: 15040512 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 ms_handle_reset con 0x55cbaae93400 session 0x55cbab5630e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:59.705248+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 ms_handle_reset con 0x55cbaae02c00 session 0x55cbab66be00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 ms_handle_reset con 0x55cbabcc8400 session 0x55cbaadf7e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 ms_handle_reset con 0x55cbaae04c00 session 0x55cbab66a1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 ms_handle_reset con 0x55cbaae06000 session 0x55cbaae98780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 108322816 unmapped: 31965184 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1918332 data_alloc: 301989888 data_used: 9404416
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:00.705398+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 101 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 101 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 101 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbab5fde00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 108584960 unmapped: 31703040 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 101 heartbeat osd_stat(store_statfs(0x1b1e77000/0x0/0x1bfc00000, data 0x9b746a6/0x9c16000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae90800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:01.705550+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 108699648 unmapped: 31588352 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:02.705727+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.182841301s of 10.030245781s, submitted: 455
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 102 ms_handle_reset con 0x55cbaae03c00 session 0x55cbab670960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 108044288 unmapped: 32243712 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:03.705905+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 108003328 unmapped: 32284672 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:04.706009+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110510080 unmapped: 29777920 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 102 heartbeat osd_stat(store_statfs(0x1b2d47000/0x0/0x1bfc00000, data 0x8a53638/0x8af5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1833330 data_alloc: 301989888 data_used: 13303808
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:05.706163+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110526464 unmapped: 29761536 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:06.706316+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110526464 unmapped: 29761536 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:07.706494+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110551040 unmapped: 29736960 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 heartbeat osd_stat(store_statfs(0x1b2f94000/0x0/0x1bfc00000, data 0x8a55886/0x8af9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:08.706663+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110551040 unmapped: 29736960 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:09.706810+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110551040 unmapped: 29736960 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1835260 data_alloc: 301989888 data_used: 13307904
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:10.706944+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 heartbeat osd_stat(store_statfs(0x1b2f94000/0x0/0x1bfc00000, data 0x8a55886/0x8af9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 110551040 unmapped: 29736960 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:11.707113+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae90000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 117956608 unmapped: 22331392 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:12.707260+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.543693542s of 10.011410713s, submitted: 179
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cbaa3ef800 session 0x55cbab5c0f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fc800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 heartbeat osd_stat(store_statfs(0x1b21af000/0x0/0x1bfc00000, data 0x983b886/0x98df000, compress 0x0/0x0/0x0, omap 0x647, meta 0x416f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 116056064 unmapped: 24231936 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cbaae03c00 session 0x55cbab99f2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cbaae93400 session 0x55cba920dc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:13.707426+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:14.753593+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 121880576 unmapped: 18407424 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 heartbeat osd_stat(store_statfs(0x1b0a84000/0x0/0x1bfc00000, data 0xab66886/0xac0a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 heartbeat osd_stat(store_statfs(0x1b0a84000/0x0/0x1bfc00000, data 0xab66886/0xac0a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2106786 data_alloc: 301989888 data_used: 14823424
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:15.753703+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120340480 unmapped: 19947520 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:16.753833+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120700928 unmapped: 19587072 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:17.754001+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120709120 unmapped: 19578880 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cbabc8e400 session 0x55cbaa8723c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cbaae90800 session 0x55cbab670b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:18.754260+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120717312 unmapped: 19570688 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cbaa3ef800 session 0x55cbab989860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 ms_handle_reset con 0x55cba91fc800 session 0x55cbab99e1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:19.754402+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120758272 unmapped: 19529728 heap: 140288000 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 ms_handle_reset con 0x55cbaae03c00 session 0x55cbaa5a0b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 heartbeat osd_stat(store_statfs(0x1b09b3000/0x0/0x1bfc00000, data 0xac37886/0xacdb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 ms_handle_reset con 0x55cbaae04000 session 0x55cba83a63c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbab53f000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2048514 data_alloc: 301989888 data_used: 23539712
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:20.754597+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 137601024 unmapped: 11001856 heap: 148602880 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 ms_handle_reset con 0x55cbaae00000 session 0x55cbaa55dc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 ms_handle_reset con 0x55cbaae90000 session 0x55cbab5c1a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 ms_handle_reset con 0x55cbab53f000 session 0x55cbaadfe3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fc800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 ms_handle_reset con 0x55cbaae02800 session 0x55cbab989c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:21.754745+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136667136 unmapped: 14032896 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 105 ms_handle_reset con 0x55cba91fc800 session 0x55cbaa264b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 105 ms_handle_reset con 0x55cbaa3ef800 session 0x55cba92410e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:22.754945+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136830976 unmapped: 13869056 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.785494804s of 10.234784126s, submitted: 446
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:23.755140+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136896512 unmapped: 13803520 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:24.755316+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136904704 unmapped: 13795328 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 handle_osd_map epochs [105,106], i have 106, src has [1,106]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 ms_handle_reset con 0x55cbaafa9c00 session 0x55cbab5fde00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafab000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 ms_handle_reset con 0x55cbaafab000 session 0x55cbaac3a960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 ms_handle_reset con 0x55cbabc8e800 session 0x55cbab66dc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1740793 data_alloc: 301989888 data_used: 8290304
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:25.755550+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120152064 unmapped: 30547968 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 heartbeat osd_stat(store_statfs(0x1b3346000/0x0/0x1bfc00000, data 0x7baa2a6/0x7c4d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:26.755707+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120152064 unmapped: 30547968 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 heartbeat osd_stat(store_statfs(0x1b3346000/0x0/0x1bfc00000, data 0x7baa2a6/0x7c4d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 106 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.11] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:27.755887+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.17] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[2.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120152064 unmapped: 30547968 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b3341000/0x0/0x1bfc00000, data 0x7bac4f4/0x7c51000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:28.756030+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 120152064 unmapped: 30547968 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:29.756194+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123977728 unmapped: 26722304 heap: 150700032 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1980112 data_alloc: 301989888 data_used: 11501568
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:30.756361+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 ms_handle_reset con 0x55cbabca2400 session 0x55cbaadfcd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125026304 unmapped: 40845312 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b216c000/0x0/0x1bfc00000, data 0x947b566/0x9522000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b216c000/0x0/0x1bfc00000, data 0x947b566/0x9522000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:31.756585+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125042688 unmapped: 40828928 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 heartbeat osd_stat(store_statfs(0x1b2092000/0x0/0x1bfc00000, data 0x9555566/0x95fc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafadc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 ms_handle_reset con 0x55cbaafadc00 session 0x55cba9051860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:32.756694+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125198336 unmapped: 40673280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 ms_handle_reset con 0x55cbaafa9c00 session 0x55cba9d89680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafab000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.563639641s of 10.383434296s, submitted: 198
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 ms_handle_reset con 0x55cbabc8e800 session 0x55cba821b0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b2068000/0x0/0x1bfc00000, data 0x957b932/0x9625000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:33.756842+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125255680 unmapped: 40615936 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:34.756975+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 ms_handle_reset con 0x55cbabca2400 session 0x55cba821a000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 121749504 unmapped: 44122112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 ms_handle_reset con 0x55cbaae04c00 session 0x55cbaa001e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:35.757171+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1820405 data_alloc: 301989888 data_used: 15532032
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123379712 unmapped: 42491904 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 heartbeat osd_stat(store_statfs(0x1b3b27000/0x0/0x1bfc00000, data 0x7ab88e0/0x7b62000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:36.757380+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125255680 unmapped: 40615936 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:37.757565+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 127746048 unmapped: 38125568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 ms_handle_reset con 0x55cbaafab000 session 0x55cbaadfe000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:38.757740+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 127598592 unmapped: 38273024 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 ms_handle_reset con 0x55cbaae04c00 session 0x55cbaa356f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:39.757926+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123904000 unmapped: 41967616 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:40.758091+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1609410 data_alloc: 301989888 data_used: 12931072
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123904000 unmapped: 41967616 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b51ea000/0x0/0x1bfc00000, data 0x60edaac/0x6196000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:41.758298+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123904000 unmapped: 41967616 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:42.758430+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123904000 unmapped: 41967616 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:43.758601+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123904000 unmapped: 41967616 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:44.758787+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 123904000 unmapped: 41967616 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:45.758939+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1609570 data_alloc: 301989888 data_used: 12935168
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.672436714s of 12.144534111s, submitted: 154
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126369792 unmapped: 39501824 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b51ea000/0x0/0x1bfc00000, data 0x60edaac/0x6196000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [0,0,0,1,4,2])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:46.759101+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130031616 unmapped: 35840000 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:47.759240+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130031616 unmapped: 35840000 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:48.759397+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130539520 unmapped: 35332096 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:49.759586+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130539520 unmapped: 35332096 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:50.759760+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1760842 data_alloc: 301989888 data_used: 14696448
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130539520 unmapped: 35332096 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b43a8000/0x0/0x1bfc00000, data 0x723daac/0x72e6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:51.759928+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130539520 unmapped: 35332096 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:52.760137+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130539520 unmapped: 35332096 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b43a8000/0x0/0x1bfc00000, data 0x723daac/0x72e6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:53.760348+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130473984 unmapped: 35397632 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:54.760479+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130473984 unmapped: 35397632 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:55.760685+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1755786 data_alloc: 301989888 data_used: 14700544
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130473984 unmapped: 35397632 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:56.760826+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.564716339s of 11.115330696s, submitted: 168
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130473984 unmapped: 35397632 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 ms_handle_reset con 0x55cbaae93000 session 0x55cba9d88b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 ms_handle_reset con 0x55cbabca2c00 session 0x55cbab66cb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:57.761080+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 ms_handle_reset con 0x55cbaa51d400 session 0x55cbaa264b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:58.761266+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b6288000/0x0/0x1bfc00000, data 0x535ca3a/0x5403000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:59.761448+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:00.761613+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1480767 data_alloc: 301989888 data_used: 8335360
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:01.761811+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:02.761944+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:03.762165+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:04.762347+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b6288000/0x0/0x1bfc00000, data 0x535ca3a/0x5403000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:05.762484+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1480767 data_alloc: 301989888 data_used: 8335360
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:06.762612+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b6288000/0x0/0x1bfc00000, data 0x535ca3a/0x5403000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:07.762690+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:08.762819+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b6288000/0x0/0x1bfc00000, data 0x535ca3a/0x5403000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:09.762996+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:10.763218+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1480767 data_alloc: 301989888 data_used: 8335360
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:11.764746+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:12.764896+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:13.765071+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124862464 unmapped: 41009152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 heartbeat osd_stat(store_statfs(0x1b6288000/0x0/0x1bfc00000, data 0x535ca3a/0x5403000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 17.403413773s of 17.594388962s, submitted: 55
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:14.765226+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124887040 unmapped: 40984576 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 110 ms_handle_reset con 0x55cbaafa9400 session 0x55cba920c780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:15.765387+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1491730 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124903424 unmapped: 40968192 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:16.765578+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124895232 unmapped: 40976384 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 110 handle_osd_map epochs [110,111], i have 110, src has [1,111]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 111 ms_handle_reset con 0x55cbaa51d400 session 0x55cbaa1121e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:17.765749+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124952576 unmapped: 40919040 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:18.765921+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124952576 unmapped: 40919040 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6282000/0x0/0x1bfc00000, data 0x536115e/0x540b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:19.766121+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124952576 unmapped: 40919040 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6282000/0x0/0x1bfc00000, data 0x536115e/0x540b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:20.766296+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1490547 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124952576 unmapped: 40919040 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:21.766466+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124960768 unmapped: 40910848 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 111 heartbeat osd_stat(store_statfs(0x1b6282000/0x0/0x1bfc00000, data 0x536115e/0x540b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:22.766620+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124968960 unmapped: 40902656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:23.766829+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124968960 unmapped: 40902656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:24.766951+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124968960 unmapped: 40902656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:25.767147+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124968960 unmapped: 40902656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:26.767390+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124968960 unmapped: 40902656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:27.767569+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124968960 unmapped: 40902656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:28.767709+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:29.767944+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:30.768131+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:31.768318+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:32.768526+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:33.768740+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:34.768914+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124977152 unmapped: 40894464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:35.769106+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:36.769259+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:37.769411+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:38.769550+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:39.769740+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:40.769894+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:41.770109+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124985344 unmapped: 40886272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:42.770321+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124993536 unmapped: 40878080 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:43.770538+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124993536 unmapped: 40878080 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:44.770715+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 124993536 unmapped: 40878080 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:45.770929+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125001728 unmapped: 40869888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:46.771161+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125001728 unmapped: 40869888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:47.771816+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125001728 unmapped: 40869888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:48.772006+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125001728 unmapped: 40869888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:49.772295+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125001728 unmapped: 40869888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:50.772448+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125001728 unmapped: 40869888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:51.772661+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:52.772829+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:53.773009+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:54.773198+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:55.773388+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:56.773586+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:57.773753+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:58.773954+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125009920 unmapped: 40861696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:59.774130+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125018112 unmapped: 40853504 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:00.774303+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125018112 unmapped: 40853504 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:01.774582+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125018112 unmapped: 40853504 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:02.774740+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125018112 unmapped: 40853504 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:03.774982+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125026304 unmapped: 40845312 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:04.775192+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125026304 unmapped: 40845312 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:05.775378+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125026304 unmapped: 40845312 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:06.775568+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125026304 unmapped: 40845312 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:07.775727+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:08.776022+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:09.776196+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:10.776397+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:11.776578+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:12.776804+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:13.777061+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:14.777338+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125034496 unmapped: 40837120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:15.777537+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 heartbeat osd_stat(store_statfs(0x1b627f000/0x0/0x1bfc00000, data 0x53633ac/0x540f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492669 data_alloc: 301989888 data_used: 8347648
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125042688 unmapped: 40828928 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:16.777673+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 62.159889221s of 62.601642609s, submitted: 94
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125067264 unmapped: 40804352 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:17.777827+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125075456 unmapped: 40796160 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 113 handle_osd_map epochs [113,113], i have 113, src has [1,113]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 113 ms_handle_reset con 0x55cbaae04000 session 0x55cbaa02b4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:18.778083+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125075456 unmapped: 40796160 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:19.778324+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125075456 unmapped: 40796160 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:20.778501+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 113 heartbeat osd_stat(store_statfs(0x1b6276000/0x0/0x1bfc00000, data 0x5365b76/0x5418000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1507932 data_alloc: 301989888 data_used: 8359936
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133513216 unmapped: 32358400 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:21.778764+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133480448 unmapped: 32391168 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:22.778904+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133505024 unmapped: 32366592 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 115 handle_osd_map epochs [114,115], i have 115, src has [1,115]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 115 ms_handle_reset con 0x55cbaae06000 session 0x55cbaa02de00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:23.779105+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133537792 unmapped: 32333824 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 115 heartbeat osd_stat(store_statfs(0x1b1a6a000/0x0/0x1bfc00000, data 0x9b6a327/0x9c23000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:24.779296+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 115 ms_handle_reset con 0x55cbaae05c00 session 0x55cbaa265680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125190144 unmapped: 40681472 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 116 ms_handle_reset con 0x55cbabca2800 session 0x55cbaa02b680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:25.779440+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2024890 data_alloc: 301989888 data_used: 8372224
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125263872 unmapped: 40607744 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 117 ms_handle_reset con 0x55cbaae04000 session 0x55cba7f8fa40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:26.779557+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 117 ms_handle_reset con 0x55cbaa51d400 session 0x55cbaa5a10e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.344388008s of 10.079930305s, submitted: 168
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125296640 unmapped: 40574976 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:27.779706+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 handle_osd_map epochs [118,119], i have 119, src has [1,119]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 ms_handle_reset con 0x55cbaae05c00 session 0x55cba920da40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125329408 unmapped: 40542208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 handle_osd_map epochs [118,119], i have 119, src has [1,119]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:28.779931+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125353984 unmapped: 40517632 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 heartbeat osd_stat(store_statfs(0x1b625a000/0x0/0x1bfc00000, data 0x5372d1f/0x5430000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:29.780092+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125370368 unmapped: 40501248 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:30.985846+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:34.090057+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125370368 unmapped: 40501248 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1546440 data_alloc: 301989888 data_used: 8372224
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 heartbeat osd_stat(store_statfs(0x1b625e000/0x0/0x1bfc00000, data 0x5372d1f/0x5430000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 heartbeat osd_stat(store_statfs(0x1b625e000/0x0/0x1bfc00000, data 0x5372d1f/0x5430000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:35.090168+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125386752 unmapped: 40484864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:36.090264+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125386752 unmapped: 40484864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 heartbeat osd_stat(store_statfs(0x1b625d000/0x0/0x1bfc00000, data 0x5372d2f/0x5431000, compress 0x0/0x0/0x0, omap 0x647, meta 0x456f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 ms_handle_reset con 0x55cbaae06000 session 0x55cba920c000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:37.090378+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 125411328 unmapped: 40460288 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.681238174s of 10.042939186s, submitted: 110
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 120 ms_handle_reset con 0x55cbabca2800 session 0x55cba821b4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:38.090545+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126590976 unmapped: 39280640 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 121 ms_handle_reset con 0x55cbaa51d400 session 0x55cba83a4780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 121 ms_handle_reset con 0x55cbaae04000 session 0x55cba83a52c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:39.090663+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126664704 unmapped: 39206912 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1556941 data_alloc: 301989888 data_used: 8384512
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:40.090794+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126664704 unmapped: 39206912 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:41.090945+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126664704 unmapped: 39206912 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:42.091377+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126672896 unmapped: 39198720 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 122 heartbeat osd_stat(store_statfs(0x1b5e52000/0x0/0x1bfc00000, data 0x5379577/0x5438000, compress 0x0/0x0/0x0, omap 0x647, meta 0x496f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:43.091545+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126672896 unmapped: 39198720 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:44.091700+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126697472 unmapped: 39174144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1557545 data_alloc: 301989888 data_used: 8384512
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:45.091884+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126697472 unmapped: 39174144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:46.092026+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126697472 unmapped: 39174144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:47.092165+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126697472 unmapped: 39174144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 123 heartbeat osd_stat(store_statfs(0x1b5e51000/0x0/0x1bfc00000, data 0x537b7c5/0x543c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x496f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.976301193s of 10.407719612s, submitted: 157
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:48.092299+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126705664 unmapped: 39165952 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:49.092470+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126705664 unmapped: 39165952 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1561995 data_alloc: 301989888 data_used: 8384512
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbab53fc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 123 ms_handle_reset con 0x55cbab53fc00 session 0x55cba83a41e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:50.092618+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126722048 unmapped: 39149568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:51.092796+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126722048 unmapped: 39149568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 123 heartbeat osd_stat(store_statfs(0x1b5e52000/0x0/0x1bfc00000, data 0x537b7c5/0x543c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x496f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:52.092950+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126722048 unmapped: 39149568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:53.093095+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126722048 unmapped: 39149568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:54.093265+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126722048 unmapped: 39149568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559411 data_alloc: 301989888 data_used: 8384512
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fcc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:55.093456+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126713856 unmapped: 39157760 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 124 ms_handle_reset con 0x55cba91fcc00 session 0x55cba83a45a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:56.093592+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126730240 unmapped: 39141376 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 124 heartbeat osd_stat(store_statfs(0x1b5e48000/0x0/0x1bfc00000, data 0x537eaaa/0x5445000, compress 0x0/0x0/0x0, omap 0x647, meta 0x496f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 124 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 124 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 125 ms_handle_reset con 0x55cbaae93000 session 0x55cba83a5e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:51.059 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:51 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:51.060 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:57.093735+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126763008 unmapped: 39108608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fcc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.970922470s of 10.311238289s, submitted: 79
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 126 ms_handle_reset con 0x55cba91fcc00 session 0x55cba8bde960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:58.093879+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126877696 unmapped: 38993920 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:59.094024+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 126902272 unmapped: 38969344 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1595921 data_alloc: 301989888 data_used: 8409088
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 127 handle_osd_map epochs [126,127], i have 127, src has [1,127]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 127 handle_osd_map epochs [127,127], i have 127, src has [1,127]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 128 ms_handle_reset con 0x55cbaa51d400 session 0x55cba9fffc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:00.094160+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 128 ms_handle_reset con 0x55cba8485000 session 0x55cbaa02c1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 127016960 unmapped: 38854656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 128 handle_osd_map epochs [128,129], i have 128, src has [1,129]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 129 heartbeat osd_stat(store_statfs(0x1b5e2c000/0x0/0x1bfc00000, data 0x53899c0/0x5460000, compress 0x0/0x0/0x0, omap 0x647, meta 0x496f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae97c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 129 ms_handle_reset con 0x55cbaae97c00 session 0x55cba9dd23c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:01.094318+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 127033344 unmapped: 38838272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae92800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 129 handle_osd_map epochs [129,130], i have 129, src has [1,130]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 130 handle_osd_map epochs [130,130], i have 130, src has [1,130]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:02.094577+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 130 ms_handle_reset con 0x55cbaae92800 session 0x55cba9dd3e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 127098880 unmapped: 38772736 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 130 heartbeat osd_stat(store_statfs(0x1b4c7f000/0x0/0x1bfc00000, data 0x538ef78/0x546e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:03.094692+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 131 heartbeat osd_stat(store_statfs(0x1b4c7f000/0x0/0x1bfc00000, data 0x538ef78/0x546e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fcc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 127131648 unmapped: 38739968 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 131 ms_handle_reset con 0x55cba8485000 session 0x55cbab99ed20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:04.094834+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1631707 data_alloc: 301989888 data_used: 8421376
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128270336 unmapped: 37601280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 132 heartbeat osd_stat(store_statfs(0x1b4c7b000/0x0/0x1bfc00000, data 0x5391311/0x5471000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 132 ms_handle_reset con 0x55cbaa51d400 session 0x55cbab99f4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 132 ms_handle_reset con 0x55cba91fcc00 session 0x55cba7f8fe00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:05.095280+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 132 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128393216 unmapped: 37478400 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 133 handle_osd_map epochs [133,133], i have 133, src has [1,133]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae96400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:06.095426+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 133 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 134 ms_handle_reset con 0x55cbaa5ef000 session 0x55cbab99e960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128499712 unmapped: 37371904 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 134 ms_handle_reset con 0x55cbaae96400 session 0x55cbab672d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 134 heartbeat osd_stat(store_statfs(0x1b4c75000/0x0/0x1bfc00000, data 0x5395d50/0x5474000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae96400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:07.095567+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128598016 unmapped: 37273600 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7799800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 135 ms_handle_reset con 0x55cbaae96400 session 0x55cbab99f680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.463511467s of 10.003138542s, submitted: 463
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:08.095703+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 136 ms_handle_reset con 0x55cba7799800 session 0x55cbaa02cd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128647168 unmapped: 37224448 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:09.096088+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 137 ms_handle_reset con 0x55cba8485000 session 0x55cbab99f0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1633575 data_alloc: 301989888 data_used: 8450048
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128745472 unmapped: 37126144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 137 handle_osd_map epochs [135,137], i have 137, src has [1,137]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:10.096227+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128745472 unmapped: 37126144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:11.096375+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128745472 unmapped: 37126144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 137 heartbeat osd_stat(store_statfs(0x1b4c74000/0x0/0x1bfc00000, data 0x539aa39/0x5474000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:12.096568+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128745472 unmapped: 37126144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:13.096665+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128761856 unmapped: 37109760 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 138 ms_handle_reset con 0x55cbabca1c00 session 0x55cbab99e780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:14.096803+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 138 heartbeat osd_stat(store_statfs(0x1b4c74000/0x0/0x1bfc00000, data 0x539aa39/0x5474000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1637161 data_alloc: 301989888 data_used: 8454144
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128761856 unmapped: 37109760 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:15.097013+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128770048 unmapped: 37101568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cbaae05c00 session 0x55cbab99e000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:16.097167+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128811008 unmapped: 37060608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7799800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:17.097306+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cba8485000 session 0x55cbaa55cb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cba7799800 session 0x55cbaa5abe00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128827392 unmapped: 37044224 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae96400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cbaae96400 session 0x55cbaadf6960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:18.097454+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 heartbeat osd_stat(store_statfs(0x1b4c6e000/0x0/0x1bfc00000, data 0x539f145/0x5480000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128827392 unmapped: 37044224 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cbabca1c00 session 0x55cbaa55d680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.453866959s of 10.827260017s, submitted: 111
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cbaa279400 session 0x55cba920c1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:19.097565+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1645534 data_alloc: 301989888 data_used: 8466432
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128851968 unmapped: 37019648 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:20.097701+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128851968 unmapped: 37019648 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 ms_handle_reset con 0x55cbaa5ef800 session 0x55cba821a1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:21.097834+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7d70000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128860160 unmapped: 37011456 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 ms_handle_reset con 0x55cba7d70000 session 0x55cbaa0012c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:22.097984+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128933888 unmapped: 36937728 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae90800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 ms_handle_reset con 0x55cbaae90800 session 0x55cba7f8f680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 heartbeat osd_stat(store_statfs(0x1b4c6c000/0x0/0x1bfc00000, data 0x53a142d/0x5481000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets getting new tickets!
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:23.098222+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _finish_auth 0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:23.099844+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 ms_handle_reset con 0x55cbaae91000 session 0x55cba9051e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128966656 unmapped: 36904960 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:24.098356+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1644741 data_alloc: 301989888 data_used: 8478720
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 128966656 unmapped: 36904960 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 47
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:25.098519+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129212416 unmapped: 36659200 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbab53f800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 ms_handle_reset con 0x55cbab53f800 session 0x55cba9fff680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:26.098774+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129228800 unmapped: 36642816 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 heartbeat osd_stat(store_statfs(0x1b4c6d000/0x0/0x1bfc00000, data 0x53a142d/0x5481000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:27.098916+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129228800 unmapped: 36642816 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7d70000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 ms_handle_reset con 0x55cba7d70000 session 0x55cbaadf7a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:28.100745+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129261568 unmapped: 36610048 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 48
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:29.100881+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1657577 data_alloc: 301989888 data_used: 8491008
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:30.101035+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:31.101176+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 141 heartbeat osd_stat(store_statfs(0x1b4c65000/0x0/0x1bfc00000, data 0x53a37b7/0x5488000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:32.101340+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:33.101520+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:34.101706+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 15.072218895s of 15.370197296s, submitted: 98
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1656240 data_alloc: 301989888 data_used: 8491008
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:35.102153+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129294336 unmapped: 36577280 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:36.102332+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 143 handle_osd_map epochs [143,143], i have 143, src has [1,143]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129302528 unmapped: 36569088 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:37.102450+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 143 heartbeat osd_stat(store_statfs(0x1b4c61000/0x0/0x1bfc00000, data 0x53a5b1f/0x548c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129302528 unmapped: 36569088 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 143 heartbeat osd_stat(store_statfs(0x1b4c5c000/0x0/0x1bfc00000, data 0x53a7e87/0x5490000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:38.102567+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 129335296 unmapped: 36536320 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:39.102750+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 145 handle_osd_map epochs [145,145], i have 145, src has [1,145]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1670230 data_alloc: 301989888 data_used: 8515584
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130392064 unmapped: 35479552 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:40.103003+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130392064 unmapped: 35479552 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:41.103193+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130375680 unmapped: 35495936 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:42.103344+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130375680 unmapped: 35495936 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 145 heartbeat osd_stat(store_statfs(0x1b4c59000/0x0/0x1bfc00000, data 0x53ac52b/0x5495000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 145 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:43.103489+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130383872 unmapped: 35487744 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:44.103630+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1670304 data_alloc: 301989888 data_used: 8531968
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130383872 unmapped: 35487744 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fc800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.308926582s of 10.696312904s, submitted: 126
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 146 ms_handle_reset con 0x55cba91fc800 session 0x55cba821ad20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:45.103871+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 146 heartbeat osd_stat(store_statfs(0x1b4c54000/0x0/0x1bfc00000, data 0x53ae78a/0x549a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130408448 unmapped: 35463168 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:46.103997+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 146 heartbeat osd_stat(store_statfs(0x1b4c54000/0x0/0x1bfc00000, data 0x53ae78a/0x549a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 ms_handle_reset con 0x55cbabca1400 session 0x55cbaadf6000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130433024 unmapped: 35438592 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:47.104110+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130433024 unmapped: 35438592 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7799800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:48.104251+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 ms_handle_reset con 0x55cba7799800 session 0x55cbaa112f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130433024 unmapped: 35438592 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 heartbeat osd_stat(store_statfs(0x1b4c4e000/0x0/0x1bfc00000, data 0x53b0af2/0x549e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 147 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:49.105720+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1677720 data_alloc: 301989888 data_used: 8548352
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130441216 unmapped: 35430400 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 148 heartbeat osd_stat(store_statfs(0x1b4c4b000/0x0/0x1bfc00000, data 0x53b2eae/0x54a2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:50.105865+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130441216 unmapped: 35430400 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:51.106002+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130441216 unmapped: 35430400 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:52.106185+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130441216 unmapped: 35430400 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 148 handle_osd_map epochs [148,149], i have 148, src has [1,149]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:53.106321+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 heartbeat osd_stat(store_statfs(0x1b4c48000/0x0/0x1bfc00000, data 0x53b50eb/0x54a5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:54.106447+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1680722 data_alloc: 301989888 data_used: 8548352
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:55.106833+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:56.107064+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:57.107165+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:58.107582+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:59.107819+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 heartbeat osd_stat(store_statfs(0x1b4c48000/0x0/0x1bfc00000, data 0x53b50eb/0x54a5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1680722 data_alloc: 301989888 data_used: 8548352
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:00.108106+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:01.108984+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:02.109126+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:03.109287+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:04.109551+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 heartbeat osd_stat(store_statfs(0x1b4c48000/0x0/0x1bfc00000, data 0x53b50eb/0x54a5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1680722 data_alloc: 301989888 data_used: 8548352
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:05.109731+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130449408 unmapped: 35422208 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 20.863721848s of 20.963228226s, submitted: 46
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8bce400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 ms_handle_reset con 0x55cba8bce400 session 0x55cbaa2683c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:06.109907+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130473984 unmapped: 35397632 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7799800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7d70000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:07.110025+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 heartbeat osd_stat(store_statfs(0x1b4c46000/0x0/0x1bfc00000, data 0x53b5283/0x54a8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 ms_handle_reset con 0x55cba7d70000 session 0x55cbaac3b860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130506752 unmapped: 35364864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:08.110163+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130506752 unmapped: 35364864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:09.110314+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1683433 data_alloc: 301989888 data_used: 8548352
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130506752 unmapped: 35364864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:10.110463+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130506752 unmapped: 35364864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:11.110671+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130506752 unmapped: 35364864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:12.110806+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 heartbeat osd_stat(store_statfs(0x1b4c48000/0x0/0x1bfc00000, data 0x53b5322/0x54a6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130506752 unmapped: 35364864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:13.111213+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130514944 unmapped: 35356672 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:14.111447+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1692249 data_alloc: 301989888 data_used: 8560640
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130523136 unmapped: 35348480 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 151 handle_osd_map epochs [150,151], i have 151, src has [1,151]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:15.111664+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130531328 unmapped: 35340288 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:16.112061+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 151 handle_osd_map epochs [150,151], i have 151, src has [1,151]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.286362648s of 10.646632195s, submitted: 139
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130539520 unmapped: 35332096 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca3400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 151 ms_handle_reset con 0x55cbabca3400 session 0x55cbab6712c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:17.112311+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 151 heartbeat osd_stat(store_statfs(0x1b4c40000/0x0/0x1bfc00000, data 0x53b9b13/0x54ad000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130555904 unmapped: 35315712 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:18.112479+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130555904 unmapped: 35315712 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:19.112716+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1694561 data_alloc: 301989888 data_used: 8560640
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130555904 unmapped: 35315712 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:20.112857+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaab6d000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaab6d000 session 0x55cba83a7860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130572288 unmapped: 35299328 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:21.113043+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 130580480 unmapped: 35291136 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:22.113205+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131629056 unmapped: 34242560 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaa5ee800 session 0x55cbaa264d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafacc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:23.113360+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4c3b000/0x0/0x1bfc00000, data 0x53bbe5e/0x54b3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaafacc00 session 0x55cba9240b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 132153344 unmapped: 33718272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:24.113516+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1859227 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 132153344 unmapped: 33718272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafac400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:25.113678+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaafac400 session 0x55cbab8ad680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b42d6000/0x0/0x1bfc00000, data 0x53bbdfc/0x54b2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131989504 unmapped: 33882112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:26.113835+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131989504 unmapped: 33882112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.116103172s of 10.465703964s, submitted: 94
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:27.113960+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131989504 unmapped: 33882112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b42d5000/0x0/0x1bfc00000, data 0x53bbe97/0x54b3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:28.114288+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131989504 unmapped: 33882112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:29.114765+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1708721 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131989504 unmapped: 33882112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 49
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:30.114970+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4c3a000/0x0/0x1bfc00000, data 0x53bbf8d/0x54b4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131801088 unmapped: 34070528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:31.115231+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131751936 unmapped: 34119680 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa51d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:32.115415+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaae93000 session 0x55cbaa264960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131776512 unmapped: 34095104 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:33.115706+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131776512 unmapped: 34095104 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:34.115923+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4c36000/0x0/0x1bfc00000, data 0x53bc28e/0x54b8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaa5ef800 session 0x55cbaadf6d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1718619 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaa5ee800 session 0x55cbabcdab40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131801088 unmapped: 34070528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:35.116256+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131801088 unmapped: 34070528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4c37000/0x0/0x1bfc00000, data 0x53bc265/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5b0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:36.116465+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131801088 unmapped: 34070528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.838531494s of 10.041240692s, submitted: 46
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:37.116685+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:38.116884+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:39.117054+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1715240 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4839000/0x0/0x1bfc00000, data 0x53bc2f9/0x54b5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:40.117206+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:41.117360+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x53bc394/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:42.117612+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:43.117774+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:44.117931+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x53bc394/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1717008 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131817472 unmapped: 34054144 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:45.118103+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaa3ee800 session 0x55cbab99f2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131825664 unmapped: 34045952 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:46.121083+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4836000/0x0/0x1bfc00000, data 0x53bc407/0x54b8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131825664 unmapped: 34045952 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:47.121233+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.417752266s of 10.545260429s, submitted: 31
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaa5ee800 session 0x55cba821bc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 ms_handle_reset con 0x55cbaa5ef800 session 0x55cba920c3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131850240 unmapped: 34021376 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:48.121444+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131858432 unmapped: 34013184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:49.121662+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1720364 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131858432 unmapped: 34013184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:50.121822+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:51.122058+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x53bc45e/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:52.122268+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:53.122452+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:54.122599+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1721195 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:55.122804+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:56.122970+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:57.123133+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x53bc528/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131883008 unmapped: 33988608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:58.123294+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131891200 unmapped: 33980416 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:59.123462+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1722465 data_alloc: 301989888 data_used: 8564736
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131891200 unmapped: 33980416 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 12.356285095s of 12.577958107s, submitted: 53
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:00.123613+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x53bc528/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131891200 unmapped: 33980416 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4838000/0x0/0x1bfc00000, data 0x53bc528/0x54b6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:01.123759+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131891200 unmapped: 33980416 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:02.123884+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131891200 unmapped: 33980416 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 heartbeat osd_stat(store_statfs(0x1b4839000/0x0/0x1bfc00000, data 0x53bc557/0x54b5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:03.124274+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131907584 unmapped: 33964032 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:04.124460+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1730847 data_alloc: 301989888 data_used: 8577024
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131940352 unmapped: 33931264 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 154 handle_osd_map epochs [153,154], i have 154, src has [1,154]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:05.124626+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131948544 unmapped: 33923072 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca3400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 155 ms_handle_reset con 0x55cbabca3400 session 0x55cbab988780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:06.124775+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 131981312 unmapped: 33890304 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 156 handle_osd_map epochs [156,156], i have 156, src has [1,156]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:07.124923+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 156 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbaa5a05a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 132030464 unmapped: 33841152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:08.125064+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 156 heartbeat osd_stat(store_statfs(0x1b481f000/0x0/0x1bfc00000, data 0x53c5772/0x54cb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 132055040 unmapped: 33816576 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:09.125248+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 157 ms_handle_reset con 0x55cbaae95800 session 0x55cba83a4000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1747277 data_alloc: 301989888 data_used: 8605696
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133111808 unmapped: 32759808 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 157 heartbeat osd_stat(store_statfs(0x1b4823000/0x0/0x1bfc00000, data 0x53c79ee/0x54ca000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 157 ms_handle_reset con 0x55cbaa5ee800 session 0x55cba8bdeb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:10.125461+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.261688232s of 10.196164131s, submitted: 128
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133152768 unmapped: 32718848 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:11.125604+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133152768 unmapped: 32718848 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:12.125701+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133160960 unmapped: 32710656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:13.125853+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133160960 unmapped: 32710656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:14.126095+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1751643 data_alloc: 301989888 data_used: 8605696
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae90800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133160960 unmapped: 32710656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 159 ms_handle_reset con 0x55cbaae90800 session 0x55cba7f8e000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:15.126275+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 159 ms_handle_reset con 0x55cbaae91c00 session 0x55cba821b860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 159 heartbeat osd_stat(store_statfs(0x1b481c000/0x0/0x1bfc00000, data 0x53cc26f/0x54d2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133160960 unmapped: 32710656 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:16.126402+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 159 ms_handle_reset con 0x55cbabca1c00 session 0x55cbaa0df0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133169152 unmapped: 32702464 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:17.126532+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 159 heartbeat osd_stat(store_statfs(0x1b4819000/0x0/0x1bfc00000, data 0x53cc344/0x54d5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133193728 unmapped: 32677888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 heartbeat osd_stat(store_statfs(0x1b4815000/0x0/0x1bfc00000, data 0x53ce568/0x54d8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x5f0f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:18.126721+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca3c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 137584640 unmapped: 28286976 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 ms_handle_reset con 0x55cbabca3c00 session 0x55cbaa2690e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:19.126853+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1880621 data_alloc: 301989888 data_used: 8622080
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 ms_handle_reset con 0x55cbaa5ee800 session 0x55cbaa5ab680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133750784 unmapped: 32120832 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:20.126981+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 heartbeat osd_stat(store_statfs(0x1b42c3000/0x0/0x1bfc00000, data 0x690066c/0x6a0b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 133767168 unmapped: 32104448 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.094021797s of 10.697221756s, submitted: 216
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:21.127107+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 ms_handle_reset con 0x55cbabc8e000 session 0x55cbaafa30e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 heartbeat osd_stat(store_statfs(0x1b42c3000/0x0/0x1bfc00000, data 0x690066c/0x6a0b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 134840320 unmapped: 31031296 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae90000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:22.127234+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 ms_handle_reset con 0x55cbaae90000 session 0x55cbaa02dc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 ms_handle_reset con 0x55cbaa279000 session 0x55cba7f8ef00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 heartbeat osd_stat(store_statfs(0x1b4a99000/0x0/0x1bfc00000, data 0x5da8672/0x5eb3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 134864896 unmapped: 31006720 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:23.127394+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8c800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 ms_handle_reset con 0x55cbabc8c800 session 0x55cbaadf65a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 heartbeat osd_stat(store_statfs(0x1b4a92000/0x0/0x1bfc00000, data 0x5daaa9e/0x5eb9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 134864896 unmapped: 31006720 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:24.127538+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1850234 data_alloc: 301989888 data_used: 8630272
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 161 handle_osd_map epochs [161,162], i have 161, src has [1,162]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 135929856 unmapped: 29941760 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 162 ms_handle_reset con 0x55cbaa279000 session 0x55cbab5fc3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:25.127725+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 162 ms_handle_reset con 0x55cbaae91400 session 0x55cbaadf7c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 135954432 unmapped: 29917184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:26.127910+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca0800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 163 ms_handle_reset con 0x55cbabca0800 session 0x55cbaadf63c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 135987200 unmapped: 29884416 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 164 ms_handle_reset con 0x55cbaa5ef800 session 0x55cba91e70e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:27.128056+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 164 ms_handle_reset con 0x55cbabc8e800 session 0x55cba8bde3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136044544 unmapped: 29827072 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:28.128189+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 165 heartbeat osd_stat(store_statfs(0x1b4e0c000/0x0/0x1bfc00000, data 0x5db14ae/0x5ec0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 165 ms_handle_reset con 0x55cbaa279000 session 0x55cbaa5abc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 165 ms_handle_reset con 0x55cbaa5ef800 session 0x55cbab5625a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136077312 unmapped: 29794304 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:29.129899+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1864504 data_alloc: 301989888 data_used: 8646656
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136077312 unmapped: 29794304 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:30.130047+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 165 heartbeat osd_stat(store_statfs(0x1b4e09000/0x0/0x1bfc00000, data 0x5db38c0/0x5ec5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136077312 unmapped: 29794304 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:31.130218+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.231129646s of 10.838499069s, submitted: 179
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136069120 unmapped: 29802496 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:32.130414+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136069120 unmapped: 29802496 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:33.130704+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136093696 unmapped: 29777920 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:34.130971+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1878536 data_alloc: 301989888 data_used: 8658944
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136093696 unmapped: 29777920 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:35.131784+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 167 heartbeat osd_stat(store_statfs(0x1b4dfd000/0x0/0x1bfc00000, data 0x5db81bc/0x5ecf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 167 ms_handle_reset con 0x55cbabca2400 session 0x55cbaac3ab40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136118272 unmapped: 29753344 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:36.132396+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136126464 unmapped: 29745152 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:37.132620+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 168 ms_handle_reset con 0x55cbaae05800 session 0x55cba9d88780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136200192 unmapped: 29671424 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:38.132809+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 ms_handle_reset con 0x55cbaae95000 session 0x55cbaadf61e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 ms_handle_reset con 0x55cbaa279000 session 0x55cba8b57680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136265728 unmapped: 29605888 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:39.132983+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 ms_handle_reset con 0x55cbaa5ef800 session 0x55cba91e6d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 ms_handle_reset con 0x55cbaae05800 session 0x55cbaa02c000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1892644 data_alloc: 301989888 data_used: 8671232
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136298496 unmapped: 29573120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:40.133404+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 ms_handle_reset con 0x55cbabca2c00 session 0x55cbaadfc780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136298496 unmapped: 29573120 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:41.134217+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 heartbeat osd_stat(store_statfs(0x1b4df2000/0x0/0x1bfc00000, data 0x5dbcc85/0x5edb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.369969368s of 10.000178337s, submitted: 240
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136306688 unmapped: 29564928 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:42.134599+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 170 handle_osd_map epochs [170,170], i have 170, src has [1,170]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 170 ms_handle_reset con 0x55cbaae91c00 session 0x55cbaafa3c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136249344 unmapped: 29622272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:43.134816+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136257536 unmapped: 29614080 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:44.135003+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 171 ms_handle_reset con 0x55cbaa279000 session 0x55cbaa02cd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1917517 data_alloc: 301989888 data_used: 8683520
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136282112 unmapped: 29589504 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:45.135212+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 ms_handle_reset con 0x55cbaae02800 session 0x55cba83a65a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa8c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 ms_handle_reset con 0x55cbaafa8c00 session 0x55cba91e6d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136380416 unmapped: 29491200 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:46.135430+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 heartbeat osd_stat(store_statfs(0x1b4de0000/0x0/0x1bfc00000, data 0x5dc3bcc/0x5eec000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 172 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ee000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 173 ms_handle_reset con 0x55cbaa3ee000 session 0x55cbaa1121e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 173 ms_handle_reset con 0x55cbaae95400 session 0x55cbab5fc3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136249344 unmapped: 29622272 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:47.135715+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 174 handle_osd_map epochs [174,175], i have 174, src has [1,175]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136273920 unmapped: 29597696 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:48.135890+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 175 heartbeat osd_stat(store_statfs(0x1b4dd6000/0x0/0x1bfc00000, data 0x5dca5e3/0x5ef5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 175 ms_handle_reset con 0x55cbaa279000 session 0x55cba7f8f860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136347648 unmapped: 29523968 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:49.136221+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1932422 data_alloc: 301989888 data_used: 8695808
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8f800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136364032 unmapped: 29507584 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:50.136441+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 176 ms_handle_reset con 0x55cbaae8f800 session 0x55cbaadfe000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 176 heartbeat osd_stat(store_statfs(0x1b4dd1000/0x0/0x1bfc00000, data 0x5dcca40/0x5ef9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x4f2f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136364032 unmapped: 29507584 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:51.136732+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136413184 unmapped: 29458432 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:52.136990+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.938403130s of 10.957387924s, submitted: 333
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae97400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 136429568 unmapped: 29442048 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:53.137143+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 178 ms_handle_reset con 0x55cbaa51d400 session 0x55cbaa112780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 178 ms_handle_reset con 0x55cbaae97400 session 0x55cbaafa25a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141778944 unmapped: 24092672 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:54.137295+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 50
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2040648 data_alloc: 301989888 data_used: 8728576
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:55.137515+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141852672 unmapped: 24018944 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae94000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 179 heartbeat osd_stat(store_statfs(0x1b301a000/0x0/0x1bfc00000, data 0x69de5b2/0x6b12000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 179 ms_handle_reset con 0x55cbaae94000 session 0x55cbab562960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:56.137726+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141836288 unmapped: 24035328 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:57.137880+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141770752 unmapped: 24100864 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa279000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 181 handle_osd_map epochs [180,181], i have 181, src has [1,181]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 181 ms_handle_reset con 0x55cbaa279000 session 0x55cbaa2650e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:58.138088+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141836288 unmapped: 24035328 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:59.138244+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141836288 unmapped: 24035328 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1959112 data_alloc: 301989888 data_used: 8744960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:00.138404+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141860864 unmapped: 24010752 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:01.138611+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 183 heartbeat osd_stat(store_statfs(0x1b3c1a000/0x0/0x1bfc00000, data 0x5ddc3fa/0x5f13000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141860864 unmapped: 24010752 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafab000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 183 ms_handle_reset con 0x55cbaafab000 session 0x55cbaa5a0960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:02.138904+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141885440 unmapped: 23986176 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 184 handle_osd_map epochs [184,184], i have 184, src has [1,184]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.056692123s of 10.002618790s, submitted: 589
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 184 heartbeat osd_stat(store_statfs(0x1b3c1a000/0x0/0x1bfc00000, data 0x5ddc3d4/0x5f13000, compress 0x0/0x0/0x0, omap 0x647, meta 0x60cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 184 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 185 ms_handle_reset con 0x55cbabc8e000 session 0x55cbaa5a1a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:03.139080+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141950976 unmapped: 23920640 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8d400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:04.139232+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141950976 unmapped: 23920640 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 186 ms_handle_reset con 0x55cbabc8d400 session 0x55cba7f8f680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1974356 data_alloc: 301989888 data_used: 8777728
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:05.139406+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141950976 unmapped: 23920640 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae92400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 186 ms_handle_reset con 0x55cbaae92400 session 0x55cba7f8f2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:06.139553+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141967360 unmapped: 23904256 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:07.139714+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141967360 unmapped: 23904256 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:08.139866+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b3813000/0x0/0x1bfc00000, data 0x5de2e75/0x5f1b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b3813000/0x0/0x1bfc00000, data 0x5de2e75/0x5f1b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:09.140130+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973768 data_alloc: 301989888 data_used: 8785920
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:10.140280+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b380e000/0x0/0x1bfc00000, data 0x5de5103/0x5f1f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:11.140432+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:12.140606+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b380e000/0x0/0x1bfc00000, data 0x5de5103/0x5f1f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:13.140781+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:14.140926+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b380e000/0x0/0x1bfc00000, data 0x5de5103/0x5f1f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973768 data_alloc: 301989888 data_used: 8785920
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:15.141098+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:16.142126+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:17.142312+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:18.142431+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 heartbeat osd_stat(store_statfs(0x1b380e000/0x0/0x1bfc00000, data 0x5de5103/0x5f1f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:19.142569+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973768 data_alloc: 301989888 data_used: 8785920
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:20.142714+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:21.142866+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 18.383689880s of 18.720594406s, submitted: 125
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:22.143028+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141983744 unmapped: 23887872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 188 handle_osd_map epochs [188,188], i have 188, src has [1,188]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:23.143241+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141991936 unmapped: 23879680 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:24.143392+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141991936 unmapped: 23879680 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 188 heartbeat osd_stat(store_statfs(0x1b3808000/0x0/0x1bfc00000, data 0x5de74dd/0x5f25000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 188 handle_osd_map epochs [188,189], i have 188, src has [1,189]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 189 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1985182 data_alloc: 301989888 data_used: 8810496
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:25.143572+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:26.143742+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:27.143904+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:28.144073+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:29.144236+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 189 heartbeat osd_stat(store_statfs(0x1b3807000/0x0/0x1bfc00000, data 0x5de9827/0x5f27000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1982470 data_alloc: 301989888 data_used: 8810496
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:30.144405+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:31.144569+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae97c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.103258133s of 10.398211479s, submitted: 73
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 189 ms_handle_reset con 0x55cbaae97c00 session 0x55cba83a7860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:32.144739+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142041088 unmapped: 23830528 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:33.144886+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142065664 unmapped: 23805952 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 ms_handle_reset con 0x55cbaae03000 session 0x55cba83a50e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:34.145042+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:35.145211+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1990478 data_alloc: 301989888 data_used: 8822784
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 heartbeat osd_stat(store_statfs(0x1b3801000/0x0/0x1bfc00000, data 0x5debc09/0x5f2b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:36.145394+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:37.145554+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 ms_handle_reset con 0x55cbaae91400 session 0x55cbaa001c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:38.145772+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:39.145966+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 ms_handle_reset con 0x55cbaae04c00 session 0x55cbaadf6960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:40.146135+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1996107 data_alloc: 301989888 data_used: 8822784
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 heartbeat osd_stat(store_statfs(0x1b3800000/0x0/0x1bfc00000, data 0x5debde1/0x5f2e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 191 ms_handle_reset con 0x55cbaae05800 session 0x55cba920c1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:41.146300+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142098432 unmapped: 23773184 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 191 heartbeat osd_stat(store_statfs(0x1b37fb000/0x0/0x1bfc00000, data 0x5dee149/0x5f32000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.794912338s of 10.129559517s, submitted: 117
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 192 ms_handle_reset con 0x55cbaae05800 session 0x55cbaafa2000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:42.146461+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142123008 unmapped: 23748608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 192 ms_handle_reset con 0x55cbaae03000 session 0x55cbaae99a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:43.146627+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142155776 unmapped: 23715840 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 193 ms_handle_reset con 0x55cbaae04c00 session 0x55cba9ffeb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:44.146812+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 193 ms_handle_reset con 0x55cbaae91400 session 0x55cbaa02bc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae97c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141934592 unmapped: 23937024 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 193 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 ms_handle_reset con 0x55cbaae97c00 session 0x55cba7e90960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:45.146985+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2008935 data_alloc: 301989888 data_used: 8863744
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141934592 unmapped: 23937024 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 ms_handle_reset con 0x55cbaae03000 session 0x55cbaadec000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 heartbeat osd_stat(store_statfs(0x1b37f0000/0x0/0x1bfc00000, data 0x5df4c58/0x5f3d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 ms_handle_reset con 0x55cbaae04c00 session 0x55cbaadec780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 heartbeat osd_stat(store_statfs(0x1b37f0000/0x0/0x1bfc00000, data 0x5df4c58/0x5f3d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:46.147177+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141926400 unmapped: 23945216 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 ms_handle_reset con 0x55cbaae05800 session 0x55cbaadfc960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:47.147349+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141934592 unmapped: 23937024 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbaae91400 session 0x55cbaafa30e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:48.147490+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 141934592 unmapped: 23937024 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbab53fc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbab53fc00 session 0x55cbaa5a0f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:49.147627+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbaae03000 session 0x55cbaadffc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142024704 unmapped: 23846912 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 heartbeat osd_stat(store_statfs(0x1b37d2000/0x0/0x1bfc00000, data 0x5e0edb2/0x5f5b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:50.147829+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbaae00000 session 0x55cbaa269e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2029680 data_alloc: 301989888 data_used: 8871936
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8ec00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbabc8ec00 session 0x55cbaa02d4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142082048 unmapped: 23789568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabcc8400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbabcc8400 session 0x55cba83a7860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbaae95800 session 0x55cba7f8e3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:51.147987+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 heartbeat osd_stat(store_statfs(0x1b37be000/0x0/0x1bfc00000, data 0x5e21faa/0x5f6f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142147584 unmapped: 23724032 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:52.148179+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.545290947s of 10.198814392s, submitted: 191
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbaae00000 session 0x55cba7f8f680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142180352 unmapped: 23691264 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 ms_handle_reset con 0x55cbaae03000 session 0x55cbaa5a1a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:53.148339+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 195 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142229504 unmapped: 23642112 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:54.148512+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143319040 unmapped: 22552576 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 196 heartbeat osd_stat(store_statfs(0x1b3790000/0x0/0x1bfc00000, data 0x5e49cb2/0x5f9e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:55.148697+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2041678 data_alloc: 301989888 data_used: 8884224
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143319040 unmapped: 22552576 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 196 ms_handle_reset con 0x55cbabca1400 session 0x55cba91e70e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca3400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 ms_handle_reset con 0x55cbabca3400 session 0x55cbaa55cb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:56.148863+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 ms_handle_reset con 0x55cbaae04c00 session 0x55cbaa02b680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143147008 unmapped: 22724608 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 ms_handle_reset con 0x55cbaae00000 session 0x55cbaac3b4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:57.149034+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142639104 unmapped: 23232512 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:58.149201+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142753792 unmapped: 23117824 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:59.149357+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142802944 unmapped: 23068672 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 heartbeat osd_stat(store_statfs(0x1b3732000/0x0/0x1bfc00000, data 0x5eaa7ab/0x5ff9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:00.149622+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2041797 data_alloc: 301989888 data_used: 8896512
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142925824 unmapped: 22945792 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:01.149866+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 ms_handle_reset con 0x55cbaac19800 session 0x55cbaadf7a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142934016 unmapped: 22937600 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 heartbeat osd_stat(store_statfs(0x1b3736000/0x0/0x1bfc00000, data 0x5eaa7da/0x5ff8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:02.150037+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 142934016 unmapped: 22937600 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.350605965s of 10.255406380s, submitted: 248
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafac400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 ms_handle_reset con 0x55cbaafac400 session 0x55cbaac3b2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 ms_handle_reset con 0x55cba8485000 session 0x55cbab5fd4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:03.150189+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1b3708000/0x0/0x1bfc00000, data 0x5ed30bd/0x6023000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143007744 unmapped: 22863872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:04.150369+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143007744 unmapped: 22863872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:05.150625+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2054738 data_alloc: 301989888 data_used: 8908800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143007744 unmapped: 22863872 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:06.150785+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144130048 unmapped: 21741568 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1b36e7000/0x0/0x1bfc00000, data 0x5ef4cf4/0x6045000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:07.150910+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1b36d6000/0x0/0x1bfc00000, data 0x5f06514/0x6056000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143515648 unmapped: 22355968 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:08.151077+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143515648 unmapped: 22355968 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:09.151382+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144580608 unmapped: 21291008 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:10.151580+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2057762 data_alloc: 301989888 data_used: 8908800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144580608 unmapped: 21291008 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:11.152475+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144580608 unmapped: 21291008 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1b36bf000/0x0/0x1bfc00000, data 0x5f1e796/0x606e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:12.152773+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144605184 unmapped: 21266432 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:13.153116+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144605184 unmapped: 21266432 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.247955322s of 11.661256790s, submitted: 119
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:14.153273+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144605184 unmapped: 21266432 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:15.153491+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2065460 data_alloc: 301989888 data_used: 8908800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144605184 unmapped: 21266432 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8cc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:16.153936+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144670720 unmapped: 21200896 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cbabc8cc00 session 0x55cbaa02b680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1b3681000/0x0/0x1bfc00000, data 0x5f5b32c/0x60ac000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:17.154186+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 153165824 unmapped: 12705792 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:18.154321+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 152879104 unmapped: 12992512 heap: 165871616 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:19.154486+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 143712256 unmapped: 30556160 heap: 174268416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cbaac19800 session 0x55cbaa429a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:20.154749+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2455399 data_alloc: 301989888 data_used: 8908800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144801792 unmapped: 29466624 heap: 174268416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fcc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cba91fcc00 session 0x55cbaa02c3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:21.155036+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 144867328 unmapped: 29401088 heap: 174268416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:22.155197+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 153395200 unmapped: 20873216 heap: 174268416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1ade59000/0x0/0x1bfc00000, data 0xb781ff9/0xb8d5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbab53f000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cbab53f000 session 0x55cbaa55c960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae01400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:23.155373+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 145342464 unmapped: 28925952 heap: 174268416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cbaae01400 session 0x55cbaa02d0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafab400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.084829330s of 10.011775017s, submitted: 169
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:24.155566+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cbaae00400 session 0x55cbaafa2d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 ms_handle_reset con 0x55cbaafab400 session 0x55cba91e6f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1ac068000/0x0/0x1bfc00000, data 0xd571634/0xd6c6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 146669568 unmapped: 35995648 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:25.155832+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3049190 data_alloc: 301989888 data_used: 8908800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 heartbeat osd_stat(store_statfs(0x1aaade000/0x0/0x1bfc00000, data 0xeafb66d/0xec50000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 155164672 unmapped: 27500544 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:26.156077+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa8c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 147005440 unmapped: 35659776 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa8400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:27.156206+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 199 ms_handle_reset con 0x55cbaae91800 session 0x55cbabcdaf00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 147898368 unmapped: 34766848 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 199 ms_handle_reset con 0x55cbaafa9000 session 0x55cba821ad20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:28.156417+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 199 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 148430848 unmapped: 34234368 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 ms_handle_reset con 0x55cbaafa8400 session 0x55cbaadffa40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 ms_handle_reset con 0x55cbaafa8c00 session 0x55cbaa5aa960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:29.156598+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 148717568 unmapped: 33947648 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:30.156784+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae92c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 heartbeat osd_stat(store_statfs(0x1a72d7000/0x0/0x1bfc00000, data 0x122f5752/0x12455000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 ms_handle_reset con 0x55cbaae92c00 session 0x55cbaa001c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3526446 data_alloc: 301989888 data_used: 8925184
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 157261824 unmapped: 25403392 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:31.157230+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 heartbeat osd_stat(store_statfs(0x1a6aaa000/0x0/0x1bfc00000, data 0x12b22e11/0x12c84000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 148955136 unmapped: 33710080 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 heartbeat osd_stat(store_statfs(0x1a5aa8000/0x0/0x1bfc00000, data 0x13b249e3/0x13c86000, compress 0x0/0x0/0x0, omap 0x647, meta 0x64cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae01000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:32.157410+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 149266432 unmapped: 33398784 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 ms_handle_reset con 0x55cbaae91400 session 0x55cba9d88f00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:33.157559+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 158810112 unmapped: 23855104 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafad400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 ms_handle_reset con 0x55cbaafad400 session 0x55cbaa02ba40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:34.157947+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.081013680s of 10.420679092s, submitted: 221
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166338560 unmapped: 16326656 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:35.158179+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4122246 data_alloc: 301989888 data_used: 8937472
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 150667264 unmapped: 31997952 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 ms_handle_reset con 0x55cbabca2c00 session 0x55cbab8ad0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:36.158331+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 ms_handle_reset con 0x55cbaae03000 session 0x55cba9d89860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 heartbeat osd_stat(store_statfs(0x1a0a84000/0x0/0x1bfc00000, data 0x1874916f/0x188aa000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 ms_handle_reset con 0x55cbaae01000 session 0x55cbaa0012c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 150732800 unmapped: 31932416 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 heartbeat osd_stat(store_statfs(0x1a0a84000/0x0/0x1bfc00000, data 0x1874916f/0x188aa000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:37.158463+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 150994944 unmapped: 31670272 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:38.158685+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 ms_handle_reset con 0x55cbaae91400 session 0x55cbaa5b2b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 151126016 unmapped: 31539200 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 heartbeat osd_stat(store_statfs(0x19f23e000/0x0/0x1bfc00000, data 0x19f8caa7/0x1a0ee000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:39.158850+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 202 ms_handle_reset con 0x55cbaae03000 session 0x55cba920de00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 151412736 unmapped: 31252480 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca3400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:40.159047+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 202 heartbeat osd_stat(store_statfs(0x19ea3b000/0x0/0x1bfc00000, data 0x1a78eed8/0x1a8f2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4515750 data_alloc: 301989888 data_used: 8949760
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 151568384 unmapped: 31096832 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:41.159214+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 151568384 unmapped: 31096832 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:42.159377+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 203 heartbeat osd_stat(store_statfs(0x19d5e5000/0x0/0x1bfc00000, data 0x1bbe4b18/0x1bd48000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,0,0,2])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 152707072 unmapped: 29958144 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 203 ms_handle_reset con 0x55cbabca3400 session 0x55cba920c960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:43.159604+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 152723456 unmapped: 29941760 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 204 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:44.159807+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161120256 unmapped: 21544960 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.764348984s of 10.326812744s, submitted: 211
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:45.160008+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4593629 data_alloc: 301989888 data_used: 8966144
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 152666112 unmapped: 29999104 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 204 ms_handle_reset con 0x55cbaae02800 session 0x55cbaa5b3860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:46.160221+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 153206784 unmapped: 29458432 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 204 heartbeat osd_stat(store_statfs(0x19c5c0000/0x0/0x1bfc00000, data 0x1cc0c73b/0x1cd6e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae01000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:47.160422+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 205 ms_handle_reset con 0x55cbaae01000 session 0x55cba9dd21e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 153280512 unmapped: 29384704 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:48.160604+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 206 ms_handle_reset con 0x55cbaae03000 session 0x55cbaa8723c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 206 ms_handle_reset con 0x55cbaae02800 session 0x55cba9240780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161873920 unmapped: 20791296 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:49.160693+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 ms_handle_reset con 0x55cbaae91400 session 0x55cbaab0d2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 heartbeat osd_stat(store_statfs(0x19acae000/0x0/0x1bfc00000, data 0x1e4964af/0x1e5fa000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 153600000 unmapped: 29065216 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae01800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 ms_handle_reset con 0x55cbaae01800 session 0x55cbaadfeb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:50.160881+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4907685 data_alloc: 301989888 data_used: 8982528
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 154771456 unmapped: 27893760 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 heartbeat osd_stat(store_statfs(0x19a7d5000/0x0/0x1bfc00000, data 0x1e950f7a/0x1eab2000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae01000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:51.161104+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 ms_handle_reset con 0x55cbaae01000 session 0x55cba9dd32c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 ms_handle_reset con 0x55cbaae02800 session 0x55cbaa340960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae03000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 156024832 unmapped: 26640384 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 208 ms_handle_reset con 0x55cbaae91400 session 0x55cbaa873c20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:52.161273+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 209 ms_handle_reset con 0x55cbaae03000 session 0x55cbaadec5a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 157220864 unmapped: 25444352 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 209 handle_osd_map epochs [209,210], i have 209, src has [1,210]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 210 heartbeat osd_stat(store_statfs(0x19905e000/0x0/0x1bfc00000, data 0x20168733/0x202cf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:53.161508+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 157384704 unmapped: 25280512 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:54.161694+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 211 ms_handle_reset con 0x55cbaae91c00 session 0x55cbaadfd2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165822464 unmapped: 16842752 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 7.673511982s of 10.014057159s, submitted: 422
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:55.162313+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 211 heartbeat osd_stat(store_statfs(0x196021000/0x0/0x1bfc00000, data 0x231a1724/0x2330c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5419381 data_alloc: 301989888 data_used: 9007104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166133760 unmapped: 16531456 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:56.162513+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbabcdb860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 157827072 unmapped: 24838144 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbaae02000 session 0x55cbabcdba40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbabca2400 session 0x55cbaa02c960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8cc2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cba8cc2c00 session 0x55cbab5fd2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae97400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:57.162678+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbaae97400 session 0x55cbab673860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8cc2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cba8cc2c00 session 0x55cbab5c0000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbaae02000 session 0x55cbaafa2000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8dc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167559168 unmapped: 15106048 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbabc8dc00 session 0x55cbab99e780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 ms_handle_reset con 0x55cbabca2400 session 0x55cbaa2645a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:58.162808+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cbaae00c00 session 0x55cbaa872780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167755776 unmapped: 14909440 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 heartbeat osd_stat(store_statfs(0x192a71000/0x0/0x1bfc00000, data 0x2674ab25/0x268bc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8cc2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:59.163010+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cba8cc2c00 session 0x55cbaa873860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cbaae00c00 session 0x55cba821b860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 159678464 unmapped: 22986752 heap: 182665216 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cbaae02000 session 0x55cbab562b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae04800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cbaae04800 session 0x55cba9d89e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:00.163226+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ee800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cbaa5ee800 session 0x55cba923e1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8cc2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 ms_handle_reset con 0x55cba8cc2c00 session 0x55cbaafa3860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5929743 data_alloc: 301989888 data_used: 9019392
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 168378368 unmapped: 22683648 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:01.163361+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 168648704 unmapped: 22413312 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:02.163533+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161685504 unmapped: 29376512 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:03.163676+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 214 heartbeat osd_stat(store_statfs(0x18d9c9000/0x0/0x1bfc00000, data 0x2b7efaa6/0x2b964000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 160342016 unmapped: 30720000 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:04.163826+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 214 heartbeat osd_stat(store_statfs(0x18c9ca000/0x0/0x1bfc00000, data 0x2c7efaaa/0x2c963000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161505280 unmapped: 29556736 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.557801247s of 10.018026352s, submitted: 296
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:05.164000+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 215 ms_handle_reset con 0x55cba8485000 session 0x55cbaa8721e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 6507653 data_alloc: 301989888 data_used: 10682368
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161513472 unmapped: 29548544 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:06.164207+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8ec00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161546240 unmapped: 29515776 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 215 ms_handle_reset con 0x55cbabc8ec00 session 0x55cbab8ade00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:07.164391+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 161579008 unmapped: 29483008 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:08.164601+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 216 heartbeat osd_stat(store_statfs(0x18c1a0000/0x0/0x1bfc00000, data 0x2d0142f0/0x2d18c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 216 ms_handle_reset con 0x55cbaac19c00 session 0x55cbaadfc000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 160915456 unmapped: 30146560 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:09.164754+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8e400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 216 ms_handle_reset con 0x55cbabc8e400 session 0x55cba7f8e3c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 216 ms_handle_reset con 0x55cbaac19400 session 0x55cba920c000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 216 ms_handle_reset con 0x55cba8485000 session 0x55cbaa3023c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 160407552 unmapped: 30654464 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:10.164935+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2449215 data_alloc: 301989888 data_used: 10694656
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 160407552 unmapped: 30654464 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:11.165107+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 160407552 unmapped: 30654464 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:12.165260+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165502976 unmapped: 25559040 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:13.165414+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165691392 unmapped: 25370624 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b1678000/0x0/0x1bfc00000, data 0x7b36bb3/0x7cae000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:14.165530+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165888000 unmapped: 25174016 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8f000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.775811195s of 10.192414284s, submitted: 398
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaae8f000 session 0x55cbab5c01e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:15.165699+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2562922 data_alloc: 301989888 data_used: 11464704
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165904384 unmapped: 25157632 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:16.165868+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b1623000/0x0/0x1bfc00000, data 0x7b8be4f/0x7d05000, compress 0x0/0x0/0x0, omap 0x647, meta 0x68cf9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,0,0,0,2])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaac19800 session 0x55cbab5fde00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaae06c00 session 0x55cbaa2685a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cba8485000 session 0x55cba7f8e960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaac19400 session 0x55cbaa2692c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165552128 unmapped: 25509888 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:17.166034+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaac19800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaac19800 session 0x55cba9fff2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8f000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165617664 unmapped: 25444352 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaae8f000 session 0x55cba923fc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:18.166188+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165797888 unmapped: 25264128 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:19.166289+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165896192 unmapped: 25165824 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:20.166433+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b11c9000/0x0/0x1bfc00000, data 0x7be9b4a/0x7d64000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2577091 data_alloc: 301989888 data_used: 11464704
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165896192 unmapped: 25165824 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b11c9000/0x0/0x1bfc00000, data 0x7be9b4a/0x7d64000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:21.166582+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 20K writes, 74K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.01 MB/s
                                                          Cumulative WAL: 20K writes, 6880 syncs, 2.95 writes per sync, written: 0.06 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 14K writes, 49K keys, 14K commit groups, 1.0 writes per commit group, ingest: 38.58 MB, 0.06 MB/s
                                                          Interval WAL: 14K writes, 5950 syncs, 2.40 writes per sync, written: 0.04 GB, 0.06 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 165912576 unmapped: 25149440 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:22.166755+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaae06800 session 0x55cbaafa3a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b11bd000/0x0/0x1bfc00000, data 0x7bf89d6/0x7d71000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166739968 unmapped: 24322048 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:23.166925+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166748160 unmapped: 24313856 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:24.167137+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166830080 unmapped: 24231936 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:25.167401+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2581087 data_alloc: 301989888 data_used: 11464704
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.390342712s of 10.694037437s, submitted: 187
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166363136 unmapped: 24698880 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:26.167564+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166379520 unmapped: 24682496 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:27.167820+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166461440 unmapped: 24600576 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1b1145000/0x0/0x1bfc00000, data 0x7c6de69/0x7de9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x6ccf9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:28.167987+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166551552 unmapped: 24510464 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:29.168159+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166567936 unmapped: 24494080 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:30.168367+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1aff7e000/0x0/0x1bfc00000, data 0x7c92c39/0x7e0e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2592241 data_alloc: 301989888 data_used: 11464704
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166682624 unmapped: 24379392 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:31.168529+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 166739968 unmapped: 24322048 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1aff7c000/0x0/0x1bfc00000, data 0x7c95a11/0x7e10000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:32.168692+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167796736 unmapped: 23265280 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1aff3c000/0x0/0x1bfc00000, data 0x7cd3dd1/0x7e50000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:33.168871+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167559168 unmapped: 23502848 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:34.169111+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167559168 unmapped: 23502848 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:35.169302+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 heartbeat osd_stat(store_statfs(0x1aff2b000/0x0/0x1bfc00000, data 0x7ce8805/0x7e63000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2604959 data_alloc: 301989888 data_used: 11472896
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.501451492s of 10.012061119s, submitted: 138
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167698432 unmapped: 23363584 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:36.169467+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 ms_handle_reset con 0x55cbaafa9c00 session 0x55cbab5fd2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167723008 unmapped: 23339008 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:37.169690+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167739392 unmapped: 23322624 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:38.169878+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 218 ms_handle_reset con 0x55cbaae06400 session 0x55cbabcda5a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaab6c000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 167813120 unmapped: 23248896 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:39.170033+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 218 handle_osd_map epochs [218,219], i have 218, src has [1,219]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca1c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 219 ms_handle_reset con 0x55cbabca1c00 session 0x55cbabcda960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 168722432 unmapped: 22339584 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7d70000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 219 heartbeat osd_stat(store_statfs(0x1afe70000/0x0/0x1bfc00000, data 0x7d9e070/0x7f1e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:40.170149+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 219 ms_handle_reset con 0x55cba7d70000 session 0x55cbaadede00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 219 ms_handle_reset con 0x55cbaae93800 session 0x55cbaadfd2c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2632914 data_alloc: 301989888 data_used: 11505664
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 168140800 unmapped: 22921216 heap: 191062016 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:41.170405+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 220 ms_handle_reset con 0x55cbaab6c000 session 0x55cbabcda000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7d70000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 220 heartbeat osd_stat(store_statfs(0x1afe63000/0x0/0x1bfc00000, data 0x7da4af3/0x7f29000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,1,1,1,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 185188352 unmapped: 14270464 heap: 199458816 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:42.170569+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 168108032 unmapped: 39755776 heap: 207863808 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae06400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:43.172684+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176414720 unmapped: 31449088 heap: 207863808 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:44.172851+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 221 ms_handle_reset con 0x55cbaafa9c00 session 0x55cbaadec780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180633600 unmapped: 27230208 heap: 207863808 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:45.173047+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 222 ms_handle_reset con 0x55cbaae06400 session 0x55cbab9883c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4012074 data_alloc: 301989888 data_used: 11534336
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.448264122s of 10.006860733s, submitted: 262
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 169320448 unmapped: 38543360 heap: 207863808 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:46.173217+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 222 heartbeat osd_stat(store_statfs(0x1a41cc000/0x0/0x1bfc00000, data 0x13a35e8c/0x13bbf000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181968896 unmapped: 25894912 heap: 207863808 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:47.173428+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173613056 unmapped: 34250752 heap: 207863808 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 223 handle_osd_map epochs [223,224], i have 223, src has [1,224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:48.173599+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 224 heartbeat osd_stat(store_statfs(0x19edaf000/0x0/0x1bfc00000, data 0x18e52907/0x18fdc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,1,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 182059008 unmapped: 30007296 heap: 212066304 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:49.173812+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba8485c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173662208 unmapped: 38404096 heap: 212066304 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:50.173982+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 224 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5450707 data_alloc: 301989888 data_used: 11534336
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174743552 unmapped: 37322752 heap: 212066304 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae96800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:51.174106+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 225 ms_handle_reset con 0x55cbaae96800 session 0x55cbaa55d0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 225 heartbeat osd_stat(store_statfs(0x195975000/0x0/0x1bfc00000, data 0x22289ad0/0x22419000, compress 0x0/0x0/0x0, omap 0x647, meta 0x7e6f9b9), peers [0,1,2,4,5] op hist [0,0,0,0,0,0,0,1,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174809088 unmapped: 37257216 heap: 212066304 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:52.174268+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 226 ms_handle_reset con 0x55cba8485c00 session 0x55cbaafa21e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 185614336 unmapped: 26451968 heap: 212066304 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:53.174424+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 194363392 unmapped: 21905408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:54.174564+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 ms_handle_reset con 0x55cba7d70000 session 0x55cbab989a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 ms_handle_reset con 0x55cbaae93800 session 0x55cba9dd21e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaab6c000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 ms_handle_reset con 0x55cbaab6c000 session 0x55cbab66a960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 ms_handle_reset con 0x55cbaae05000 session 0x55cbab5fcd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174391296 unmapped: 41877504 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:55.174723+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 227 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8d800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 7.817443371s of 10.005301476s, submitted: 315
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5609771 data_alloc: 301989888 data_used: 11558912
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173776896 unmapped: 42491904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:56.174884+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 228 ms_handle_reset con 0x55cbabc8d800 session 0x55cbab671680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabc8d800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 228 ms_handle_reset con 0x55cbabc8d800 session 0x55cbaa264960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba7d70000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 228 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 229 ms_handle_reset con 0x55cba7d70000 session 0x55cba923e5a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaab6c000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 229 heartbeat osd_stat(store_statfs(0x1aeb71000/0x0/0x1bfc00000, data 0x7ee69b2/0x8079000, compress 0x0/0x0/0x0, omap 0x647, meta 0x900f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173809664 unmapped: 42459136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:57.175030+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 229 ms_handle_reset con 0x55cbaab6c000 session 0x55cbaa5b23c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae05000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173809664 unmapped: 42459136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:58.175197+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 ms_handle_reset con 0x55cbaae05000 session 0x55cba9240000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 ms_handle_reset con 0x55cbaae93800 session 0x55cbabcdbe00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173817856 unmapped: 42450944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:59.175364+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 ms_handle_reset con 0x55cbaae00c00 session 0x55cbaa341680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 ms_handle_reset con 0x55cbaae02000 session 0x55cba8b56780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 heartbeat osd_stat(store_statfs(0x1aeb6c000/0x0/0x1bfc00000, data 0x7eeb295/0x807f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x900f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafacc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173817856 unmapped: 42450944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:00.175551+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 231 ms_handle_reset con 0x55cbaae93800 session 0x55cbaab0da40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 231 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2623472 data_alloc: 301989888 data_used: 9158656
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172769280 unmapped: 43499520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:01.175691+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 231 ms_handle_reset con 0x55cbaafacc00 session 0x55cbaa3032c0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172777472 unmapped: 43491328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:02.175845+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172777472 unmapped: 43491328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:03.176006+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172777472 unmapped: 43491328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:04.176262+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[5.f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafab000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 51
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 233 ms_handle_reset con 0x55cbaae93c00 session 0x55cbaadf6b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172949504 unmapped: 43319296 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:05.176504+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 233 heartbeat osd_stat(store_statfs(0x1b040f000/0x0/0x1bfc00000, data 0x664aa08/0x67de000, compress 0x0/0x0/0x0, omap 0x647, meta 0x900f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 8.987600327s of 10.011120796s, submitted: 411
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2637551 data_alloc: 301989888 data_used: 9170944
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 234 ms_handle_reset con 0x55cbaae02000 session 0x55cbab5c0b40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 234 ms_handle_reset con 0x55cbaae00c00 session 0x55cbab5c01e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172974080 unmapped: 43294720 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:06.176769+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 234 ms_handle_reset con 0x55cbaae93800 session 0x55cbaadf6780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafacc00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 235 ms_handle_reset con 0x55cbaafab000 session 0x55cbaadfef00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 235 ms_handle_reset con 0x55cbaafacc00 session 0x55cbaa02b860
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 172998656 unmapped: 43270144 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:07.177605+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 235 ms_handle_reset con 0x55cbaae00c00 session 0x55cba91e7e00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 235 ms_handle_reset con 0x55cbaae02000 session 0x55cbaa873a40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173039616 unmapped: 43229184 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:08.177846+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae93800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173047808 unmapped: 43220992 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:09.178007+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 237 ms_handle_reset con 0x55cbaae93800 session 0x55cbaa55cb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 237 heartbeat osd_stat(store_statfs(0x1b0003000/0x0/0x1bfc00000, data 0x6653806/0x67e8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:10.178202+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173056000 unmapped: 43212800 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2637539 data_alloc: 301989888 data_used: 9170944
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:11.178345+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173056000 unmapped: 43212800 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 237 ms_handle_reset con 0x55cbaae91400 session 0x55cbaadfe000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:12.178535+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173056000 unmapped: 43212800 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:13.178684+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173072384 unmapped: 43196416 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 ms_handle_reset con 0x55cbaae00000 session 0x55cba83a4000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:14.178854+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173080576 unmapped: 43188224 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 ms_handle_reset con 0x55cbaae00c00 session 0x55cbaa2694a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 heartbeat osd_stat(store_statfs(0x1afffd000/0x0/0x1bfc00000, data 0x66580c0/0x67f0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae01000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:15.179047+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173088768 unmapped: 43180032 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 ms_handle_reset con 0x55cbaae01000 session 0x55cbaa55cf00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8f800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 ms_handle_reset con 0x55cbaae8f800 session 0x55cba821a5a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2644358 data_alloc: 301989888 data_used: 9187328
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:16.179205+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173113344 unmapped: 43155456 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.886656761s of 10.583377838s, submitted: 212
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae91c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:17.179377+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173113344 unmapped: 43155456 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 ms_handle_reset con 0x55cbaae91c00 session 0x55cbaae094a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 heartbeat osd_stat(store_statfs(0x1afff9000/0x0/0x1bfc00000, data 0x665a4ae/0x67f4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 ms_handle_reset con 0x55cbaa3ef800 session 0x55cbaa02cb40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:18.179577+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173154304 unmapped: 43114496 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa3ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 ms_handle_reset con 0x55cbaa3ef800 session 0x55cbabcdbc20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae00c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:19.179842+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173178880 unmapped: 43089920 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 ms_handle_reset con 0x55cbaae00c00 session 0x55cbaafa2780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:20.180123+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173178880 unmapped: 43089920 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2646990 data_alloc: 301989888 data_used: 9191424
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 heartbeat osd_stat(store_statfs(0x1afffa000/0x0/0x1bfc00000, data 0x665a50a/0x67f3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:21.180459+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173187072 unmapped: 43081728 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:22.180800+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173203456 unmapped: 43065344 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:23.181016+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173211648 unmapped: 43057152 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:24.181495+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173211648 unmapped: 43057152 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:25.181717+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173211648 unmapped: 43057152 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2651230 data_alloc: 301989888 data_used: 9203712
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:26.182086+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173219840 unmapped: 43048960 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 240 heartbeat osd_stat(store_statfs(0x1afff5000/0x0/0x1bfc00000, data 0x665c884/0x67f8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae96c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.790507317s of 10.223301888s, submitted: 116
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 240 ms_handle_reset con 0x55cbaae96c00 session 0x55cbaaded4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 240 heartbeat osd_stat(store_statfs(0x1afff6000/0x0/0x1bfc00000, data 0x665c84e/0x67f7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:27.182257+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173219840 unmapped: 43048960 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 240 heartbeat osd_stat(store_statfs(0x1afff6000/0x0/0x1bfc00000, data 0x665c84e/0x67f7000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:28.182415+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca0800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 241 ms_handle_reset con 0x55cbabca0800 session 0x55cbaadf54a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173236224 unmapped: 43032576 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 241 heartbeat osd_stat(store_statfs(0x1afff4000/0x0/0x1bfc00000, data 0x665c8c0/0x67f9000, compress 0x0/0x0/0x0, omap 0x647, meta 0x940f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:29.182673+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173252608 unmapped: 43016192 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:30.182834+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173277184 unmapped: 42991616 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2669972 data_alloc: 301989888 data_used: 9216000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:31.183164+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173285376 unmapped: 42983424 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:32.183401+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173285376 unmapped: 42983424 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:33.183563+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173318144 unmapped: 42950656 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 244 heartbeat osd_stat(store_statfs(0x1b0fe0000/0x0/0x1bfc00000, data 0x66657b1/0x680b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:34.183805+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173326336 unmapped: 42942464 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:35.184039+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 244 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173359104 unmapped: 42909696 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2678118 data_alloc: 301989888 data_used: 9232384
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:36.184187+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173375488 unmapped: 42893312 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.702032089s of 10.196370125s, submitted: 130
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae95c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:37.184404+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 245 heartbeat osd_stat(store_statfs(0x1b0fe0000/0x0/0x1bfc00000, data 0x6667c04/0x680e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173383680 unmapped: 42885120 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 246 ms_handle_reset con 0x55cbaae02400 session 0x55cba9d881e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:38.184576+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173408256 unmapped: 42860544 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 52
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 handle_osd_map epochs [246,247], i have 247, src has [1,247]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 handle_osd_map epochs [246,247], i have 247, src has [1,247]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae02400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 ms_handle_reset con 0x55cbaae02400 session 0x55cbaadff680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:39.184779+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173350912 unmapped: 42917888 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 heartbeat osd_stat(store_statfs(0x1b0fd6000/0x0/0x1bfc00000, data 0x666c3de/0x6816000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:40.184997+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173350912 unmapped: 42917888 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2687750 data_alloc: 301989888 data_used: 9244672
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:41.185193+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173359104 unmapped: 42909696 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 heartbeat osd_stat(store_statfs(0x1b0fd7000/0x0/0x1bfc00000, data 0x666c5fd/0x6817000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:42.185413+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 173359104 unmapped: 42909696 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafa9c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 ms_handle_reset con 0x55cbaafa9c00 session 0x55cbaadfd4a0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:43.185626+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174415872 unmapped: 41852928 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fd000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:44.185835+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 248 heartbeat osd_stat(store_statfs(0x1b0fce000/0x0/0x1bfc00000, data 0x666e9ae/0x681e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174448640 unmapped: 41820160 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 249 ms_handle_reset con 0x55cba91fd000 session 0x55cbaa340d20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaa5ef800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 249 ms_handle_reset con 0x55cbaa5ef800 session 0x55cbab670780
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafad000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:45.186099+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174505984 unmapped: 41762816 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 ms_handle_reset con 0x55cbaafad000 session 0x55cba821a1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaafad000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2703672 data_alloc: 301989888 data_used: 9256960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 ms_handle_reset con 0x55cbaafad000 session 0x55cbaadec1e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cba91fd000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:46.186265+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174555136 unmapped: 41713664 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 ms_handle_reset con 0x55cba91fd000 session 0x55cbaa302000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 heartbeat osd_stat(store_statfs(0x1b0fc8000/0x0/0x1bfc00000, data 0x6672ef2/0x6823000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:47.186442+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174514176 unmapped: 41754624 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:48.186617+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 174514176 unmapped: 41754624 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.514632225s of 12.252906799s, submitted: 187
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:49.186789+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175570944 unmapped: 40697856 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 heartbeat osd_stat(store_statfs(0x1b0fca000/0x0/0x1bfc00000, data 0x6672ec3/0x6823000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:50.186971+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175570944 unmapped: 40697856 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2700642 data_alloc: 301989888 data_used: 9256960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:51.187157+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175570944 unmapped: 40697856 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:52.187379+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175570944 unmapped: 40697856 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 heartbeat osd_stat(store_statfs(0x1b0fca000/0x0/0x1bfc00000, data 0x6672ec3/0x6823000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:53.187555+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175570944 unmapped: 40697856 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.5] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.16] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.10] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[3.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:54.187740+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175595520 unmapped: 40673280 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:55.188035+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175595520 unmapped: 40673280 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b0fc9000/0x0/0x1bfc00000, data 0x6675149/0x6825000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2702296 data_alloc: 301989888 data_used: 9269248
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:56.188253+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175595520 unmapped: 40673280 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:57.188472+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175595520 unmapped: 40673280 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:58.188705+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175595520 unmapped: 40673280 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:59.188894+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175595520 unmapped: 40673280 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.828023911s of 10.934763908s, submitted: 35
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:00.189077+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175620096 unmapped: 40648704 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b0fc8000/0x0/0x1bfc00000, data 0x6675319/0x6826000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 251 heartbeat osd_stat(store_statfs(0x1b0fc7000/0x0/0x1bfc00000, data 0x66753e3/0x6826000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:01.189249+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2706364 data_alloc: 301989888 data_used: 9273344
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175611904 unmapped: 40656896 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:02.189507+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175620096 unmapped: 40648704 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:03.189676+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175628288 unmapped: 40640512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:04.189824+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175628288 unmapped: 40640512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 252 heartbeat osd_stat(store_statfs(0x1b0fc2000/0x0/0x1bfc00000, data 0x6677919/0x682b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:05.190074+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175628288 unmapped: 40640512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:06.190273+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2711616 data_alloc: 301989888 data_used: 9285632
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175628288 unmapped: 40640512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:07.190409+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175636480 unmapped: 40632320 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:08.190690+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 175636480 unmapped: 40632320 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:09.190870+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 253 heartbeat osd_stat(store_statfs(0x1b0fc0000/0x0/0x1bfc00000, data 0x6677ac3/0x682e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 253 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 253 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176709632 unmapped: 39559168 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 254 heartbeat osd_stat(store_statfs(0x1b0fba000/0x0/0x1bfc00000, data 0x667c032/0x6834000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.611555099s of 10.009282112s, submitted: 119
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:10.191055+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176726016 unmapped: 39542784 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:11.191246+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2718488 data_alloc: 301989888 data_used: 9306112
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176734208 unmapped: 39534592 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:12.191448+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176734208 unmapped: 39534592 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:13.191629+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176734208 unmapped: 39534592 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:14.191841+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176742400 unmapped: 39526400 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb7000/0x0/0x1bfc00000, data 0x667e60c/0x6836000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:15.192077+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176758784 unmapped: 39510016 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:16.192319+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2724494 data_alloc: 301989888 data_used: 9318400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176758784 unmapped: 39510016 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:17.192533+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176758784 unmapped: 39510016 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:18.192781+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176766976 unmapped: 39501824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:19.192948+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176766976 unmapped: 39501824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:20.193162+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176766976 unmapped: 39501824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb4000/0x0/0x1bfc00000, data 0x667ea1b/0x683a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:21.193327+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2726894 data_alloc: 301989888 data_used: 9318400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176783360 unmapped: 39485440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.892313957s of 12.011830330s, submitted: 40
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:22.193888+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176783360 unmapped: 39485440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb3000/0x0/0x1bfc00000, data 0x667eb30/0x683b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:23.194782+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176783360 unmapped: 39485440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:24.195495+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176791552 unmapped: 39477248 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 53
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:25.195694+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176807936 unmapped: 39460864 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb1000/0x0/0x1bfc00000, data 0x667ecdf/0x683d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:26.195860+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2731814 data_alloc: 301989888 data_used: 9318400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176824320 unmapped: 39444480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:27.196014+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176824320 unmapped: 39444480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:28.196145+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176824320 unmapped: 39444480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:29.196338+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176832512 unmapped: 39436288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:30.196520+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176832512 unmapped: 39436288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:31.196813+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2734420 data_alloc: 301989888 data_used: 9318400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176840704 unmapped: 39428096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb3000/0x0/0x1bfc00000, data 0x667ef96/0x683b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:32.196960+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176840704 unmapped: 39428096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:33.197148+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176840704 unmapped: 39428096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 11.845650673s of 12.009426117s, submitted: 36
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:34.197305+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176848896 unmapped: 39419904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb3000/0x0/0x1bfc00000, data 0x667effb/0x683b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:35.197529+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0fb3000/0x0/0x1bfc00000, data 0x667effb/0x683b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176848896 unmapped: 39419904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:36.197721+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2735530 data_alloc: 301989888 data_used: 9318400
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176848896 unmapped: 39419904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:37.198006+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176865280 unmapped: 39403520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:38.198199+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0f86000/0x0/0x1bfc00000, data 0x66a93ed/0x6867000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176865280 unmapped: 39403520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:39.198435+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176873472 unmapped: 39395328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:40.198609+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 176873472 unmapped: 39395328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:41.198748+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0f66000/0x0/0x1bfc00000, data 0x66c9864/0x6887000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2744708 data_alloc: 301989888 data_used: 9314304
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 177930240 unmapped: 38338560 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:42.198954+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 177954816 unmapped: 38313984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0f28000/0x0/0x1bfc00000, data 0x67083df/0x68c5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:43.199159+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 177840128 unmapped: 38428672 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.719141960s of 10.008286476s, submitted: 66
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:44.199338+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 177840128 unmapped: 38428672 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:45.199513+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 heartbeat osd_stat(store_statfs(0x1b0f1e000/0x0/0x1bfc00000, data 0x6716170/0x68d0000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 178044928 unmapped: 38223872 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:46.199826+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2752122 data_alloc: 301989888 data_used: 9314304
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 178241536 unmapped: 38027264 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:47.199983+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 178266112 unmapped: 38002688 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:48.200164+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179331072 unmapped: 36937728 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b0e80000/0x0/0x1bfc00000, data 0x67b0fe8/0x696c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:49.200317+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b0e80000/0x0/0x1bfc00000, data 0x67b0fe8/0x696c000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179429376 unmapped: 36839424 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:50.200500+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179437568 unmapped: 36831232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:51.200711+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2759080 data_alloc: 301989888 data_used: 9326592
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179527680 unmapped: 36741120 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:52.200925+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179617792 unmapped: 36651008 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:53.201076+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 heartbeat osd_stat(store_statfs(0x1b0e30000/0x0/0x1bfc00000, data 0x6801bf0/0x69be000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 256 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179625984 unmapped: 36642816 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:54.201269+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179625984 unmapped: 36642816 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.114658356s of 10.652855873s, submitted: 185
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:55.201523+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179716096 unmapped: 36552704 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:56.201721+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2772560 data_alloc: 301989888 data_used: 9338880
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179716096 unmapped: 36552704 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0df1000/0x0/0x1bfc00000, data 0x683f7a1/0x69fc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:57.201973+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 179740672 unmapped: 36528128 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:58.202216+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180985856 unmapped: 35282944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:59.202447+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180985856 unmapped: 35282944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:00.202733+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0d86000/0x0/0x1bfc00000, data 0x68a7a8d/0x6a65000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181084160 unmapped: 35184640 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:01.202979+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2783062 data_alloc: 301989888 data_used: 9338880
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180617216 unmapped: 35651584 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:02.203184+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180666368 unmapped: 35602432 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:03.203380+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180666368 unmapped: 35602432 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0d29000/0x0/0x1bfc00000, data 0x690518d/0x6ac4000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:04.203546+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 180772864 unmapped: 35495936 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0d18000/0x0/0x1bfc00000, data 0x6917a7c/0x6ad5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:05.203802+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.129129410s of 10.581477165s, submitted: 92
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181002240 unmapped: 35266560 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:06.203975+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2786198 data_alloc: 301989888 data_used: 9338880
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181002240 unmapped: 35266560 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:07.204166+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181018624 unmapped: 35250176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:08.204361+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181018624 unmapped: 35250176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:09.204615+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181018624 unmapped: 35250176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:10.204826+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0cd0000/0x0/0x1bfc00000, data 0x6961d19/0x6b1e000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181018624 unmapped: 35250176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:11.205009+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2788922 data_alloc: 301989888 data_used: 9338880
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181018624 unmapped: 35250176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:12.205161+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 181862400 unmapped: 34406400 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0c9b000/0x0/0x1bfc00000, data 0x6996a2b/0x6b53000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:13.205354+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 182173696 unmapped: 34095104 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:14.205493+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0c84000/0x0/0x1bfc00000, data 0x69ad5f7/0x6b6a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0c6b000/0x0/0x1bfc00000, data 0x69c760f/0x6b83000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 182173696 unmapped: 34095104 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:15.205695+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 heartbeat osd_stat(store_statfs(0x1b0c4b000/0x0/0x1bfc00000, data 0x69e6dbf/0x6ba3000, compress 0x0/0x0/0x0, omap 0x647, meta 0x840f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 182247424 unmapped: 34021376 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.535418510s of 10.793501854s, submitted: 51
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:16.205848+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2795402 data_alloc: 301989888 data_used: 9338880
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 182345728 unmapped: 33923072 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:17.205986+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183443456 unmapped: 32825344 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:18.206112+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183459840 unmapped: 32808960 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 259 handle_osd_map epochs [258,259], i have 259, src has [1,259]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:19.206286+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 259 heartbeat osd_stat(store_statfs(0x1b07e3000/0x0/0x1bfc00000, data 0x6a4bb24/0x6c0a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183582720 unmapped: 32686080 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:20.206451+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 259 heartbeat osd_stat(store_statfs(0x1b07d4000/0x0/0x1bfc00000, data 0x6a5baaa/0x6c1a000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183582720 unmapped: 32686080 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:21.206708+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2810364 data_alloc: 301989888 data_used: 9351168
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183582720 unmapped: 32686080 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:22.206872+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183697408 unmapped: 32571392 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:23.207029+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183705600 unmapped: 32563200 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:24.207183+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183713792 unmapped: 32555008 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:25.207428+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 261 heartbeat osd_stat(store_statfs(0x1b0730000/0x0/0x1bfc00000, data 0x6afcc5b/0x6cbe000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 54
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183918592 unmapped: 32350208 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:26.207676+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.056585312s of 10.194576263s, submitted: 314
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 261 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2830896 data_alloc: 301989888 data_used: 9379840
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 183672832 unmapped: 32595968 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:27.207876+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184729600 unmapped: 31539200 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 263 handle_osd_map epochs [263,264], i have 263, src has [1,264]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:28.208007+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184737792 unmapped: 31531008 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:29.208181+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184737792 unmapped: 31531008 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:30.208322+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184754176 unmapped: 31514624 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 264 heartbeat osd_stat(store_statfs(0x1b069a000/0x0/0x1bfc00000, data 0x6b8d592/0x6d53000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:31.208525+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2840286 data_alloc: 301989888 data_used: 9392128
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184762368 unmapped: 31506432 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:32.208761+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 265 heartbeat osd_stat(store_statfs(0x1b065a000/0x0/0x1bfc00000, data 0x6bce193/0x6d94000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184901632 unmapped: 31367168 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:33.208902+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:34.209101+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:35.209372+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:36.209523+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 266 heartbeat osd_stat(store_statfs(0x1b0633000/0x0/0x1bfc00000, data 0x6bf21ab/0x6dbb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2853090 data_alloc: 301989888 data_used: 9408512
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 266 heartbeat osd_stat(store_statfs(0x1b0633000/0x0/0x1bfc00000, data 0x6bf21ab/0x6dbb000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 10.160420418s of 10.704147339s, submitted: 202
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:37.209678+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 266 handle_osd_map epochs [266,267], i have 266, src has [1,267]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:38.209796+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:39.209925+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 267 heartbeat osd_stat(store_statfs(0x1b05e9000/0x0/0x1bfc00000, data 0x6c3798a/0x6e04000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:40.210151+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:41.210411+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2861304 data_alloc: 301989888 data_used: 9420800
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184557568 unmapped: 31711232 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:42.210618+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184565760 unmapped: 31703040 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:43.210881+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184590336 unmapped: 31678464 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:44.211089+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184590336 unmapped: 31678464 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:45.211358+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 268 heartbeat osd_stat(store_statfs(0x1b057f000/0x0/0x1bfc00000, data 0x6ca17d3/0x6e6f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184672256 unmapped: 31596544 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:46.211504+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2878422 data_alloc: 301989888 data_used: 9437184
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 184827904 unmapped: 31440896 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.636810303s of 10.001516342s, submitted: 106
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:47.211751+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186015744 unmapped: 30253056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:48.218389+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 270 handle_osd_map epochs [269,270], i have 270, src has [1,270]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186015744 unmapped: 30253056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 270 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:49.218555+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186015744 unmapped: 30253056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:50.218745+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 270 heartbeat osd_stat(store_statfs(0x1b04e1000/0x0/0x1bfc00000, data 0x6d3c932/0x6f0d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 185868288 unmapped: 30400512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:51.218943+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2884072 data_alloc: 301989888 data_used: 9457664
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 185868288 unmapped: 30400512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:52.219128+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 185868288 unmapped: 30400512 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:53.219309+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 ms_handle_reset con 0x55cbaae95c00 session 0x55cba8bde960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 29679616 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:54.219484+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 55
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 heartbeat osd_stat(store_statfs(0x1b04ba000/0x0/0x1bfc00000, data 0x6d6270b/0x6f33000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186589184 unmapped: 29679616 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:55.219708+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186826752 unmapped: 29442048 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:56.219874+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2891762 data_alloc: 301989888 data_used: 9469952
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.687864304s of 10.001472473s, submitted: 420
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186875904 unmapped: 29392896 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:57.220050+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 186875904 unmapped: 29392896 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:58.220272+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 heartbeat osd_stat(store_statfs(0x1b045d000/0x0/0x1bfc00000, data 0x6dbe877/0x6f91000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 187129856 unmapped: 29138944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:59.220497+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 187129856 unmapped: 29138944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:00.220722+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188252160 unmapped: 28016640 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:01.220893+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2899638 data_alloc: 301989888 data_used: 9469952
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188407808 unmapped: 27860992 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:02.221066+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188407808 unmapped: 27860992 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 heartbeat osd_stat(store_statfs(0x1b03f2000/0x0/0x1bfc00000, data 0x6e29899/0x6ffc000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:03.221237+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 27697152 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:04.221400+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 heartbeat osd_stat(store_statfs(0x1b03c7000/0x0/0x1bfc00000, data 0x6e52dfe/0x7027000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 27697152 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:05.221604+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 heartbeat osd_stat(store_statfs(0x1b03ac000/0x0/0x1bfc00000, data 0x6e6e022/0x7042000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188571648 unmapped: 27697152 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:06.221792+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2903978 data_alloc: 301989888 data_used: 9469952
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.802969933s of 10.005166054s, submitted: 41
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188588032 unmapped: 27680768 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:07.221942+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188743680 unmapped: 27525120 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 272 handle_osd_map epochs [272,272], i have 272, src has [1,272]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:08.222109+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 272 heartbeat osd_stat(store_statfs(0x1b0363000/0x0/0x1bfc00000, data 0x6eb972f/0x708b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188751872 unmapped: 27516928 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:09.222304+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188751872 unmapped: 27516928 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:10.222453+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188760064 unmapped: 27508736 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:11.222695+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2913614 data_alloc: 301989888 data_used: 9482240
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188989440 unmapped: 27279360 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:12.222874+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 272 heartbeat osd_stat(store_statfs(0x1b02f9000/0x0/0x1bfc00000, data 0x6f216d0/0x70f5000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 188989440 unmapped: 27279360 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:13.223050+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 28499968 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:14.223212+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 28499968 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:15.223405+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 273 heartbeat osd_stat(store_statfs(0x1b02f5000/0x0/0x1bfc00000, data 0x6f2394d/0x70f8000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 187768832 unmapped: 28499968 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:16.223555+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2922872 data_alloc: 301989888 data_used: 9494528
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 9.225525856s of 10.004955292s, submitted: 129
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189030400 unmapped: 27238400 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:17.223721+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189046784 unmapped: 27222016 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:18.223855+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189046784 unmapped: 27222016 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:19.223996+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 274 heartbeat osd_stat(store_statfs(0x1b02b2000/0x0/0x1bfc00000, data 0x6f64b82/0x713b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189054976 unmapped: 27213824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:20.224151+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189054976 unmapped: 27213824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:21.224344+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2926782 data_alloc: 301989888 data_used: 9506816
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189054976 unmapped: 27213824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:22.224475+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:23.224706+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189054976 unmapped: 27213824 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.7] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.e] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0293000/0x0/0x1bfc00000, data 0x6f85663/0x715b000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:24.224941+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:25.225183+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:26.225393+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b028e000/0x0/0x1bfc00000, data 0x6f878d1/0x715f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2928892 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:27.225541+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:28.225724+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:29.225919+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:30.226892+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189071360 unmapped: 27197440 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b028e000/0x0/0x1bfc00000, data 0x6f878d1/0x715f000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:31.227229+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189079552 unmapped: 27189248 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2928892 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 14.814065933s of 14.990465164s, submitted: 68
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:32.239823+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189079552 unmapped: 27189248 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:33.240042+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189079552 unmapped: 27189248 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:34.240272+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:35.240491+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:36.240658+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2929816 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:37.240873+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:38.241040+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:39.241241+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:40.241467+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:41.242040+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189087744 unmapped: 27181056 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2929816 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:42.242198+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189095936 unmapped: 27172864 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:43.242413+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189095936 unmapped: 27172864 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:44.242667+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189095936 unmapped: 27172864 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:45.242847+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189095936 unmapped: 27172864 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:46.243040+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189095936 unmapped: 27172864 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2929816 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:47.243271+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189104128 unmapped: 27164672 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:48.243496+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189104128 unmapped: 27164672 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:49.243723+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189104128 unmapped: 27164672 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:50.243925+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:51.244177+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2929816 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:52.244399+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:53.247755+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:54.247966+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:55.248127+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:56.248281+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2929816 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:57.248443+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189112320 unmapped: 27156480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:58.248628+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189120512 unmapped: 27148288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:59.248841+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189120512 unmapped: 27148288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:00.249056+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 heartbeat osd_stat(store_statfs(0x1b0288000/0x0/0x1bfc00000, data 0x6f8d245/0x7165000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189120512 unmapped: 27148288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:01.249175+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189120512 unmapped: 27148288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2929816 data_alloc: 301989888 data_used: 9519104
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:02.249334+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189120512 unmapped: 27148288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:03.249484+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189120512 unmapped: 27148288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 32.107299805s of 32.116291046s, submitted: 1
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbabca2c00
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:04.302266+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189128704 unmapped: 27140096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:05.302442+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189923328 unmapped: 26345472 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 276 heartbeat osd_stat(store_statfs(0x1aee18000/0x0/0x1bfc00000, data 0x83fd268/0x85d6000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [0,0,1])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:06.302596+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 276 ms_handle_reset con 0x55cbabca2c00 session 0x55cba9d88960
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: handle_auth_request added challenge on 0x55cbaae8f000
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3078230 data_alloc: 301989888 data_used: 9531392
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _renew_subs
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:07.302781+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 277 handle_osd_map epochs [277,277], i have 277, src has [1,277]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 277 ms_handle_reset con 0x55cbaae8f000 session 0x55cbaab0cd20
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:08.303203+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 277 heartbeat osd_stat(store_statfs(0x1b0280000/0x0/0x1bfc00000, data 0x6f91969/0x716d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:09.303473+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:10.303675+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:11.303958+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2940474 data_alloc: 301989888 data_used: 9531392
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:12.304145+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189825024 unmapped: 26443776 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 277 heartbeat osd_stat(store_statfs(0x1b0280000/0x0/0x1bfc00000, data 0x6f91969/0x716d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.18] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.d] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 scrub-queue::remove_from_osd_queue removing pg[4.13] failed. State was: not registered w/ OSD
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:13.304419+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189685760 unmapped: 26583040 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b0280000/0x0/0x1bfc00000, data 0x6f91969/0x716d000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:14.304611+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:15.304911+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:16.305130+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2943623 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:17.305369+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:18.340578+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:19.340869+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027c000/0x0/0x1bfc00000, data 0x6f93bb7/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:20.341085+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189693952 unmapped: 26574848 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:21.341326+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189702144 unmapped: 26566656 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2943623 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:22.341552+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189710336 unmapped: 26558464 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:23.341724+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore(/var/lib/ceph/osd/ceph-3) _kv_sync_thread utilization: idle 18.912734985s of 19.336259842s, submitted: 114
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 ms_handle_reset con 0x55cba7799800 session 0x55cbaac3b0e0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190554112 unmapped: 25714688 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:24.341964+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Got map version 56
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:25.342193+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:26.342380+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:27.342550+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:28.342732+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:29.342889+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:30.343068+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:31.343308+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:32.343591+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:33.343767+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:34.343946+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190136320 unmapped: 26132480 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:35.344128+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:36.344278+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:37.344449+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:38.344563+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:39.344682+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:40.344832+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:41.344997+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:42.345149+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190144512 unmapped: 26124288 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:43.345324+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190152704 unmapped: 26116096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:44.345534+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190152704 unmapped: 26116096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:45.345775+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190152704 unmapped: 26116096 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:46.345937+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:47.346093+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:48.346273+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:49.346423+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:50.346584+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:51.346781+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:52.347011+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:53.347144+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190160896 unmapped: 26107904 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:54.347378+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190169088 unmapped: 26099712 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:55.347623+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190169088 unmapped: 26099712 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:56.347830+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190169088 unmapped: 26099712 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:57.347952+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190169088 unmapped: 26099712 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:58.348153+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190169088 unmapped: 26099712 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:59.348367+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190177280 unmapped: 26091520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:00.348572+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190177280 unmapped: 26091520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:01.348723+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190177280 unmapped: 26091520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:02.349060+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190177280 unmapped: 26091520 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:03.349229+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:04.349402+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:05.349721+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:06.349995+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:07.350223+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:08.350350+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:09.350504+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190185472 unmapped: 26083328 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:10.350624+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190193664 unmapped: 26075136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:11.350777+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190193664 unmapped: 26075136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:12.350910+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190193664 unmapped: 26075136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:13.351062+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190193664 unmapped: 26075136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:14.351215+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190193664 unmapped: 26075136 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:15.351389+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190201856 unmapped: 26066944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:16.351534+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190201856 unmapped: 26066944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:17.351741+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190201856 unmapped: 26066944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:18.351902+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190201856 unmapped: 26066944 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:19.352036+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:20.352199+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:21.352412+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:22.352500+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:23.352674+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:24.352855+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:25.353088+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190210048 unmapped: 26058752 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:26.353237+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:27.353531+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:28.353694+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:29.353930+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:30.354105+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:31.354265+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:32.354396+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:33.354610+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:34.354875+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:35.355071+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:36.355310+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:37.355495+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190226432 unmapped: 26042368 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:38.355702+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190234624 unmapped: 26034176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:39.355887+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190234624 unmapped: 26034176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:40.356032+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190234624 unmapped: 26034176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:41.356229+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190234624 unmapped: 26034176 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:42.356403+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:43.356545+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:44.356731+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:45.356894+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:46.357092+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:47.357296+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:48.357443+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:49.357571+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190242816 unmapped: 26025984 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:50.357748+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190251008 unmapped: 26017792 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:51.357912+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190251008 unmapped: 26017792 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:52.358110+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190251008 unmapped: 26017792 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:53.358317+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190251008 unmapped: 26017792 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:54.358511+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190251008 unmapped: 26017792 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:55.358832+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190259200 unmapped: 26009600 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:56.359054+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190259200 unmapped: 26009600 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:57.359285+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190259200 unmapped: 26009600 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:58.359595+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:59.359802+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:00.359961+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:01.360161+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:02.360313+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:03.360496+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:04.360739+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:05.360962+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190267392 unmapped: 26001408 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:06.361171+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190275584 unmapped: 25993216 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:07.361360+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190275584 unmapped: 25993216 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:08.361484+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190275584 unmapped: 25993216 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:09.361688+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190275584 unmapped: 25993216 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:10.361840+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:11.361988+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:12.362125+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:13.362331+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:14.362492+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:15.362646+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:16.362770+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190283776 unmapped: 25985024 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:17.362994+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: bluestore.MempoolThread(0x55cba6a1bb60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2942967 data_alloc: 301989888 data_used: 9543680
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 190291968 unmapped: 25976832 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:18.363146+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'config diff' '{prefix=config diff}'
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'config show' '{prefix=config show}'
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'counter dump' '{prefix=counter dump}'
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'counter schema' '{prefix=counter schema}'
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189857792 unmapped: 26411008 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:19.363306+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: prioritycache tune_memory target: 5709084876 mapped: 189382656 unmapped: 26886144 heap: 216268800 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: osd.3 278 heartbeat osd_stat(store_statfs(0x1b027d000/0x0/0x1bfc00000, data 0x6f93dca/0x7171000, compress 0x0/0x0/0x0, omap 0x647, meta 0x880f9b9), peers [0,1,2,4,5] op hist [])
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: tick
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_tickets
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:20.363419+0000)
Feb 20 10:06:51 np0005625204.localdomain ceph-osd[33177]: do_command 'log dump' '{prefix=log dump}'
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2432434700' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/154490225' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.98795 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.59164 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3386122060' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.49962 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3031087465' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.98816 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1093232064' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.59179 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.49974 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2432434700' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3249238967' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.59191 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/106052076' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:51 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/154490225' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2761767677' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain crontab[327235]: (root) LIST (root)
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3920137714' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.49986 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.98828 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: pgmap v744: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.59200 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3152040596' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1001056526' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.49998 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.98840 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.59218 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2761767677' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1141553942' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2885617671' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.50019 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:52 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3920137714' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3803234465' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/2062908097' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.98861 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.98873 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.59251 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3803234465' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.50040 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2258020461' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3675822213' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.98888 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2415945841' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2062908097' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1921623941' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 20 10:06:53 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2840627815' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.
Feb 20 10:06:54 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "node ls"} v 0)
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4189794073' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain podman[327450]: 2026-02-20 10:06:54.159857843 +0000 UTC m=+0.091761391 container health_status 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 20 10:06:54 np0005625204.localdomain podman[327451]: 2026-02-20 10:06:54.201764122 +0000 UTC m=+0.132033549 container health_status ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 20 10:06:54 np0005625204.localdomain podman[327450]: 2026-02-20 10:06:54.221291248 +0000 UTC m=+0.153194796 container exec_died 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:06:54 np0005625204.localdomain systemd[1]: 67e88b76de81fa03ac4bcb8fe51c8c3a933bf19f33174b03babb9e0afa2e3c2f.service: Deactivated successfully.
Feb 20 10:06:54 np0005625204.localdomain podman[327451]: 2026-02-20 10:06:54.243893117 +0000 UTC m=+0.174162534 container exec_died ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 20 10:06:54 np0005625204.localdomain systemd[1]: ee687cf55bcb163c79f18bfdf8642f7bed2b16940ecf1c757247b1534cf50916.service: Deactivated successfully.
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/494065428' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2467835095' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: pgmap v745: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/14354088' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3660405424' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3733402714' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/700815640' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4189794073' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.98918 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1041369153' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3147845967' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1942123242' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/494065428' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3962166500' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2467835095' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/638926905' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3537827992' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb 20 10:06:54 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3699056303' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4095727606' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2854468367' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 85 handle_osd_map epochs [86,86], i have 85, src has [1,86]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 80.883834839s of 80.907012939s, submitted: 7
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Terminating session with v2:172.18.0.105:6800/2290533109
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 ms_handle_reset con 0x55bf8eef6000 session 0x55bf904332c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba29d000/0x0/0x1bfc00000, data 0x1b73369/0x1bf1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79781888 unmapped: 3022848 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:37.901229+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79781888 unmapped: 3022848 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:38.901464+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79781888 unmapped: 3022848 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:39.901882+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79781888 unmapped: 3022848 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:40.902099+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79781888 unmapped: 3022848 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:41.902483+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 21
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:42.902856+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:43.903378+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:44.903582+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:45.904235+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:46.904502+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:47.904800+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:48.905038+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:49.905244+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:50.905410+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:51.905602+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:52.905840+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:53.906011+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:54.906265+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:55.906497+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:56.906732+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:57.907016+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:58.907196+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:42:59.907401+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:00.907692+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:01.907936+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:02.908153+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:03.908336+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:04.908589+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:05.909109+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:06.909298+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:07.909916+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 heartbeat osd_stat(store_statfs(0x1ba299000/0x0/0x1bfc00000, data 0x1b752e7/0x1bf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:08.910142+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:09.910320+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 22
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 733118 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92124800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 86 handle_osd_map epochs [87,87], i have 86, src has [1,87]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 33.102695465s of 33.116134644s, submitted: 5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79593472 unmapped: 3211264 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:10.910458+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 23
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.104:6800/2990345946,v1:172.18.0.104:6801/2990345946]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Starting new session with [v2:172.18.0.104:6800/2990345946,v1:172.18.0.104:6801/2990345946]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: get_auth_request con 0x55bf92125c00 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79388672 unmapped: 3416064 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:11.910583+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 24
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.104:6800/2990345946,v1:172.18.0.104:6801/2990345946]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79314944 unmapped: 3489792 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:12.910714+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 87 heartbeat osd_stat(store_statfs(0x1ba296000/0x0/0x1bfc00000, data 0x1b7733d/0x1bf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79314944 unmapped: 3489792 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:13.910876+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 25
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.104:6800/2990345946,v1:172.18.0.104:6801/2990345946]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:14.911007+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 736090 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 87 heartbeat osd_stat(store_statfs(0x1ba296000/0x0/0x1bfc00000, data 0x1b7733d/0x1bf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:15.911162+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:16.911301+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 87 heartbeat osd_stat(store_statfs(0x1ba296000/0x0/0x1bfc00000, data 0x1b7733d/0x1bf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:17.911453+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:18.911595+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 87 heartbeat osd_stat(store_statfs(0x1ba296000/0x0/0x1bfc00000, data 0x1b7733d/0x1bf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:19.911842+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 736090 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:20.912048+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:21.912246+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 87 heartbeat osd_stat(store_statfs(0x1ba296000/0x0/0x1bfc00000, data 0x1b7733d/0x1bf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:22.912407+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79552512 unmapped: 3252224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:23.912598+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 87 handle_osd_map epochs [88,88], i have 87, src has [1,88]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.808965683s of 13.826190948s, submitted: 5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 26
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Terminating session with v2:172.18.0.104:6800/2990345946
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 ms_handle_reset con 0x55bf92124800 session 0x55bf904674a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79839232 unmapped: 2965504 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:24.912793+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 27
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: get_auth_request con 0x55bf91d86400 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80052224 unmapped: 2752512 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:25.913011+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 28
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80052224 unmapped: 2752512 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:26.913175+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80052224 unmapped: 2752512 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:27.913332+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 29
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80052224 unmapped: 2752512 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:28.913474+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80052224 unmapped: 2752512 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:29.913680+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 30
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:30.914007+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:31.914187+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:32.914402+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:33.914588+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:34.914981+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:35.915250+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:36.915575+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:37.915865+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:38.916042+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:39.916267+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 10 from mon.np0005625204 (according to old e10)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 10
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:44:10.215299+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
                                                          3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: ms_handle_reset current mon [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _reopen_session rank -1
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _add_conns ranks=[0,2,1,3]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625201 con 0x55bf91f9d400 addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625203 con 0x55bf91f9d800 addr [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625204 con 0x55bf91f9dc00 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 ms_handle_reset con 0x55bf92125000 session 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf91f9d800 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf91f9d400 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf91f9dc00 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_done global_id 14319 payload 293
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_hunting 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: found mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625203 at v2:172.18.0.107:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_auth 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:40.237531+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: get_auth_request con 0x55bf91f9d800 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: --2- [v2:172.18.0.108:6800/2098983975,v1:172.18.0.108:6801/2098983975] >> [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] conn(0x55bf91f9d800 0x55bf8ee57180 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request get_initial_auth_request returned -2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: ms_handle_reset current mon [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _reopen_session rank -1
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _add_conns ranks=[3]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625202 con 0x55bf92125000 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 ms_handle_reset con 0x55bf91f9d800 session 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf92125000 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_done global_id 14319 payload 293
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_hunting 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: found mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_auth 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:40.442266+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 10 from mon.np0005625202 (according to old e10)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 10
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:44:10.215299+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005625203
                                                          3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_config config(7 keys)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: set_mon_vals no callback set
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 30
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:40.916507+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:41.916714+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:42.916959+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:43.917159+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:44.917313+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:45.917461+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:46.917595+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:47.917793+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:48.917936+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:49.918103+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:50.918222+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:51.918396+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:52.918555+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:53.918740+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:54.918928+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:55.919094+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:56.919281+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:57.919506+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:58.919739+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:43:59.919935+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:00.920109+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:01.920263+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:02.920397+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:03.920567+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:04.920737+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:05.920902+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:06.921092+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:07.921301+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:08.921520+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:09.921689+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:10.921835+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:11.921967+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:12.922072+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79822848 unmapped: 2981888 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 11 from mon.np0005625202 (according to old e11)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 11
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:44:43.337910+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: ms_handle_reset current mon [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _reopen_session rank -1
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _add_conns ranks=[1,0,2]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625204 con 0x55bf90caf800 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625201 con 0x55bf921e1000 addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625202 con 0x55bf91f99000 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 ms_handle_reset con 0x55bf92125000 session 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf90caf800 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_done global_id 14319 payload 293
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_hunting 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: found mon.np0005625204
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_auth 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:13.356984+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 11 from mon.np0005625204 (according to old e11)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 11
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:44:43.337910+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_config config(7 keys)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: set_mon_vals no callback set
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 30
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1027089384,v1:172.18.0.106:6811/1027089384]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:13.922287+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:14.922472+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:15.922706+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:16.922893+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:17.923080+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:18.923222+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:19.923343+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:20.923554+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:21.923784+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:22.923954+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:23.924156+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:24.924357+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:25.924498+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:26.924657+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:27.924828+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:28.924981+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:29.925145+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:30.925349+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:31.925515+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:32.925715+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:33.925907+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:34.926078+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:35.926257+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:36.926436+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:37.926652+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:38.926875+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:39.927042+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:40.927175+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:41.927688+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:42.927866+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:43.928083+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:44.928289+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 738918 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:45.928586+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79847424 unmapped: 2957312 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 heartbeat osd_stat(store_statfs(0x1ba293000/0x0/0x1bfc00000, data 0x1b7975d/0x1bfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x3d6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625204 at v2:172.18.0.108:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 31
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Terminating session with v2:172.18.0.106:6810/1027089384
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 82.470855713s of 82.495124817s, submitted: 5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 ms_handle_reset con 0x55bf8eef6000 session 0x55bf90466960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:46.928778+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80052224 unmapped: 2752512 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 32
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: get_auth_request con 0x55bf92123c00 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:47.929105+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 79814656 unmapped: 2990080 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 33
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:48.929336+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80003072 unmapped: 2801664 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:49.930165+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 34
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80150528 unmapped: 2654208 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:50.930329+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80150528 unmapped: 2654208 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 35
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:51.930512+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80297984 unmapped: 2506752 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 36
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:52.930668+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:53.930799+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:54.931014+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:55.931255+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:56.931468+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:57.931615+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:58.931771+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:44:59.931941+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:00.932146+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4213092245' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:01.932272+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:02.932453+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:03.932707+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:04.932972+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:05.933223+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:06.933377+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:07.933575+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:08.933775+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 12 from mon.np0005625204 (according to old e12)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 12
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:39.346453+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
                                                          1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:09.933956+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:10.934145+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:11.934323+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:12.934555+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:13.935833+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:14.936009+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:15.936705+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80322560 unmapped: 2482176 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 13 from mon.np0005625204 (according to old e13)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 13
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:46.327222+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: ms_handle_reset current mon [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _reopen_session rank -1
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _add_conns ranks=[2,0,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625203 con 0x55bf921e0400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625204 con 0x55bf91d85400 addr [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625202 con 0x55bf91f99000 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 ms_handle_reset con 0x55bf90caf800 session 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf91f99000 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf91d85400 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_done global_id 14319 payload 293
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_hunting 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: found mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_auth 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:16.352018+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 13 from mon.np0005625202 (according to old e13)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 13
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:46.327222+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_config config(7 keys)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: set_mon_vals no callback set
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 36
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:16.937101+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:17.937443+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:18.937743+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:19.938098+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:20.938594+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:21.938734+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:23.466206+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:24.466424+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain systemd-journald[48359]: Data hash table of /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 20 10:06:55 np0005625204.localdomain systemd-journald[48359]: /run/log/journal/01f46965e72fd8a157841feaa66c8d52/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 20 10:06:55 np0005625204.localdomain rsyslogd[758]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:25.466660+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:26.466961+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:27.467145+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 14 from mon.np0005625202 (according to old e14)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 14
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:45:57.556107+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
                                                          3: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625201
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:28.467384+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:29.467613+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:30.467773+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:31.467927+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:32.468177+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:33.468485+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:34.468815+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:35.468980+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:36.469281+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:37.469492+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 15 from mon.np0005625202 (according to old e15)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 15
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:08.177805+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005625204
                                                          1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:38.469783+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:39.469911+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:40.470098+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:41.470301+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:42.470473+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:43.470608+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:44.470867+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:45.471087+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:46.471279+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:47.471423+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:48.471611+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:49.471861+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:50.472840+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:51.473165+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:52.474001+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:53.474158+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80519168 unmapped: 2285568 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 16 from mon.np0005625202 (according to old e16)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 16
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:24.360760+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: ms_handle_reset current mon [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _reopen_session rank -1
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _add_conns ranks=[0,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625202 con 0x55bf90caf800 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): picked mon.np0005625203 con 0x55bf921e0400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): start opening mon connection
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 ms_handle_reset con 0x55bf91f99000 session 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf921e0400 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request con 0x55bf90caf800 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth method 2
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): _init_auth already have auth, reseting
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient(hunting): handle_auth_done global_id 14319 payload 293
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_hunting 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: found mon.np0005625202
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_auth 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:54.392133+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 16 from mon.np0005625202 (according to old e16)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 16
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:24.360760+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_config config(7 keys)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: set_mon_vals no callback set
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 36
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:54.474344+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:55.475796+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:56.475975+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:57.476122+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:58.476354+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:45:59.476564+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:00.476719+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:01.476872+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:02.477007+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:03.477169+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:04.477508+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:05.477828+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:06.478053+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:07.478218+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:08.478367+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:09.478497+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:10.478759+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:11.478945+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:12.479896+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e8f000/0x0/0x1bfc00000, data 0x1b7bcd5/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:13.480088+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:14.480257+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:15.480403+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 742294 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:16.480605+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_monmap mon_map magic: 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient:  got monmap 17 from mon.np0005625202 (according to old e17)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: dump:
                                                          epoch 17
                                                          fsid a8557ee9-b55d-5519-942c-cf8f6172f1d8
                                                          last_changed 2026-02-20T09:46:46.606881+0000
                                                          created 2026-02-20T07:36:51.191305+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005625202
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005625203
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005625204
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.197715759s of 90.223068237s, submitted: 6
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:17.480811+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:18.480952+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e90000/0x0/0x1bfc00000, data 0x1b7bdef/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:19.481109+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:20.481270+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e90000/0x0/0x1bfc00000, data 0x1b7bdef/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 741206 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:21.481389+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 37
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/2084071713,v1:172.18.0.107:6811/2084071713]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:22.481522+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:23.481672+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e90000/0x0/0x1bfc00000, data 0x1b7bdef/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:24.481831+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:25.482036+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 741206 data_alloc: 285212672 data_used: 135168
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:26.482171+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:27.482313+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 38
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/2084071713
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 heartbeat osd_stat(store_statfs(0x1b9e90000/0x0/0x1bfc00000, data 0x1b7bdef/0x1bfe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 89 handle_osd_map epochs [90,90], i have 90, src has [1,90]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.451988220s of 11.463270187s, submitted: 3
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 ms_handle_reset con 0x55bf91f9d400 session 0x55bf9166e1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:28.482496+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80535552 unmapped: 2269184 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 39
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: get_auth_request con 0x55bf91f99000 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:29.482694+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:30.483114+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80543744 unmapped: 2260992 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:31.483239+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80691200 unmapped: 2113536 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:32.483363+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80691200 unmapped: 2113536 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:33.483597+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80691200 unmapped: 2113536 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:34.483721+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 41
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:35.483859+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:36.483992+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:37.484132+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:38.484303+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:39.484429+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:40.484581+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:41.484749+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80207872 unmapped: 2596864 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:42.484913+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:43.485054+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:44.485200+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:45.489728+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:46.489873+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:47.490022+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:48.490190+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:49.490391+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:50.490550+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:51.490711+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80224256 unmapped: 2580480 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 42
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/689946273,v1:172.18.0.108:6811/689946273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:52.490872+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:53.491061+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:54.491216+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:55.491338+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:56.492685+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:57.492836+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:58.493018+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:46:59.493200+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:00.493302+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:01.493429+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:02.493581+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:03.493715+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:04.493866+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:05.494008+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:06.494159+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:07.494329+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:08.494533+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:09.494727+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:10.494887+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:11.495076+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:12.495250+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:13.495407+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:14.495602+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:15.495790+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:16.495949+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:17.496080+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:18.496294+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:19.496441+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:20.496610+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:21.496796+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:22.497006+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:23.497178+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:24.497390+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:25.497585+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:26.497738+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:27.497906+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80232448 unmapped: 2572288 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:28.498098+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:29.498273+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:30.498464+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:31.498619+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:32.498853+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:33.499050+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:34.499235+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:35.499392+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:36.499548+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:37.499714+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:38.499928+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:39.500075+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:40.500274+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:41.500465+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:42.500660+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:43.500830+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:44.501005+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:45.501196+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:46.501388+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:47.501533+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:48.501782+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:49.501993+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:50.502153+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80240640 unmapped: 2564096 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 745202 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:51.502363+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 heartbeat osd_stat(store_statfs(0x1b9e8c000/0x0/0x1bfc00000, data 0x1b7e2f9/0x1c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80248832 unmapped: 2555904 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:52.502509+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80248832 unmapped: 2555904 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 43
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now 
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Terminating session with v2:172.18.0.108:6810/689946273
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect No active mgr available yet
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 85.016281128s of 85.029273987s, submitted: 3
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 ms_handle_reset con 0x55bf91db6c00 session 0x55bf8f8fda40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91d85400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:53.502654+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 44
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc reconnect Starting new session with [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: get_auth_request con 0x55bf90cafc00 auth_method 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_configure stats_period=5
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:54.502830+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:55.502995+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:56.503141+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80527360 unmapped: 2277376 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:57.503325+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80502784 unmapped: 2301952 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:58.503566+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 45
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:47:59.503741+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:00.503916+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:01.504044+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:02.504215+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:03.504357+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:04.504505+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:05.504681+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:06.504836+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:07.504957+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:08.505046+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:09.505188+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:10.505327+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:11.505466+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:12.505606+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:13.505716+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:14.505917+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:15.506072+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:16.506203+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:17.506346+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:18.506521+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:19.506694+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:20.506884+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:21.507058+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:22.507230+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:23.507394+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:24.507563+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:25.507719+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:26.507891+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:27.508099+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80363520 unmapped: 2441216 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:28.508381+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:29.508619+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:30.508924+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:31.509161+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:32.509476+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:33.509854+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:34.510034+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:35.510171+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:36.510360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:37.510521+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:38.510748+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:39.510938+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:40.511141+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:41.511342+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:42.511518+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80371712 unmapped: 2433024 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:43.511714+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:44.511898+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:45.512087+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:46.512211+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:47.512363+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:48.512495+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:49.512615+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:50.512826+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80379904 unmapped: 2424832 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:51.512972+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:52.513136+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:53.513278+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:54.513434+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:55.676713+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:56.676915+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:57.677137+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:58.677420+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:48:59.677651+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:00.677844+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:01.678041+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:02.678267+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:03.678429+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:04.678587+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:05.678733+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80388096 unmapped: 2416640 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:06.678869+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:07.679022+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:08.679208+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:09.679352+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:10.679485+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:11.679670+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:12.679843+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:13.679956+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:14.680104+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80396288 unmapped: 2408448 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:15.680261+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:16.680446+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5069 writes, 22K keys, 5069 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5069 writes, 696 syncs, 7.28 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 90 writes, 307 keys, 90 commit groups, 1.0 writes per commit group, ingest: 0.37 MB, 0.00 MB/s
                                                          Interval WAL: 90 writes, 39 syncs, 2.31 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:17.680591+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:18.680805+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:19.680950+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:20.681089+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:21.681291+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 748410 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80acb/0x1c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:22.681442+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80404480 unmapped: 2400256 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 90.125976562s of 90.156661987s, submitted: 10
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:23.681593+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 46
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80576512 unmapped: 2228224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:24.681740+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80576512 unmapped: 2228224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:25.681884+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80576512 unmapped: 2228224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:26.682090+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 747498 data_alloc: 285212672 data_used: 143360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80576512 unmapped: 2228224 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:27.682234+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80584704 unmapped: 2220032 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 heartbeat osd_stat(store_statfs(0x1b9e88000/0x0/0x1bfc00000, data 0x1b80c03/0x1c06000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:28.682422+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80601088 unmapped: 2203648 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:29.682563+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 80699392 unmapped: 2105344 heap: 82804736 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:30.682707+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81821696 unmapped: 2031616 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:31.682863+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 755693 data_alloc: 285212672 data_used: 155648
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 ms_handle_reset con 0x55bf91f9d800 session 0x55bf90432b40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:32.683029+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:33.683923+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:34.684071+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:35.684285+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:36.684787+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:37.684940+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:38.685123+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:39.685274+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:40.685422+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:41.685560+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:42.685750+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:43.685968+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:44.686379+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:45.687240+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:46.687386+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:47.687542+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:48.687718+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:49.687863+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:50.688008+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:51.688161+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:52.688345+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:53.688480+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:54.688619+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:55.688796+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:56.688933+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:57.689079+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:58.689268+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:49:59.689418+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:00.689567+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:01.689717+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:02.689854+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:03.689990+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:04.690146+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:05.690606+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:06.690760+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:07.691690+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:08.694026+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:09.694171+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:10.694320+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:11.695399+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:12.696146+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:13.696261+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:14.696397+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:15.696536+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:16.696670+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:17.696820+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:18.696973+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:19.697113+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:20.697259+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:21.697424+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:22.697611+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:23.697802+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:24.697979+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:25.698129+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:26.698267+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 761589 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:27.698381+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:28.698542+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 81936384 unmapped: 1916928 heap: 83853312 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:29.698681+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 66.440071106s of 66.617134094s, submitted: 39
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9e7d000/0x0/0x1bfc00000, data 0x1b8531e/0x1c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9dc00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 ms_handle_reset con 0x55bf91f9dc00 session 0x55bf90c9de00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 82599936 unmapped: 12820480 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:30.698818+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9031000/0x0/0x1bfc00000, data 0x29d231e/0x2a5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 82599936 unmapped: 12820480 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:31.698982+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 865271 data_alloc: 285212672 data_used: 172032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 82599936 unmapped: 12820480 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 ms_handle_reset con 0x55bf91f9d400 session 0x55bf916a0000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:32.699121+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92123c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 82649088 unmapped: 12771328 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:33.699256+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf921e1000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9030000/0x0/0x1bfc00000, data 0x29d2341/0x2a5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83058688 unmapped: 12361728 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:34.699393+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9030000/0x0/0x1bfc00000, data 0x29d2341/0x2a5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83058688 unmapped: 12361728 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:35.699533+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83058688 unmapped: 12361728 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:36.699704+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 871561 data_alloc: 285212672 data_used: 696320
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83058688 unmapped: 12361728 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:37.700276+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83058688 unmapped: 12361728 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:38.700447+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83058688 unmapped: 12361728 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:39.700622+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b9030000/0x0/0x1bfc00000, data 0x29d2341/0x2a5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83066880 unmapped: 12353536 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:40.700789+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83066880 unmapped: 12353536 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:41.700913+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 871561 data_alloc: 285212672 data_used: 696320
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 83066880 unmapped: 12353536 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:42.701042+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.932395935s of 13.059148788s, submitted: 23
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 84918272 unmapped: 10502144 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:43.701194+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf921e0400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 84041728 unmapped: 11378688 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 heartbeat osd_stat(store_statfs(0x1b854e000/0x0/0x1bfc00000, data 0x34b4341/0x3540000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:44.701928+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 84271104 unmapped: 11149312 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:45.702087+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 94 ms_handle_reset con 0x55bf921e0400 session 0x55bf90466d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 84287488 unmapped: 11132928 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:46.702232+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 972978 data_alloc: 285212672 data_used: 724992
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 94 ms_handle_reset con 0x55bf921e1000 session 0x55bf923541e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 94 ms_handle_reset con 0x55bf92123c00 session 0x55bf9166f860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 84336640 unmapped: 11083776 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:47.702299+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 85401600 unmapped: 10018816 heap: 95420416 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:48.702442+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 95 ms_handle_reset con 0x55bf91db6c00 session 0x55bf916910e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 89202688 unmapped: 30932992 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:49.702567+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 95 ms_handle_reset con 0x55bf91f9d400 session 0x55bf916905a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 95 heartbeat osd_stat(store_statfs(0x1b67ef000/0x0/0x1bfc00000, data 0x520bee5/0x529f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9dc00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 89120768 unmapped: 31014912 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 95 handle_osd_map epochs [95,96], i have 95, src has [1,96]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:50.702691+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 96 ms_handle_reset con 0x55bf91f9d800 session 0x55bf916914a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 96 ms_handle_reset con 0x55bf91f9dc00 session 0x55bf923550e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 89047040 unmapped: 31088640 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:51.702818+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 ms_handle_reset con 0x55bf91f9d800 session 0x55bf8f52af00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 ms_handle_reset con 0x55bf91db6c00 session 0x55bf91691680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1218743 data_alloc: 285212672 data_used: 1019904
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 heartbeat osd_stat(store_statfs(0x1b67de000/0x0/0x1bfc00000, data 0x5219385/0x52ae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 ms_handle_reset con 0x55bf91f9d400 session 0x55bf916461e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92123c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 ms_handle_reset con 0x55bf92123c00 session 0x55bf91644f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 89030656 unmapped: 31105024 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:52.702999+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.145498276s of 10.007163048s, submitted: 207
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 ms_handle_reset con 0x55bf91db6c00 session 0x55bf91644000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 86024192 unmapped: 34111488 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:53.703165+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 heartbeat osd_stat(store_statfs(0x1b8137000/0x0/0x1bfc00000, data 0x38c2362/0x3956000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:54.703324+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 85852160 unmapped: 34283520 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:55.703451+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 86007808 unmapped: 34127872 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 98 handle_osd_map epochs [98,98], i have 98, src has [1,98]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 98 ms_handle_reset con 0x55bf91f9d400 session 0x55bf91a8c000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:56.703576+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 86958080 unmapped: 33177600 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1127204 data_alloc: 285212672 data_used: 212992
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:57.703732+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 86958080 unmapped: 33177600 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 98 ms_handle_reset con 0x55bf91f9d800 session 0x55bf91a8c3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9dc00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 99 ms_handle_reset con 0x55bf91f9dc00 session 0x55bf91a8c5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf921e1000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 99 ms_handle_reset con 0x55bf921e1000 session 0x55bf91a8c780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf921e1000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 99 heartbeat osd_stat(store_statfs(0x1b7631000/0x0/0x1bfc00000, data 0x43c4b12/0x445c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:58.703927+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 86925312 unmapped: 33210368 heap: 120135680 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 ms_handle_reset con 0x55bf91db6c00 session 0x55bf90458f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:50:59.704016+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109682688 unmapped: 14303232 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 ms_handle_reset con 0x55bf921e1000 session 0x55bf91a8c960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:00.704129+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109895680 unmapped: 14090240 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 heartbeat osd_stat(store_statfs(0x1b5ecb000/0x0/0x1bfc00000, data 0x5b25f3b/0x5bc3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 100 handle_osd_map epochs [101,101], i have 101, src has [1,101]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 101 ms_handle_reset con 0x55bf91f9d400 session 0x55bf91a8cd20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:01.704278+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109830144 unmapped: 14155776 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1387416 data_alloc: 301989888 data_used: 20918272
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:02.704434+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 110002176 unmapped: 13983744 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.156674385s of 10.017887115s, submitted: 227
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 102 ms_handle_reset con 0x55bf91f9d800 session 0x55bf91a8cf00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9dc00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:03.704592+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90136576 unmapped: 33849344 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:04.705703+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90169344 unmapped: 33816576 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:05.705826+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90169344 unmapped: 33816576 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:06.706016+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90169344 unmapped: 33816576 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 102 heartbeat osd_stat(store_statfs(0x1b7bb3000/0x0/0x1bfc00000, data 0x3df6679/0x3e95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1114625 data_alloc: 285212672 data_used: 344064
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 102 heartbeat osd_stat(store_statfs(0x1b7bb3000/0x0/0x1bfc00000, data 0x3df6679/0x3e95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:07.706172+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90226688 unmapped: 33759232 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:08.706384+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90243072 unmapped: 33742848 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:09.706572+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90243072 unmapped: 33742848 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:10.706741+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90243072 unmapped: 33742848 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 103 heartbeat osd_stat(store_statfs(0x1b7bf4000/0x0/0x1bfc00000, data 0x3df88c7/0x3e99000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:11.706873+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 90963968 unmapped: 33021952 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1193143 data_alloc: 285212672 data_used: 356352
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:12.707114+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 92725248 unmapped: 31260672 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.722745895s of 10.068332672s, submitted: 91
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 103 ms_handle_reset con 0x55bf91f9dc00 session 0x55bf91a8d680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9dc00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:13.707273+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 94568448 unmapped: 29417472 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:14.707447+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 94887936 unmapped: 29097984 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:15.707589+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 93675520 unmapped: 30310400 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:16.707737+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 103 heartbeat osd_stat(store_statfs(0x1b6c39000/0x0/0x1bfc00000, data 0x4dad8c7/0x4e4e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 94642176 unmapped: 29343744 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1246645 data_alloc: 285212672 data_used: 389120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:17.707892+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 94642176 unmapped: 29343744 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:18.708074+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 94642176 unmapped: 29343744 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:19.708259+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 94650368 unmapped: 29335552 heap: 123985920 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 104 ms_handle_reset con 0x55bf91db6c00 session 0x55bf8f8fd2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:20.708373+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111460352 unmapped: 19570688 heap: 131031040 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 104 ms_handle_reset con 0x55bf91f9dc00 session 0x55bf904672c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 104 ms_handle_reset con 0x55bf91f9d400 session 0x55bf904332c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 104 heartbeat osd_stat(store_statfs(0x1b7399000/0x0/0x1bfc00000, data 0x442bc2f/0x44ce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf921e1000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:21.708568+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 103530496 unmapped: 29925376 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 104 handle_osd_map epochs [104,105], i have 104, src has [1,105]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 105 ms_handle_reset con 0x55bf91f9d800 session 0x55bf91a8d4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 105 ms_handle_reset con 0x55bf921e1000 session 0x55bf91a90d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1218506 data_alloc: 285212672 data_used: 282624
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:22.708708+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 103530496 unmapped: 29925376 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 105 handle_osd_map epochs [105,106], i have 105, src has [1,106]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 ms_handle_reset con 0x55bf919d8c00 session 0x55bf91a8d860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 ms_handle_reset con 0x55bf91db6c00 session 0x55bf91691c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 ms_handle_reset con 0x55bf91f9d400 session 0x55bf91642960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 ms_handle_reset con 0x55bf91f9d800 session 0x55bf916425a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:23.708897+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 103587840 unmapped: 29868032 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:24.709098+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.877096176s of 11.546472549s, submitted: 225
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 103587840 unmapped: 29868032 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9dc00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 ms_handle_reset con 0x55bf91f9dc00 session 0x55bf90cb30e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:25.709307+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96206848 unmapped: 37249024 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:26.709450+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96206848 unmapped: 37249024 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 heartbeat osd_stat(store_statfs(0x1b7a4a000/0x0/0x1bfc00000, data 0x370f34c/0x37b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1099029 data_alloc: 285212672 data_used: 290816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:27.709585+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.16] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96206848 unmapped: 37249024 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:28.709791+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96206848 unmapped: 37249024 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 ms_handle_reset con 0x55bf919d8c00 session 0x55bf8ffef2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:29.709924+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 handle_osd_map epochs [107,107], i have 107, src has [1,107]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 110731264 unmapped: 22724608 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:30.710063+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 heartbeat osd_stat(store_statfs(0x1b71aa000/0x0/0x1bfc00000, data 0x483d59a/0x48e4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 ms_handle_reset con 0x55bf91db6c00 session 0x55bf90434d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111689728 unmapped: 21766144 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:31.710279+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111689728 unmapped: 21766144 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1302788 data_alloc: 301989888 data_used: 11825152
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 heartbeat osd_stat(store_statfs(0x1b6dfc000/0x0/0x1bfc00000, data 0x4beb59a/0x4c92000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 ms_handle_reset con 0x55bf91f9d400 session 0x55bf91a8dc20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:32.710470+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111738880 unmapped: 21716992 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:33.710628+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111820800 unmapped: 21635072 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:34.710859+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 107552768 unmapped: 25903104 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.820917130s of 10.237454414s, submitted: 107
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 ms_handle_reset con 0x55bf91f9d800 session 0x55bf9166eb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:35.711051+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98287616 unmapped: 35168256 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b818e000/0x0/0x1bfc00000, data 0x385792a/0x3900000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:36.711214+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98746368 unmapped: 34709504 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 heartbeat osd_stat(store_statfs(0x1b818e000/0x0/0x1bfc00000, data 0x385792a/0x3900000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1132775 data_alloc: 285212672 data_used: 1372160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:37.711397+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 97927168 unmapped: 35528704 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 ms_handle_reset con 0x55bf8eef6000 session 0x55bf9169ab40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:38.711576+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 97918976 unmapped: 35536896 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 ms_handle_reset con 0x55bf8eef6000 session 0x55bf916a12c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:39.711751+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96763904 unmapped: 36691968 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:40.711949+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96763904 unmapped: 36691968 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b9663000/0x0/0x1bfc00000, data 0x237fb55/0x2429000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:41.712132+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96813056 unmapped: 36642816 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 981569 data_alloc: 285212672 data_used: 851968
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:42.712290+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96813056 unmapped: 36642816 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:43.712465+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96813056 unmapped: 36642816 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b9663000/0x0/0x1bfc00000, data 0x237fb55/0x2429000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:44.712606+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 96829440 unmapped: 36626432 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:45.712769+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.583911896s of 10.766115189s, submitted: 60
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 97501184 unmapped: 35954688 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:46.712941+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 97304576 unmapped: 36151296 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1039485 data_alloc: 285212672 data_used: 864256
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b8ece000/0x0/0x1bfc00000, data 0x2b10b55/0x2bba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:47.713097+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 97583104 unmapped: 35872768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b8ec4000/0x0/0x1bfc00000, data 0x2b19b55/0x2bc3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:48.713290+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 97845248 unmapped: 35610624 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b8e90000/0x0/0x1bfc00000, data 0x2b45b55/0x2bef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:49.713468+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98230272 unmapped: 35225600 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:50.713676+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98230272 unmapped: 35225600 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b8e90000/0x0/0x1bfc00000, data 0x2b45b55/0x2bef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:51.713857+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98295808 unmapped: 35160064 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1047897 data_alloc: 285212672 data_used: 864256
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:52.714068+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98295808 unmapped: 35160064 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:53.714287+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98304000 unmapped: 35151872 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:54.714527+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98304000 unmapped: 35151872 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:55.714777+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98304000 unmapped: 35151872 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b8e9c000/0x0/0x1bfc00000, data 0x2b48b55/0x2bf2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:56.714951+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.870033264s of 11.132240295s, submitted: 68
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 ms_handle_reset con 0x55bf8eef6400 session 0x55bf91691c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 98312192 unmapped: 35143680 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 922943 data_alloc: 285212672 data_used: 327680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:57.715111+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 ms_handle_reset con 0x55bf919d8c00 session 0x55bf904672c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:58.715341+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:51:59.715544+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:00.715735+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:01.715904+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b9e3d000/0x0/0x1bfc00000, data 0x1ba8b22/0x1c50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 919221 data_alloc: 285212672 data_used: 323584
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:02.716077+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:03.716248+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:04.716445+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:05.716674+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b9e3d000/0x0/0x1bfc00000, data 0x1ba8b22/0x1c50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:06.716877+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 919221 data_alloc: 285212672 data_used: 323584
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:07.717047+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:08.717344+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:09.717483+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99393536 unmapped: 34062336 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:10.717662+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 heartbeat osd_stat(store_statfs(0x1b9e3d000/0x0/0x1bfc00000, data 0x1ba8b22/0x1c50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99401728 unmapped: 34054144 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:11.717816+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99401728 unmapped: 34054144 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 919221 data_alloc: 285212672 data_used: 323584
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:12.717917+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99401728 unmapped: 34054144 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:13.718061+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99401728 unmapped: 34054144 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.600391388s of 17.762466431s, submitted: 37
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:14.718203+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99467264 unmapped: 33988608 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 110 ms_handle_reset con 0x55bf91db6c00 session 0x55bf8f8fd2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 110 heartbeat osd_stat(store_statfs(0x1b9e3d000/0x0/0x1bfc00000, data 0x1ba8b45/0x1c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:15.718338+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99549184 unmapped: 33906688 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:16.718531+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99573760 unmapped: 33882112 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b9e39000/0x0/0x1bfc00000, data 0x1baaead/0x1c55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 111 ms_handle_reset con 0x55bf91f9d400 session 0x55bf8f8fd4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 927671 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:17.718697+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99606528 unmapped: 33849344 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:18.718928+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99606528 unmapped: 33849344 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b9e34000/0x0/0x1bfc00000, data 0x1bad246/0x1c58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:19.719116+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99614720 unmapped: 33841152 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:20.719280+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99614720 unmapped: 33841152 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:21.719477+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 111 heartbeat osd_stat(store_statfs(0x1b9e34000/0x0/0x1bfc00000, data 0x1bad246/0x1c58000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99663872 unmapped: 33792000 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 927671 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:22.719692+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:23.719822+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:24.719988+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:25.720178+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:26.720338+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:27.720513+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:28.720703+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:29.720844+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99696640 unmapped: 33759232 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:30.720991+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99696640 unmapped: 33759232 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:31.721159+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99696640 unmapped: 33759232 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:32.721287+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99704832 unmapped: 33751040 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:33.721409+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99704832 unmapped: 33751040 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:34.722384+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99704832 unmapped: 33751040 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:35.739179+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99704832 unmapped: 33751040 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:36.739370+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99704832 unmapped: 33751040 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:37.739525+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99704832 unmapped: 33751040 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:38.739713+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99713024 unmapped: 33742848 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:39.739852+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99713024 unmapped: 33742848 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:40.739997+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99745792 unmapped: 33710080 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:41.740168+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99745792 unmapped: 33710080 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:42.740339+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99745792 unmapped: 33710080 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:43.740504+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99540992 unmapped: 33914880 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:44.740677+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99540992 unmapped: 33914880 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:45.740824+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99540992 unmapped: 33914880 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:46.740989+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99540992 unmapped: 33914880 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:47.741154+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99540992 unmapped: 33914880 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:48.741291+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99557376 unmapped: 33898496 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:49.741469+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99573760 unmapped: 33882112 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:50.741669+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:51.741848+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:52.742036+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:53.742216+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:54.742379+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:55.742584+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:56.742713+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:57.742891+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99581952 unmapped: 33873920 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:58.743108+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99590144 unmapped: 33865728 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:52:59.743280+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99590144 unmapped: 33865728 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:00.743448+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99622912 unmapped: 33832960 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:01.743700+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99622912 unmapped: 33832960 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:02.743878+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99622912 unmapped: 33832960 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:03.744027+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99622912 unmapped: 33832960 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:04.744194+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99622912 unmapped: 33832960 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:05.744340+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99622912 unmapped: 33832960 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:06.744506+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:07.744710+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:08.744948+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:09.745155+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:10.745350+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:11.745473+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:12.745679+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 929571 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:13.745853+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:14.746030+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99631104 unmapped: 33824768 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf494/0x1c5c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:15.746164+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99639296 unmapped: 33816576 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:16.746330+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 62.297084808s of 62.450031281s, submitted: 46
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99639296 unmapped: 33816576 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:17.746480+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 930626 data_alloc: 285212672 data_used: 348160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 113 heartbeat osd_stat(store_statfs(0x1b9e31000/0x0/0x1bfc00000, data 0x1baf4b7/0x1c5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:18.746679+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:19.746819+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99647488 unmapped: 33808384 heap: 133455872 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 113 heartbeat osd_stat(store_statfs(0x1b962d000/0x0/0x1bfc00000, data 0x23b181f/0x2461000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:20.746925+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99713024 unmapped: 42139648 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:21.747022+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99713024 unmapped: 42139648 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:22.747185+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1047534 data_alloc: 285212672 data_used: 372736
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b8e28000/0x0/0x1bfc00000, data 0x2bb3b87/0x2c65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99753984 unmapped: 42098688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 114 heartbeat osd_stat(store_statfs(0x1b8e28000/0x0/0x1bfc00000, data 0x2bb3b87/0x2c65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 114 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 114 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 114 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:23.747362+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99688448 unmapped: 42164224 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:24.747543+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99688448 unmapped: 42164224 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 ms_handle_reset con 0x55bf91f9d400 session 0x55bf916a0960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:25.747696+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 heartbeat osd_stat(store_statfs(0x1b7622000/0x0/0x1bfc00000, data 0x43b5f53/0x446a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 115 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99696640 unmapped: 42156032 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 117 ms_handle_reset con 0x55bf8eef6400 session 0x55bf916a0d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:26.747799+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 117 ms_handle_reset con 0x55bf8eef6000 session 0x55bf8f5b2000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.823126793s of 10.106919289s, submitted: 52
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99803136 unmapped: 42049536 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:27.747955+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 968208 data_alloc: 285212672 data_used: 372736
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 ms_handle_reset con 0x55bf919d8c00 session 0x55bf8f5b30e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99221504 unmapped: 42631168 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 ms_handle_reset con 0x55bf91db6c00 session 0x55bf8f5b3c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:28.748187+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99221504 unmapped: 42631168 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:29.748396+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99221504 unmapped: 42631168 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 ms_handle_reset con 0x55bf91db6c00 session 0x55bf8f5b25a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:30.748508+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:34.088720+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99262464 unmapped: 42590208 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965783 data_alloc: 285212672 data_used: 372736
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 heartbeat osd_stat(store_statfs(0x1b9e12000/0x0/0x1bfc00000, data 0x1bbed51/0x1c7c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:35.088849+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99278848 unmapped: 42573824 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 heartbeat osd_stat(store_statfs(0x1b9e11000/0x0/0x1bfc00000, data 0x1bbed6f/0x1c7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:36.088962+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99278848 unmapped: 42573824 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 ms_handle_reset con 0x55bf8eef6400 session 0x55bf936ac000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 ms_handle_reset con 0x55bf8eef6000 session 0x55bf90c9cd20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:37.089096+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 99368960 unmapped: 42483712 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 119 handle_osd_map epochs [119,120], i have 119, src has [1,120]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.796310425s of 10.092370033s, submitted: 81
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 120 ms_handle_reset con 0x55bf919d8c00 session 0x55bf8f7fa5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 120 ms_handle_reset con 0x55bf91f9d400 session 0x55bf8fa49680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:38.089237+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 121 heartbeat osd_stat(store_statfs(0x1b9e14000/0x0/0x1bfc00000, data 0x1bbecf1/0x1c7a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100466688 unmapped: 41385984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 121 ms_handle_reset con 0x55bf8eef6400 session 0x55bf936ac3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 121 handle_osd_map epochs [121,122], i have 121, src has [1,122]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 122 ms_handle_reset con 0x55bf8eef6000 session 0x55bf8f52be00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:39.089381+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100622336 unmapped: 41230336 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 976338 data_alloc: 285212672 data_used: 385024
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:40.089526+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100622336 unmapped: 41230336 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:41.089723+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100630528 unmapped: 41222144 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:42.089845+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100630528 unmapped: 41222144 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 122 heartbeat osd_stat(store_statfs(0x1b9e06000/0x0/0x1bfc00000, data 0x1bc564c/0x1c84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:43.090004+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:44.090162+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 977788 data_alloc: 285212672 data_used: 385024
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:45.090314+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:46.090481+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:47.090669+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 123 heartbeat osd_stat(store_statfs(0x1b9a05000/0x0/0x1bfc00000, data 0x1bc789a/0x1c88000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 123 heartbeat osd_stat(store_statfs(0x1b9a05000/0x0/0x1bfc00000, data 0x1bc789a/0x1c88000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:48.090811+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:49.091019+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100638720 unmapped: 41213952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 977788 data_alloc: 285212672 data_used: 385024
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.806887627s of 12.003597260s, submitted: 78
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 123 ms_handle_reset con 0x55bf919d8c00 session 0x55bf8f8fde00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:50.091157+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100663296 unmapped: 41189376 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:51.091347+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100679680 unmapped: 41172992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:52.091525+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100679680 unmapped: 41172992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 123 heartbeat osd_stat(store_statfs(0x1b9a06000/0x0/0x1bfc00000, data 0x1bc789a/0x1c88000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:53.091720+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100679680 unmapped: 41172992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:54.091889+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100679680 unmapped: 41172992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 976908 data_alloc: 285212672 data_used: 385024
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:55.092037+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100679680 unmapped: 41172992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:56.092221+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100679680 unmapped: 41172992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 125 ms_handle_reset con 0x55bf91db6c00 session 0x55bf91321860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:57.092395+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100761600 unmapped: 41091072 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 126 heartbeat osd_stat(store_statfs(0x1b99f9000/0x0/0x1bfc00000, data 0x1bccce2/0x1c93000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 126 ms_handle_reset con 0x55bf91f9d800 session 0x55bf936ac5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:58.092556+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100843520 unmapped: 41009152 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 126 heartbeat osd_stat(store_statfs(0x1b99f3000/0x0/0x1bfc00000, data 0x1bcf05b/0x1c98000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:53:59.092717+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100851712 unmapped: 41000960 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1000358 data_alloc: 285212672 data_used: 409600
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 127 handle_osd_map epochs [127,128], i have 127, src has [1,128]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.604923248s of 10.743050575s, submitted: 36
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:00.092872+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 128 heartbeat osd_stat(store_statfs(0x1b99f0000/0x0/0x1bfc00000, data 0x1bd13e6/0x1c9d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100884480 unmapped: 40968192 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91f9d800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 129 handle_osd_map epochs [129,129], i have 129, src has [1,129]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 129 ms_handle_reset con 0x55bf91f9d800 session 0x55bf9401f680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:01.093033+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100958208 unmapped: 40894464 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:02.093201+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100958208 unmapped: 40894464 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 130 handle_osd_map epochs [131,131], i have 130, src has [1,131]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 130 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:03.093340+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 131 heartbeat osd_stat(store_statfs(0x1b99e1000/0x0/0x1bfc00000, data 0x1bd7eb4/0x1cac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101040128 unmapped: 40812544 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 131 ms_handle_reset con 0x55bf8eef6000 session 0x55bf9401fa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf919d8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:04.093483+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101113856 unmapped: 40738816 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1022469 data_alloc: 285212672 data_used: 409600
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 132 ms_handle_reset con 0x55bf919d8c00 session 0x55bf9401fc20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 132 ms_handle_reset con 0x55bf8eef6400 session 0x55bf936ac780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:05.093686+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101072896 unmapped: 40779776 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 133 ms_handle_reset con 0x55bf91db6c00 session 0x55bf9402fa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:06.093892+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101269504 unmapped: 40583168 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 134 ms_handle_reset con 0x55bf8eef7000 session 0x55bf936ac960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 134 ms_handle_reset con 0x55bf91db6c00 session 0x55bf9402fc20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:07.094062+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 134 heartbeat osd_stat(store_statfs(0x1b99d4000/0x0/0x1bfc00000, data 0x1be0d39/0x1cb7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 134 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 134 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101040128 unmapped: 40812544 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 135 ms_handle_reset con 0x55bf8eef6000 session 0x55bf8ffef2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91d86400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:08.094205+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 136 ms_handle_reset con 0x55bf8eef7400 session 0x55bf91e4c3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100827136 unmapped: 41025536 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:09.094347+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 ms_handle_reset con 0x55bf91d86400 session 0x55bf91e4c000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100835328 unmapped: 41017344 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1034006 data_alloc: 285212672 data_used: 434176
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 handle_osd_map epochs [136,137], i have 137, src has [1,137]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 handle_osd_map epochs [136,137], i have 137, src has [1,137]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 ms_handle_reset con 0x55bf8eef6000 session 0x55bf91e4de00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:10.094491+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100876288 unmapped: 40976384 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 heartbeat osd_stat(store_statfs(0x1b99cc000/0x0/0x1bfc00000, data 0x1be6b0e/0x1cc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:11.094672+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100884480 unmapped: 40968192 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:12.094873+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100884480 unmapped: 40968192 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.694233894s of 12.607856750s, submitted: 280
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:13.095022+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100909056 unmapped: 40943616 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:14.095135+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100909056 unmapped: 40943616 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1036018 data_alloc: 285212672 data_used: 434176
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:15.095325+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100909056 unmapped: 40943616 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 138 heartbeat osd_stat(store_statfs(0x1b99c9000/0x0/0x1bfc00000, data 0x1be8d8c/0x1cc5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:16.095498+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100917248 unmapped: 40935424 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:17.095684+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100917248 unmapped: 40935424 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets getting new tickets!
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:18.095904+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _finish_auth 0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:18.097303+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100917248 unmapped: 40935424 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 139 ms_handle_reset con 0x55bf8eef7000 session 0x55bf9402ef00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:19.096033+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100941824 unmapped: 40910848 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1040229 data_alloc: 285212672 data_used: 450560
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 139 heartbeat osd_stat(store_statfs(0x1b99c5000/0x0/0x1bfc00000, data 0x1beb0f4/0x1cc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:20.096185+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100941824 unmapped: 40910848 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 139 heartbeat osd_stat(store_statfs(0x1b99c5000/0x0/0x1bfc00000, data 0x1beb0f4/0x1cc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 139 ms_handle_reset con 0x55bf8eef7400 session 0x55bf9169af00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:21.096360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100941824 unmapped: 40910848 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:22.096509+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101040128 unmapped: 40812544 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:23.096695+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.255812645s of 10.431124687s, submitted: 56
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101056512 unmapped: 40796160 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 140 ms_handle_reset con 0x55bf91db6c00 session 0x55bf913210e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 140 heartbeat osd_stat(store_statfs(0x1b99c0000/0x0/0x1bfc00000, data 0x1bed4b0/0x1ccd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,2])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:24.096877+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101097472 unmapped: 40755200 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1046343 data_alloc: 285212672 data_used: 466944
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 47
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:25.096989+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101179392 unmapped: 40673280 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92124800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 140 ms_handle_reset con 0x55bf92124800 session 0x55bf936ee1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:26.098056+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101187584 unmapped: 40665088 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:27.098315+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101236736 unmapped: 40615936 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:28.099779+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101294080 unmapped: 40558592 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 48
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 141 heartbeat osd_stat(store_statfs(0x1b99af000/0x0/0x1bfc00000, data 0x1bfade2/0x1cde000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:29.100082+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100278272 unmapped: 41574400 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1051125 data_alloc: 285212672 data_used: 479232
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:30.100225+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100278272 unmapped: 41574400 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:31.100409+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100278272 unmapped: 41574400 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:32.100598+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100286464 unmapped: 41566208 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:33.100761+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100286464 unmapped: 41566208 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 141 heartbeat osd_stat(store_statfs(0x1b99ae000/0x0/0x1bfc00000, data 0x1bfdd44/0x1ce0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.534376144s of 10.733454704s, submitted: 53
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:34.100917+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100311040 unmapped: 41541632 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1052449 data_alloc: 285212672 data_used: 479232
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:35.101062+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 142 ms_handle_reset con 0x55bf8eef6000 session 0x55bf936ee780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100302848 unmapped: 41549824 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 142 heartbeat osd_stat(store_statfs(0x1b99ae000/0x0/0x1bfc00000, data 0x1bfdd44/0x1ce0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:36.101201+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100311040 unmapped: 41541632 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 143 ms_handle_reset con 0x55bf8eef7000 session 0x55bf936eeb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:37.101353+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 143 ms_handle_reset con 0x55bf8eef7400 session 0x55bf936eed20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100319232 unmapped: 41533440 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:38.101509+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 144 ms_handle_reset con 0x55bf91db6c00 session 0x55bf936eef00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100343808 unmapped: 41508864 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92124c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 144 ms_handle_reset con 0x55bf92124c00 session 0x55bf936ef0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:39.101755+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 144 ms_handle_reset con 0x55bf8eef6000 session 0x55bf936ef4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100343808 unmapped: 41508864 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1067782 data_alloc: 285212672 data_used: 512000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 145 heartbeat osd_stat(store_statfs(0x1b99a1000/0x0/0x1bfc00000, data 0x1c047d0/0x1cec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:40.101937+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100343808 unmapped: 41508864 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 145 ms_handle_reset con 0x55bf8eef7000 session 0x55bf936ef680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef7400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:41.102084+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 145 ms_handle_reset con 0x55bf8eef7400 session 0x55bf936efa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100360192 unmapped: 41492480 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 145 ms_handle_reset con 0x55bf91db6c00 session 0x55bf936efe00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:42.102843+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100392960 unmapped: 41459712 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:43.103069+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100401152 unmapped: 41451520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:44.103225+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100401152 unmapped: 41451520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1069312 data_alloc: 285212672 data_used: 520192
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 146 heartbeat osd_stat(store_statfs(0x1b999a000/0x0/0x1bfc00000, data 0x1c0943e/0x1cf3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.697421074s of 11.007451057s, submitted: 99
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:45.103367+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 146 heartbeat osd_stat(store_statfs(0x1b999a000/0x0/0x1bfc00000, data 0x1c094a0/0x1cf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100401152 unmapped: 41451520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:46.103544+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100401152 unmapped: 41451520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 147 heartbeat osd_stat(store_statfs(0x1b999a000/0x0/0x1bfc00000, data 0x1c094a0/0x1cf4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:47.103704+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 100409344 unmapped: 41443328 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 147 ms_handle_reset con 0x55bf92125400 session 0x55bf9169af00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:48.103866+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 148 handle_osd_map epochs [148,148], i have 148, src has [1,148]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101433344 unmapped: 40419328 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:49.105480+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101449728 unmapped: 40402944 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1078886 data_alloc: 285212672 data_used: 548864
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 148 ms_handle_reset con 0x55bf92125400 session 0x55bf91d252c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 148 heartbeat osd_stat(store_statfs(0x1b9991000/0x0/0x1bfc00000, data 0x1c0dbc4/0x1cfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:50.105950+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101490688 unmapped: 40361984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:51.106087+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101490688 unmapped: 40361984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:52.106268+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101490688 unmapped: 40361984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 148 heartbeat osd_stat(store_statfs(0x1b9993000/0x0/0x1bfc00000, data 0x1c0db62/0x1cfb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:53.106421+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101490688 unmapped: 40361984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b9993000/0x0/0x1bfc00000, data 0x1c0db62/0x1cfb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:54.106551+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101490688 unmapped: 40361984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1080902 data_alloc: 285212672 data_used: 561152
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b998e000/0x0/0x1bfc00000, data 0x1c0fdb0/0x1cff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:55.106903+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101490688 unmapped: 40361984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:56.107152+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.150645256s of 11.425005913s, submitted: 85
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eef6000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101556224 unmapped: 40296448 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:57.107350+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101572608 unmapped: 40280064 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:58.107668+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101572608 unmapped: 40280064 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:54:59.107823+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101572608 unmapped: 40280064 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1084178 data_alloc: 285212672 data_used: 581632
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:00.108064+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101580800 unmapped: 40271872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b9988000/0x0/0x1bfc00000, data 0x1c15ff4/0x1d06000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:01.108312+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b9988000/0x0/0x1bfc00000, data 0x1c15ff4/0x1d06000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101580800 unmapped: 40271872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:02.108417+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101605376 unmapped: 40247296 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:03.108744+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101638144 unmapped: 40214528 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b9985000/0x0/0x1bfc00000, data 0x1c18907/0x1d09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:04.108913+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101638144 unmapped: 40214528 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1083954 data_alloc: 285212672 data_used: 581632
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:05.109133+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101638144 unmapped: 40214528 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:06.109335+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b9984000/0x0/0x1bfc00000, data 0x1c19f48/0x1d0a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101638144 unmapped: 40214528 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.323979378s of 10.401132584s, submitted: 20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:07.109446+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 ms_handle_reset con 0x55bf91db6c00 session 0x55bf936ad860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101695488 unmapped: 40157184 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:08.109530+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101695488 unmapped: 40157184 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:09.109727+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101703680 unmapped: 40148992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1085380 data_alloc: 285212672 data_used: 581632
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:10.109913+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101728256 unmapped: 40124416 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:11.110077+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101728256 unmapped: 40124416 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:12.110256+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 heartbeat osd_stat(store_statfs(0x1b996c000/0x0/0x1bfc00000, data 0x1c32348/0x1d22000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101801984 unmapped: 40050688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 149 handle_osd_map epochs [150,150], i have 149, src has [1,150]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:13.110452+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 101883904 unmapped: 39968768 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:14.110681+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1093910 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 151 handle_osd_map epochs [150,151], i have 151, src has [1,151]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:15.110874+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:16.111136+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.046094894s of 10.306603432s, submitted: 101
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:17.111283+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 151 heartbeat osd_stat(store_statfs(0x1b9956000/0x0/0x1bfc00000, data 0x1c43f44/0x1d37000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:18.111484+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:19.111753+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1096560 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b9952000/0x0/0x1bfc00000, data 0x1c46640/0x1d3b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:20.111919+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9401fc20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:21.112086+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102047744 unmapped: 39804928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:22.112278+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf900f1800 session 0x55bf904663c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102096896 unmapped: 39755776 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf916e8000 session 0x55bf91a8c1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:23.112445+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf90434d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102776832 unmapped: 39075840 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf900f1800 session 0x55bf91643e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:24.112631+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1113570 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102801408 unmapped: 39051264 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf916e8000 session 0x55bf936ee780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:25.112824+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf91db6c00 session 0x55bf936eef00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102916096 unmapped: 38936576 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b9877000/0x0/0x1bfc00000, data 0x1c55349/0x1d4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:26.113180+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102883328 unmapped: 38969344 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.563544273s of 10.004372597s, submitted: 115
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:27.119295+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102891520 unmapped: 38961152 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:28.119593+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 102891520 unmapped: 38961152 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:29.119799+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1105823 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104038400 unmapped: 37814272 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 49
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:30.119930+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104038400 unmapped: 37814272 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:31.120058+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104038400 unmapped: 37814272 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b9926000/0x0/0x1bfc00000, data 0x1c71704/0x1d68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:32.120229+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104169472 unmapped: 37683200 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:33.120443+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b990a000/0x0/0x1bfc00000, data 0x1c8cf0c/0x1d84000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104169472 unmapped: 37683200 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:34.120683+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b9904000/0x0/0x1bfc00000, data 0x1c91fb6/0x1d8a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1106939 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104169472 unmapped: 37683200 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf916e8400 session 0x55bf936ee5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:35.120855+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b9904000/0x0/0x1bfc00000, data 0x1c91fa5/0x1d89000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104218624 unmapped: 37634048 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:36.121057+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104218624 unmapped: 37634048 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.772076607s of 10.001675606s, submitted: 52
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:37.121269+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf8f7faf00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b9907000/0x0/0x1bfc00000, data 0x1c91d7e/0x1d87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104218624 unmapped: 37634048 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98f9000/0x0/0x1bfc00000, data 0x1ca0149/0x1d95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:38.121655+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104218624 unmapped: 37634048 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:39.121893+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98f9000/0x0/0x1bfc00000, data 0x1ca0149/0x1d95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1100803 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104235008 unmapped: 37617664 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:40.122080+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104243200 unmapped: 37609472 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:41.122220+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104243200 unmapped: 37609472 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:42.122450+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104251392 unmapped: 37601280 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:43.122580+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98ec000/0x0/0x1bfc00000, data 0x1cada78/0x1da2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104251392 unmapped: 37601280 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:44.122772+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1103315 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104251392 unmapped: 37601280 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:45.122913+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104251392 unmapped: 37601280 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:46.123112+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104259584 unmapped: 37593088 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.895606041s of 10.000029564s, submitted: 28
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98e0000/0x0/0x1bfc00000, data 0x1cb90da/0x1dae000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:47.123264+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104267776 unmapped: 37584896 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf900f1800 session 0x55bf913e1c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:48.123437+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104267776 unmapped: 37584896 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf916e8000 session 0x55bf90c9d2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:49.123618+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98dd000/0x0/0x1bfc00000, data 0x1cbc44a/0x1db1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1106285 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104275968 unmapped: 37576704 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf91db6c00 session 0x55bf936ee1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:50.123786+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 ms_handle_reset con 0x55bf916e8800 session 0x55bf91643860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104333312 unmapped: 37519360 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:51.123969+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104333312 unmapped: 37519360 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:52.124238+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104423424 unmapped: 37429248 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98cf000/0x0/0x1bfc00000, data 0x1cca953/0x1dbf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:53.124407+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104448000 unmapped: 37404672 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:54.124536+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1105556 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 104448000 unmapped: 37404672 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:55.124712+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98c5000/0x0/0x1bfc00000, data 0x1cd4a8b/0x1dc9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105496576 unmapped: 36356096 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:56.124877+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105447424 unmapped: 36405248 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.815505981s of 10.003674507s, submitted: 49
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:57.125060+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105455616 unmapped: 36397056 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:58.125224+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105455616 unmapped: 36397056 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:55:59.125406+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98bb000/0x0/0x1bfc00000, data 0x1cdebfc/0x1dd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1107120 data_alloc: 285212672 data_used: 593920
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105472000 unmapped: 36380672 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:00.125536+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105496576 unmapped: 36356096 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:01.125704+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105496576 unmapped: 36356096 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:02.125849+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105512960 unmapped: 36339712 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:03.126033+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98a8000/0x0/0x1bfc00000, data 0x1ceeee5/0x1de6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105521152 unmapped: 36331520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 heartbeat osd_stat(store_statfs(0x1b98a6000/0x0/0x1bfc00000, data 0x1cf1b7a/0x1de8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 152 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:04.126162+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 153 heartbeat osd_stat(store_statfs(0x1b98a6000/0x0/0x1bfc00000, data 0x1cf1b7a/0x1de8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1119072 data_alloc: 285212672 data_used: 606208
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105529344 unmapped: 36323328 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 154 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf916914a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:05.126335+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 105553920 unmapped: 36298752 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 155 handle_osd_map epochs [155,155], i have 155, src has [1,155]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 155 ms_handle_reset con 0x55bf916e8000 session 0x55bf904670e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:06.126451+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 155 ms_handle_reset con 0x55bf91db6c00 session 0x55bf9401e3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106635264 unmapped: 35217408 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.777858734s of 10.000617027s, submitted: 77
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 156 ms_handle_reset con 0x55bf900f1800 session 0x55bf937901e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:07.126577+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 156 ms_handle_reset con 0x55bf916e8c00 session 0x55bf93790d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106659840 unmapped: 35192832 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:08.126814+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 156 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf93790f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 156 heartbeat osd_stat(store_statfs(0x1b9888000/0x0/0x1bfc00000, data 0x1d06f90/0x1e06000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106692608 unmapped: 35160064 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:09.127010+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 157 handle_osd_map epochs [157,157], i have 157, src has [1,157]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 157 ms_handle_reset con 0x55bf900f1800 session 0x55bf937912c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 157 ms_handle_reset con 0x55bf916e8000 session 0x55bf94d514a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1135412 data_alloc: 285212672 data_used: 618496
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106790912 unmapped: 35061760 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:10.127193+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 158 ms_handle_reset con 0x55bf91db6c00 session 0x55bf94d51680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106823680 unmapped: 35028992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:11.127337+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106823680 unmapped: 35028992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:12.127480+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 158 handle_osd_map epochs [158,159], i have 158, src has [1,159]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106840064 unmapped: 35012608 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:13.127611+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 heartbeat osd_stat(store_statfs(0x1b9874000/0x0/0x1bfc00000, data 0x1d1af4b/0x1e1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106856448 unmapped: 34996224 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:14.127783+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1142828 data_alloc: 285212672 data_used: 630784
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106856448 unmapped: 34996224 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:15.127953+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 ms_handle_reset con 0x55bf916e9000 session 0x55bf94d51a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106864640 unmapped: 34988032 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:16.128057+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf94d51c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 ms_handle_reset con 0x55bf900f1800 session 0x55bf91311e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106864640 unmapped: 34988032 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 ms_handle_reset con 0x55bf916e8000 session 0x55bf91311c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.408397675s of 10.001291275s, submitted: 197
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:17.128193+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 ms_handle_reset con 0x55bf91db6c00 session 0x55bf8f5b3c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 heartbeat osd_stat(store_statfs(0x1b9869000/0x0/0x1bfc00000, data 0x1d2104a/0x1e25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 106921984 unmapped: 34930688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:18.128304+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf916e9400 session 0x55bf904325a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf90c9da40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111484928 unmapped: 30367744 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf916e8000 session 0x55bf9169b2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf900f1800 session 0x55bf91644f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:19.128465+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf91db6c00 session 0x55bf91311c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1330121 data_alloc: 285212672 data_used: 643072
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 112058368 unmapped: 29794304 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf916e9800 session 0x55bf90c9d2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf94d51a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:20.128616+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 heartbeat osd_stat(store_statfs(0x1b8a72000/0x0/0x1bfc00000, data 0x271617d/0x281b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 heartbeat osd_stat(store_statfs(0x1b7f01000/0x0/0x1bfc00000, data 0x3288031/0x338d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 108281856 unmapped: 33570816 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:21.128734+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 ms_handle_reset con 0x55bf900f1800 session 0x55bf94d50000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 108314624 unmapped: 33538048 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 160 handle_osd_map epochs [160,161], i have 160, src has [1,161]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:22.128881+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf916e8000 session 0x55bf93790d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf91db6c00 session 0x55bf9169ab40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 heartbeat osd_stat(store_statfs(0x1b875e000/0x0/0x1bfc00000, data 0x28a1f7e/0x29a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109420544 unmapped: 32432128 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:23.129047+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf916e9c00 session 0x55bf9169a1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9401ed20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf900f1800 session 0x55bf946383c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109543424 unmapped: 32309248 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:24.129181+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e8000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf916e9c00 session 0x55bf92355a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1279059 data_alloc: 285212672 data_used: 655360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 ms_handle_reset con 0x55bf91db6c00 session 0x55bf94d50d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109600768 unmapped: 32251904 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 162 ms_handle_reset con 0x55bf916e8000 session 0x55bf946381e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:25.129315+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 162 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9401ef00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109608960 unmapped: 32243712 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:26.129440+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 ms_handle_reset con 0x55bf900f1800 session 0x55bf94638960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 ms_handle_reset con 0x55bf916e9c00 session 0x55bf94638b40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 ms_handle_reset con 0x55bf91db6c00 session 0x55bf94638f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 ms_handle_reset con 0x55bf94688000 session 0x55bf936ef680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109584384 unmapped: 32268288 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.613626480s of 10.001016617s, submitted: 380
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 164 ms_handle_reset con 0x55bf94688000 session 0x55bf946390e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 164 heartbeat osd_stat(store_statfs(0x1b88b5000/0x0/0x1bfc00000, data 0x28cadb0/0x29d8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:27.129608+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 164 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf90c8cd20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109641728 unmapped: 32210944 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:28.129806+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf916e9c00 session 0x55bf91690000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf900f1800 session 0x55bf923552c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db6c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109674496 unmapped: 32178176 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf91db6c00 session 0x55bf9310de00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:29.130020+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf94d51e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf900f1800 session 0x55bf93790d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1289609 data_alloc: 285212672 data_used: 667648
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109674496 unmapped: 32178176 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:30.130136+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf916e9c00 session 0x55bf91a91e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 ms_handle_reset con 0x55bf94688000 session 0x55bf936ac000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109756416 unmapped: 32096256 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:31.130364+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109756416 unmapped: 32096256 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:32.130517+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109895680 unmapped: 31956992 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 heartbeat osd_stat(store_statfs(0x1b887e000/0x0/0x1bfc00000, data 0x2900fbd/0x2a10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 165 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:33.130686+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109813760 unmapped: 32038912 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:34.130916+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1307429 data_alloc: 285212672 data_used: 679936
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109813760 unmapped: 32038912 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:35.131451+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109821952 unmapped: 32030720 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:36.131619+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109871104 unmapped: 31981568 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:37.131820+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 167 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.896203041s of 10.496785164s, submitted: 186
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 109977600 unmapped: 31875072 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:38.132047+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 169 handle_osd_map epochs [168,169], i have 169, src has [1,169]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 169 ms_handle_reset con 0x55bf94688400 session 0x55bf9466a3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 169 heartbeat osd_stat(store_statfs(0x1b882b000/0x0/0x1bfc00000, data 0x294b02e/0x2a62000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 110043136 unmapped: 31809536 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:39.132251+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 169 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9466a5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1316455 data_alloc: 285212672 data_used: 696320
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111099904 unmapped: 30752768 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:40.132572+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111157248 unmapped: 30695424 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:41.133319+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 169 heartbeat osd_stat(store_statfs(0x1b87fd000/0x0/0x1bfc00000, data 0x297ad6b/0x2a90000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111321088 unmapped: 30531584 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:42.133687+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 169 handle_osd_map epochs [169,170], i have 169, src has [1,170]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 170 ms_handle_reset con 0x55bf900f1800 session 0x55bf9466a960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 170 handle_osd_map epochs [170,171], i have 170, src has [1,171]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111427584 unmapped: 30425088 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:43.133817+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 171 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 171 ms_handle_reset con 0x55bf916e9c00 session 0x55bf9466ab40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111501312 unmapped: 30351360 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:44.133962+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 171 heartbeat osd_stat(store_statfs(0x1b87a3000/0x0/0x1bfc00000, data 0x29cb117/0x2aea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 171 ms_handle_reset con 0x55bf94688000 session 0x55bf9466ad20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1336177 data_alloc: 285212672 data_used: 708608
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111648768 unmapped: 30203904 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:45.134148+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 172 ms_handle_reset con 0x55bf94688800 session 0x55bf9466af00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 172 handle_osd_map epochs [172,172], i have 172, src has [1,172]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 172 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9466b4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111730688 unmapped: 30121984 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:46.134315+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 111878144 unmapped: 29974528 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:47.134533+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 173 handle_osd_map epochs [173,174], i have 173, src has [1,174]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 174 ms_handle_reset con 0x55bf900f1800 session 0x55bf9466b860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.389639854s of 10.133373260s, submitted: 255
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 113188864 unmapped: 28663808 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:48.134719+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 175 heartbeat osd_stat(store_statfs(0x1b8765000/0x0/0x1bfc00000, data 0x2a06395/0x2b28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 175 ms_handle_reset con 0x55bf916e9c00 session 0x55bf9466ba40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 113205248 unmapped: 28647424 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:49.134977+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 175 ms_handle_reset con 0x55bf94688000 session 0x55bf9466bc20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1357486 data_alloc: 285212672 data_used: 720896
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 113205248 unmapped: 28647424 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:50.135136+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 114335744 unmapped: 27516928 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 176 heartbeat osd_stat(store_statfs(0x1b75a7000/0x0/0x1bfc00000, data 0x2a22c45/0x2b47000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:51.135346+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 113442816 unmapped: 28409856 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:52.135544+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94689000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 177 ms_handle_reset con 0x55bf94689000 session 0x55bf94638960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 177 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 178 ms_handle_reset con 0x55bf94688c00 session 0x55bf9466be00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 113491968 unmapped: 28360704 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:53.135727+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 178 ms_handle_reset con 0x55bf92125400 session 0x55bf91a8c1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 178 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9463d2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 178 ms_handle_reset con 0x55bf900f1800 session 0x55bf9242a000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 178 handle_osd_map epochs [178,179], i have 178, src has [1,179]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116154368 unmapped: 25698304 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:54.135887+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 179 heartbeat osd_stat(store_statfs(0x1b6c30000/0x0/0x1bfc00000, data 0x3394a8d/0x34bd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1449918 data_alloc: 285212672 data_used: 745472
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 50
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116252672 unmapped: 25600000 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:55.136098+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf916e9c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 179 ms_handle_reset con 0x55bf916e9c00 session 0x55bf9242a3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 179 handle_osd_map epochs [179,180], i have 179, src has [1,180]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116252672 unmapped: 25600000 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:56.136297+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 180 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9242a780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 180 ms_handle_reset con 0x55bf92125400 session 0x55bf9463de00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 180 ms_handle_reset con 0x55bf900f1800 session 0x55bf9242ad20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 180 heartbeat osd_stat(store_statfs(0x1b6c14000/0x0/0x1bfc00000, data 0x33af12e/0x34da000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 180 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116400128 unmapped: 25452544 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:57.136453+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.975796700s of 10.010621071s, submitted: 454
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 181 ms_handle_reset con 0x55bf94688c00 session 0x55bf93f101e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 181 handle_osd_map epochs [181,182], i have 181, src has [1,182]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116416512 unmapped: 25436160 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:58.136674+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 182 heartbeat osd_stat(store_statfs(0x1b73ad000/0x0/0x1bfc00000, data 0x2a86fbf/0x2bb4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 182 ms_handle_reset con 0x55bf94688000 session 0x55bf93f10960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117497856 unmapped: 24354816 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:56:59.136890+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1388184 data_alloc: 285212672 data_used: 757760
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117579776 unmapped: 24272896 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:00.137066+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117628928 unmapped: 24223744 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:01.137221+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 183 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf93f10b40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117628928 unmapped: 24223744 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:02.137396+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 185 handle_osd_map epochs [184,185], i have 185, src has [1,185]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 185 ms_handle_reset con 0x55bf900f1800 session 0x55bf93f10d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117719040 unmapped: 24133632 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:03.137569+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 185 ms_handle_reset con 0x55bf92125400 session 0x55bf93f110e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 185 heartbeat osd_stat(store_statfs(0x1b74f5000/0x0/0x1bfc00000, data 0x2ac5464/0x2bf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117719040 unmapped: 24133632 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:04.137778+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 185 handle_osd_map epochs [185,186], i have 185, src has [1,186]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1402420 data_alloc: 285212672 data_used: 770048
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117784576 unmapped: 24068096 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:05.137960+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 186 ms_handle_reset con 0x55bf94688c00 session 0x55bf93f112c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116932608 unmapped: 24920064 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:06.138091+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 186 heartbeat osd_stat(store_statfs(0x1b74f3000/0x0/0x1bfc00000, data 0x2ac7840/0x2bfb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116932608 unmapped: 24920064 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:07.138269+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 186 handle_osd_map epochs [186,187], i have 186, src has [1,187]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.849292755s of 10.256628990s, submitted: 132
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:08.138486+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e6000/0x0/0x1bfc00000, data 0x2ad48e9/0x2c08000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:09.138720+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e2000/0x0/0x1bfc00000, data 0x2ad6b77/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1404006 data_alloc: 285212672 data_used: 786432
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:10.138878+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:11.139034+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:12.139217+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e2000/0x0/0x1bfc00000, data 0x2ad6b77/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:13.139396+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e2000/0x0/0x1bfc00000, data 0x2ad6b77/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:14.139561+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1404006 data_alloc: 285212672 data_used: 786432
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:15.139753+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e2000/0x0/0x1bfc00000, data 0x2ad6b77/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:16.139894+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:17.140048+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:18.140182+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:19.140409+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1404006 data_alloc: 285212672 data_used: 786432
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e2000/0x0/0x1bfc00000, data 0x2ad6b77/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:20.140593+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 heartbeat osd_stat(store_statfs(0x1b74e2000/0x0/0x1bfc00000, data 0x2ad6b77/0x2c0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:21.140717+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.215974808s of 13.242103577s, submitted: 15
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:22.140916+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116948992 unmapped: 24903680 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:23.141154+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116957184 unmapped: 24895488 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94689400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 188 ms_handle_reset con 0x55bf94689400 session 0x55bf93f11680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:24.141315+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 116940800 unmapped: 24911872 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1414024 data_alloc: 285212672 data_used: 798720
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:25.141487+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117014528 unmapped: 24838144 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 188 heartbeat osd_stat(store_statfs(0x1b74c1000/0x0/0x1bfc00000, data 0x2af444f/0x2c2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 188 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 188 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 188 handle_osd_map epochs [189,189], i have 189, src has [1,189]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:26.141705+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117121024 unmapped: 24731648 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 189 heartbeat osd_stat(store_statfs(0x1b74ba000/0x0/0x1bfc00000, data 0x2af78d9/0x2c33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 189 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf93f11860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:27.141872+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117145600 unmapped: 24707072 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:28.142075+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117145600 unmapped: 24707072 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:29.142308+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117219328 unmapped: 24633344 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1420352 data_alloc: 285212672 data_used: 811008
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:30.142592+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117358592 unmapped: 24494080 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 189 heartbeat osd_stat(store_statfs(0x1b7480000/0x0/0x1bfc00000, data 0x2b32baa/0x2c6e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:31.142749+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 117358592 unmapped: 24494080 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.204909325s of 10.438373566s, submitted: 60
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:32.142954+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118317056 unmapped: 23535616 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 189 heartbeat osd_stat(store_statfs(0x1b7459000/0x0/0x1bfc00000, data 0x2b58993/0x2c95000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:33.155684+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118472704 unmapped: 23379968 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 ms_handle_reset con 0x55bf900f1800 session 0x55bf93f11c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:34.155857+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118480896 unmapped: 23371776 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 ms_handle_reset con 0x55bf92125400 session 0x55bf9466a3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1428850 data_alloc: 285212672 data_used: 823296
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:35.156066+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118505472 unmapped: 23347200 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:36.156245+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118636544 unmapped: 23216128 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:37.156486+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118661120 unmapped: 23191552 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 heartbeat osd_stat(store_statfs(0x1b7429000/0x0/0x1bfc00000, data 0x2b873eb/0x2cc5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 ms_handle_reset con 0x55bf94688c00 session 0x55bf8f8fde00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:38.156746+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118669312 unmapped: 23183360 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94689800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:39.156925+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 ms_handle_reset con 0x55bf94689800 session 0x55bf936ac3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118833152 unmapped: 23019520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1430163 data_alloc: 285212672 data_used: 823296
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:40.157163+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118833152 unmapped: 23019520 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 191 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf8fa49e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:41.158270+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118841344 unmapped: 23011328 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.075431824s of 10.413371086s, submitted: 109
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:42.158451+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118841344 unmapped: 23011328 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 192 ms_handle_reset con 0x55bf900f1800 session 0x55bf91690960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:43.158611+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118857728 unmapped: 22994944 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 192 heartbeat osd_stat(store_statfs(0x1b73fc000/0x0/0x1bfc00000, data 0x2bafbf3/0x2cf1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 192 ms_handle_reset con 0x55bf92125400 session 0x55bf91690780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:44.158726+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 193 ms_handle_reset con 0x55bf94688c00 session 0x55bf8f52a1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118865920 unmapped: 22986752 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 193 heartbeat osd_stat(store_statfs(0x1b73f7000/0x0/0x1bfc00000, data 0x2bb549b/0x2cf7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1441611 data_alloc: 285212672 data_used: 847872
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:45.158867+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 118964224 unmapped: 22888448 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94689800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 194 ms_handle_reset con 0x55bf94689800 session 0x55bf91a91e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:46.159085+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 120037376 unmapped: 21815296 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 194 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf93790d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:47.159267+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 120045568 unmapped: 21807104 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:48.159394+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 ms_handle_reset con 0x55bf900f1800 session 0x55bf94d51e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121118720 unmapped: 20733952 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 ms_handle_reset con 0x55bf92125400 session 0x55bf904592c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:49.161738+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121151488 unmapped: 20701184 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1453316 data_alloc: 285212672 data_used: 860160
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:50.161891+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 ms_handle_reset con 0x55bf94688c00 session 0x55bf935de960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121151488 unmapped: 20701184 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94689c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 ms_handle_reset con 0x55bf94689c00 session 0x55bf935deb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 heartbeat osd_stat(store_statfs(0x1b5e3d000/0x0/0x1bfc00000, data 0x2bc9bd7/0x2d10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf935def00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:51.162045+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121151488 unmapped: 20701184 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.703129768s of 10.000057220s, submitted: 96
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 heartbeat osd_stat(store_statfs(0x1b5e3d000/0x0/0x1bfc00000, data 0x2bc9bd7/0x2d10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:52.162203+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121151488 unmapped: 20701184 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 ms_handle_reset con 0x55bf900f1800 session 0x55bf935df0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:53.162433+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121184256 unmapped: 20668416 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:54.162696+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121184256 unmapped: 20668416 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 196 ms_handle_reset con 0x55bf92125400 session 0x55bf935df4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1455957 data_alloc: 285212672 data_used: 876544
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:55.162853+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121192448 unmapped: 20660224 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 196 ms_handle_reset con 0x55bf94688c00 session 0x55bf935df860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf8eb92000 session 0x55bf935dfa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:56.163014+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121176064 unmapped: 20676608 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf8eb92000 session 0x55bf935dfe00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 heartbeat osd_stat(store_statfs(0x1b5e34000/0x0/0x1bfc00000, data 0x2bce1b8/0x2d19000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:57.163215+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf937f01e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121200640 unmapped: 20652032 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf900f1800 session 0x55bf937f03c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf92125400 session 0x55bf937f0b40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:58.163387+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121200640 unmapped: 20652032 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf94688c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf94688c00 session 0x55bf937f0d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:57:59.163582+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121208832 unmapped: 20643840 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf8eb92000 session 0x55bf937f10e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:00.163779+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1463137 data_alloc: 285212672 data_used: 888832
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121233408 unmapped: 20619264 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:01.163949+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf937f1680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121233408 unmapped: 20619264 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.648187637s of 10.000524521s, submitted: 103
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:02.164106+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121233408 unmapped: 20619264 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 heartbeat osd_stat(store_statfs(0x1b5e36000/0x0/0x1bfc00000, data 0x2bce1a8/0x2d18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 197 handle_osd_map epochs [197,198], i have 197, src has [1,198]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 ms_handle_reset con 0x55bf900f1800 session 0x55bf937f1860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:03.164276+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121257984 unmapped: 20594688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:04.164440+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121257984 unmapped: 20594688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:05.164689+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1467558 data_alloc: 285212672 data_used: 901120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121257984 unmapped: 20594688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:06.164856+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121257984 unmapped: 20594688 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:07.165008+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b5e33000/0x0/0x1bfc00000, data 0x2bd02f9/0x2d1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:08.165360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:09.165740+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:10.165918+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1466678 data_alloc: 285212672 data_used: 901120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:11.166747+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b5e34000/0x0/0x1bfc00000, data 0x2bd02f9/0x2d1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:12.167700+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:13.167879+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121266176 unmapped: 20586496 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.051067352s of 12.164307594s, submitted: 32
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:14.168069+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121274368 unmapped: 20578304 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:15.168389+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1468506 data_alloc: 285212672 data_used: 901120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121274368 unmapped: 20578304 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:16.168942+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121274368 unmapped: 20578304 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 ms_handle_reset con 0x55bf92125400 session 0x55bf935de960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b5e33000/0x0/0x1bfc00000, data 0x2bd030a/0x2d1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:17.169620+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b5e33000/0x0/0x1bfc00000, data 0x2bd030a/0x2d1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121274368 unmapped: 20578304 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b5e32000/0x0/0x1bfc00000, data 0x2bd031a/0x2d1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:18.169808+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121651200 unmapped: 20201472 heap: 141852672 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:19.170066+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 130080768 unmapped: 20168704 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:20.170407+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1636233 data_alloc: 285212672 data_used: 901120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 130154496 unmapped: 20094976 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf917a4800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 ms_handle_reset con 0x55bf917a4800 session 0x55bf935dfa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:21.170829+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 130179072 unmapped: 20070400 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:22.170994+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b3631000/0x0/0x1bfc00000, data 0x53d037c/0x551d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121790464 unmapped: 28459008 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:23.171181+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121741312 unmapped: 28508160 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 ms_handle_reset con 0x55bf8eb92000 session 0x55bf937f12c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:24.171373+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.611006737s of 10.010114670s, submitted: 47
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 ms_handle_reset con 0x55bf900f1800 session 0x55bf9242b680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf937f0000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121298944 unmapped: 28950528 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:25.171665+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2048009 data_alloc: 285212672 data_used: 901120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 heartbeat osd_stat(store_statfs(0x1b1447000/0x0/0x1bfc00000, data 0x75ba32a/0x7707000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 121339904 unmapped: 28909568 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:26.171822+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 129843200 unmapped: 20406272 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:27.171982+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf936ca000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf936cb400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 199 ms_handle_reset con 0x55bf936ca000 session 0x55bf93f10000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 122650624 unmapped: 27598848 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 199 heartbeat osd_stat(store_statfs(0x1ae440000/0x0/0x1bfc00000, data 0xa5bc76a/0xa70d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [1,2,3,4,5] op hist [0,0,0,1,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 199 ms_handle_reset con 0x55bf936cb400 session 0x55bf8fa49e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:28.172169+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 122167296 unmapped: 28082176 heap: 150249472 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf8eb92000 session 0x55bf91691c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf900f1800 session 0x55bf91e4c3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf90434d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf92125400 session 0x55bf9463c3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:29.172446+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 122298368 unmapped: 36347904 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:30.172584+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf936ca000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf936ca000 session 0x55bf913e10e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2738827 data_alloc: 285212672 data_used: 917504
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 122413056 unmapped: 36233216 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:31.172752+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf8eb92000 session 0x55bf93f0e000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 122535936 unmapped: 36110336 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:32.172897+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf93f0e3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 122544128 unmapped: 36102144 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf92125400 session 0x55bf93f0e5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:33.173044+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf91e58400 session 0x55bf93f0f0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 heartbeat osd_stat(store_statfs(0x1a9d48000/0x0/0x1bfc00000, data 0xfc8e6fe/0xfde5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 123584512 unmapped: 35061760 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 ms_handle_reset con 0x55bf91e58800 session 0x55bf91a91c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:34.173243+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 123592704 unmapped: 35053568 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.750410080s of 10.578747749s, submitted: 127
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:35.173411+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3036293 data_alloc: 285212672 data_used: 933888
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 132087808 unmapped: 26558464 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf8eb92000 session 0x55bf8f8fd4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf91e58000 session 0x55bf94d6a000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:36.173558+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf913201e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf91e58c00 session 0x55bf93f0ed20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf91e58400 session 0x55bf916a1680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf900f1800 session 0x55bf93f10d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 123772928 unmapped: 34873344 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:37.173742+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 heartbeat osd_stat(store_statfs(0x1a8543000/0x0/0x1bfc00000, data 0x11490b28/0x115eb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 123781120 unmapped: 34865152 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 ms_handle_reset con 0x55bf8eb92000 session 0x55bf93f0fa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:38.173900+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 132292608 unmapped: 26353664 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:39.174062+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 202 handle_osd_map epochs [202,202], i have 202, src has [1,202]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 202 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf93f0f2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 202 ms_handle_reset con 0x55bf91e58000 session 0x55bf91d241e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e59800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124026880 unmapped: 34619392 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:40.174264+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3255776 data_alloc: 285212672 data_used: 958464
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124043264 unmapped: 34603008 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 203 heartbeat osd_stat(store_statfs(0x1a7541000/0x0/0x1bfc00000, data 0x12492de4/0x125ec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:41.174477+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124133376 unmapped: 34512896 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:42.174913+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124174336 unmapped: 34471936 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 203 ms_handle_reset con 0x55bf91e59800 session 0x55bf9463cd20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 203 ms_handle_reset con 0x55bf91e58c00 session 0x55bf91a90000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:43.175219+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 204 heartbeat osd_stat(store_statfs(0x1a7e8f000/0x0/0x1bfc00000, data 0x11b41419/0x11c9d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124264448 unmapped: 34381824 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:44.175353+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124280832 unmapped: 34365440 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.840657234s of 10.171233177s, submitted: 187
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:45.175573+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3185657 data_alloc: 285212672 data_used: 962560
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124280832 unmapped: 34365440 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 204 ms_handle_reset con 0x55bf92125400 session 0x55bf94639680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 204 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9463c5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:46.175781+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 124280832 unmapped: 34365440 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:47.175952+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 205 ms_handle_reset con 0x55bf900f1800 session 0x55bf90cb2f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 133849088 unmapped: 24797184 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:48.176123+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 ms_handle_reset con 0x55bf8eb92000 session 0x55bf9242ba40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 ms_handle_reset con 0x55bf8eb92000 session 0x55bf916a03c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 ms_handle_reset con 0x55bf91e58000 session 0x55bf937f14a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 125542400 unmapped: 33103872 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 handle_osd_map epochs [206,206], i have 206, src has [1,206]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf937912c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:49.176326+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 heartbeat osd_stat(store_statfs(0x1a6c01000/0x0/0x1bfc00000, data 0x12dcc27b/0x12f2c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 125599744 unmapped: 33046528 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 207 ms_handle_reset con 0x55bf900f1800 session 0x55bf9166fe00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:50.176498+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3358094 data_alloc: 285212672 data_used: 974848
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 134053888 unmapped: 24592384 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:51.176715+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 134201344 unmapped: 24444928 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 208 ms_handle_reset con 0x55bf92125400 session 0x55bf93f11a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:52.176923+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 208 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 125861888 unmapped: 32784384 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 209 ms_handle_reset con 0x55bf91e58c00 session 0x55bf91e4cb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:53.177182+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 210 heartbeat osd_stat(store_statfs(0x1a45d7000/0x0/0x1bfc00000, data 0x153eb0a7/0x15554000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 134307840 unmapped: 24338432 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:54.177376+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 211 ms_handle_reset con 0x55bf92125400 session 0x55bf9463d0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 126091264 unmapped: 32555008 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.981330872s of 10.018521309s, submitted: 218
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:55.177549+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3757525 data_alloc: 285212672 data_used: 1003520
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 126091264 unmapped: 32555008 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:56.177718+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 212 ms_handle_reset con 0x55bf8eb92000 session 0x55bf946390e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 212 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf90c8d0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 134651904 unmapped: 23994368 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 212 ms_handle_reset con 0x55bf900f1800 session 0x55bf916a12c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:57.177854+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128319488 unmapped: 30326784 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:58.178010+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 ms_handle_reset con 0x55bf8eb92000 session 0x55bf91a8cb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 heartbeat osd_stat(store_statfs(0x1a0602000/0x0/0x1bfc00000, data 0x193bcc73/0x1952b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128491520 unmapped: 30154752 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:58:59.178218+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf8f7fa5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137019392 unmapped: 21626880 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 ms_handle_reset con 0x55bf900f1800 session 0x55bf935def00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:00.178376+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4273471 data_alloc: 285212672 data_used: 1015808
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 ms_handle_reset con 0x55bf91e58c00 session 0x55bf8f5b25a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf92125400
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 ms_handle_reset con 0x55bf92125400 session 0x55bf9242a1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128147456 unmapped: 30498816 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:01.178511+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128172032 unmapped: 30474240 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:02.178696+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128311296 unmapped: 30334976 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 heartbeat osd_stat(store_statfs(0x19de06000/0x0/0x1bfc00000, data 0x1bbbcadb/0x1bd28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:03.178842+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 214 heartbeat osd_stat(store_statfs(0x19ce06000/0x0/0x1bfc00000, data 0x1cbbcadb/0x1cd28000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128458752 unmapped: 30187520 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:04.178987+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 214 ms_handle_reset con 0x55bf8eb92000 session 0x55bf937f1e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.103426933s of 10.000686646s, submitted: 188
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:05.179124+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 215 ms_handle_reset con 0x55bf8eb93800 session 0x55bf94d51e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4505089 data_alloc: 285212672 data_used: 1032192
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 215 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9242a5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:06.179257+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 215 ms_handle_reset con 0x55bf900f1800 session 0x55bf9463da40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128557056 unmapped: 30089216 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:07.179407+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 215 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128606208 unmapped: 30040064 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:08.179559+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 handle_osd_map epochs [215,216], i have 216, src has [1,216]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 ms_handle_reset con 0x55bf91e58c00 session 0x55bf937f10e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 127197184 unmapped: 31449088 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:09.179756+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 heartbeat osd_stat(store_statfs(0x1b65fa000/0x0/0x1bfc00000, data 0x33c33b2/0x3533000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 ms_handle_reset con 0x55bf8eb92000 session 0x55bf935df0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 ms_handle_reset con 0x55bf8eb93800 session 0x55bf935dfa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 127205376 unmapped: 31440896 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:10.179996+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1719564 data_alloc: 285212672 data_used: 1048576
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 127205376 unmapped: 31440896 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:11.180214+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 127205376 unmapped: 31440896 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:12.180354+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 130768896 unmapped: 27877376 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:13.180498+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 130768896 unmapped: 27877376 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:14.180663+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf94d51860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b585d000/0x0/0x1bfc00000, data 0x4154787/0x42c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128303104 unmapped: 30343168 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57e5000/0x0/0x1bfc00000, data 0x41d7715/0x4349000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.077154160s of 10.215615273s, submitted: 287
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf900f1800 session 0x55bf9466ba40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:15.180881+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1830045 data_alloc: 285212672 data_used: 1064960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:16.181031+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128303104 unmapped: 30343168 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e59c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf91e58000 session 0x55bf91310b40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf91e59c00 session 0x55bf91a8c5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf91e58c00 session 0x55bf91645680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 14K writes, 56K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                                          Cumulative WAL: 14K writes, 4683 syncs, 3.11 writes per sync, written: 0.04 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 9472 writes, 33K keys, 9472 commit groups, 1.0 writes per commit group, ingest: 25.58 MB, 0.04 MB/s
                                                          Interval WAL: 9472 writes, 3987 syncs, 2.38 writes per sync, written: 0.02 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:17.181190+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128311296 unmapped: 30334976 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8eb92000 session 0x55bf94d51e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf935dfe00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8eb93800 session 0x55bf904672c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:18.181490+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128319488 unmapped: 30326784 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57e0000/0x0/0x1bfc00000, data 0x41d7872/0x434e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:19.181802+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128319488 unmapped: 30326784 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:20.181958+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128458752 unmapped: 30187520 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1839179 data_alloc: 285212672 data_used: 1069056
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8eb92000 session 0x55bf9401e3c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8eb93800 session 0x55bf90c8d0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:21.182168+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf91646960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:22.182357+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf91e58c00 session 0x55bf8f8fdc20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:23.182572+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e59c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57bf000/0x0/0x1bfc00000, data 0x41f9925/0x436f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf91e59c00 session 0x55bf91d24f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57bf000/0x0/0x1bfc00000, data 0x41f9925/0x436f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:24.182777+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:25.182992+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1836639 data_alloc: 285212672 data_used: 1069056
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.320854187s of 10.663363457s, submitted: 78
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:26.183221+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:27.183459+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57c0000/0x0/0x1bfc00000, data 0x41f9914/0x436e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:28.183690+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:29.183892+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:30.184040+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1836463 data_alloc: 285212672 data_used: 1069056
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:31.184203+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57c0000/0x0/0x1bfc00000, data 0x41f9914/0x436e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128466944 unmapped: 30179328 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:32.184360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128483328 unmapped: 30162944 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:33.184511+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128483328 unmapped: 30162944 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:34.184678+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128483328 unmapped: 30162944 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:35.184863+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128483328 unmapped: 30162944 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1836871 data_alloc: 285212672 data_used: 1069056
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:36.185069+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128483328 unmapped: 30162944 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.789338112s of 10.848747253s, submitted: 9
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 ms_handle_reset con 0x55bf8eb92000 session 0x55bf91d24000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:37.185372+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 heartbeat osd_stat(store_statfs(0x1b57bc000/0x0/0x1bfc00000, data 0x41fc974/0x4372000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128499712 unmapped: 30146560 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:38.185561+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128499712 unmapped: 30146560 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 218 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 218 ms_handle_reset con 0x55bf8eb93800 session 0x55bf91310960
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:39.185796+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128499712 unmapped: 30146560 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 219 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf91a8cb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:40.185961+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 219 ms_handle_reset con 0x55bf91e58c00 session 0x55bf936ee780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128532480 unmapped: 30113792 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1853580 data_alloc: 285212672 data_used: 1085440
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:41.186145+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf900f1800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 220 ms_handle_reset con 0x55bf900f1800 session 0x55bf936efa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 128565248 unmapped: 30081024 heap: 158646272 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:42.186317+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139223040 unmapped: 27828224 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 220 heartbeat osd_stat(store_statfs(0x1b0ba7000/0x0/0x1bfc00000, data 0x8a08b42/0x8b87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:43.186487+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 132046848 unmapped: 35004416 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:44.186612+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 221 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf916434a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136249344 unmapped: 30801920 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:45.186780+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 221 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 222 ms_handle_reset con 0x55bf91e58c00 session 0x55bf9402e780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136658944 unmapped: 30392320 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3054779 data_alloc: 285212672 data_used: 1097728
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:46.186930+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149438464 unmapped: 17612800 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 8.579443932s of 10.019932747s, submitted: 218
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:47.187073+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 141066240 unmapped: 25985024 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 223 handle_osd_map epochs [223,223], i have 223, src has [1,223]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 223 ms_handle_reset con 0x55bf91e58000 session 0x55bf9401e000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:48.187281+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 223 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 224 heartbeat osd_stat(store_statfs(0x1a679a000/0x0/0x1bfc00000, data 0x12e0f6b2/0x12f94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,1,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136912896 unmapped: 30138368 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:49.187455+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 133791744 unmapped: 33259520 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:50.187605+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150683648 unmapped: 16367616 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4266940 data_alloc: 285212672 data_used: 1122304
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 224 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:51.187693+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 ms_handle_reset con 0x55bf91db4800 session 0x55bf904325a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 134094848 unmapped: 32956416 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 225 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:52.187861+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 226 ms_handle_reset con 0x55bf91db4000 session 0x55bf91e4cd20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 146792448 unmapped: 20258816 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:53.187999+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 147030016 unmapped: 20021248 heap: 167051264 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:54.188119+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 heartbeat osd_stat(store_statfs(0x197f87000/0x0/0x1bfc00000, data 0x21618898/0x217a7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 ms_handle_reset con 0x55bf8eb93800 session 0x55bf90c8d2c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf8f8fcb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 ms_handle_reset con 0x55bf8eb92000 session 0x55bf936ee5a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 134799360 unmapped: 36454400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 ms_handle_reset con 0x55bf91db4800 session 0x55bf8f7fbe00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:55.188296+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 ms_handle_reset con 0x55bf91db4000 session 0x55bf9466a000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137035776 unmapped: 34217984 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1694498816 meta_used: 5265997 data_alloc: 285212672 data_used: 1146880
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 heartbeat osd_stat(store_statfs(0x197389000/0x0/0x1bfc00000, data 0x22218c0c/0x223a5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [1,0,0,2])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:56.188443+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 ms_handle_reset con 0x55bf91e58000 session 0x55bf9463da40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 ms_handle_reset con 0x55bf8eb92000 session 0x55bf935def00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136249344 unmapped: 35004416 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 ms_handle_reset con 0x55bf8eb93800 session 0x55bf935df0e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 7.296254635s of 10.077240944s, submitted: 497
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 229 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf935dfa40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:57.188604+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 229 ms_handle_reset con 0x55bf91db4000 session 0x55bf94639860
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 229 ms_handle_reset con 0x55bf8eb92000 session 0x55bf8fa49e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136331264 unmapped: 34922496 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:58.188766+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 230 ms_handle_reset con 0x55bf8eb93800 session 0x55bf91691c20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136421376 unmapped: 34832384 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 230 heartbeat osd_stat(store_statfs(0x1b4b55000/0x0/0x1bfc00000, data 0x424b940/0x43d7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 230 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf90cb2780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T09:59:59.188923+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136699904 unmapped: 34553856 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:00.189079+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 230 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 231 ms_handle_reset con 0x55bf91db4000 session 0x55bf91d24f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136806400 unmapped: 34447360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1884536 data_alloc: 285212672 data_used: 1167360
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:01.189259+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136806400 unmapped: 34447360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:02.189375+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 231 heartbeat osd_stat(store_statfs(0x1b6907000/0x0/0x1bfc00000, data 0x2c96fcf/0x2e25000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 136847360 unmapped: 34406400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:03.189557+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137134080 unmapped: 34119680 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.2] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.10] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1c] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:04.189731+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 138272768 unmapped: 32980992 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 51
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:05.189966+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 138461184 unmapped: 32792576 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1903268 data_alloc: 285212672 data_used: 1179648
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 234 ms_handle_reset con 0x55bf91e58000 session 0x55bf91d25a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:06.190112+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 234 ms_handle_reset con 0x55bf8eb92000 session 0x55bf94639e00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 234 heartbeat osd_stat(store_statfs(0x1b6891000/0x0/0x1bfc00000, data 0x2d08ff2/0x2e9c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137863168 unmapped: 33390592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 234 handle_osd_map epochs [234,235], i have 234, src has [1,235]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.249840736s of 10.053759575s, submitted: 248
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:07.192819+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 235 ms_handle_reset con 0x55bf8eb93800 session 0x55bf916434a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 235 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9166e780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137936896 unmapped: 33316864 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 235 handle_osd_map epochs [235,236], i have 235, src has [1,236]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 236 ms_handle_reset con 0x55bf91db4000 session 0x55bf91643a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:08.192987+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137986048 unmapped: 33267712 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:09.193219+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137871360 unmapped: 33382400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:10.193466+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91e58000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 ms_handle_reset con 0x55bf91e58000 session 0x55bf9402e000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137871360 unmapped: 33382400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1913089 data_alloc: 285212672 data_used: 1179648
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:11.193701+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 heartbeat osd_stat(store_statfs(0x1b6846000/0x0/0x1bfc00000, data 0x2d519c5/0x2ee7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 ms_handle_reset con 0x55bf8eb92000 session 0x55bf937f12c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 137879552 unmapped: 33374208 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 heartbeat osd_stat(store_statfs(0x1b6846000/0x0/0x1bfc00000, data 0x2d519c5/0x2ee7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [0,0,2])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:12.195734+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 heartbeat osd_stat(store_statfs(0x1b6818000/0x0/0x1bfc00000, data 0x2d80192/0x2f16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 138969088 unmapped: 32284672 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:13.195910+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf8eb93800 session 0x55bf9463da40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 138977280 unmapped: 32276480 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 heartbeat osd_stat(store_statfs(0x1b6812000/0x0/0x1bfc00000, data 0x2d825b4/0x2f1b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:14.196075+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf91db4000 session 0x55bf93f103c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9463cd20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 138993664 unmapped: 32260096 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:15.196256+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf91db4800 session 0x55bf93f10d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf8eb92000 session 0x55bf935df4a0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139141120 unmapped: 32112640 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1929550 data_alloc: 285212672 data_used: 1196032
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:16.196539+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf8eb93800 session 0x55bf916441e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139141120 unmapped: 32112640 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.360792160s of 10.185649872s, submitted: 192
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:17.196723+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf91a90d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139157504 unmapped: 32096256 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 ms_handle_reset con 0x55bf91db4000 session 0x55bf9463cb40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:18.196932+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139190272 unmapped: 32063488 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:19.197330+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 ms_handle_reset con 0x55bf91db4800 session 0x55bf9401ef00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 heartbeat osd_stat(store_statfs(0x1b67c4000/0x0/0x1bfc00000, data 0x2dd2451/0x2f6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139223040 unmapped: 32030720 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:20.197727+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 heartbeat osd_stat(store_statfs(0x1b67c5000/0x0/0x1bfc00000, data 0x2dd2329/0x2f68000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139223040 unmapped: 32030720 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1932183 data_alloc: 285212672 data_used: 1204224
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:21.197957+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139427840 unmapped: 31825920 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:22.198125+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139427840 unmapped: 31825920 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:23.198269+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139567104 unmapped: 31686656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:24.198568+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139698176 unmapped: 31555584 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:25.198706+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 240 heartbeat osd_stat(store_statfs(0x1b676e000/0x0/0x1bfc00000, data 0x2e27762/0x2fc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139575296 unmapped: 31678464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1938941 data_alloc: 285212672 data_used: 1216512
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:26.198951+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 240 heartbeat osd_stat(store_statfs(0x1b676e000/0x0/0x1bfc00000, data 0x2e27762/0x2fc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 139575296 unmapped: 31678464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.610811234s of 10.002357483s, submitted: 95
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:27.199195+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 140787712 unmapped: 30466048 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 240 handle_osd_map epochs [240,241], i have 240, src has [1,241]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:28.199333+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 140730368 unmapped: 30523392 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:29.199593+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 140730368 unmapped: 30523392 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:30.199736+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 242 heartbeat osd_stat(store_statfs(0x1b671d000/0x0/0x1bfc00000, data 0x2e7378d/0x3010000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1954703 data_alloc: 285212672 data_used: 1228800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 140894208 unmapped: 30359552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:31.199958+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 243 ms_handle_reset con 0x55bf8eb92000 session 0x55bf8fa48d20
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 140935168 unmapped: 30318592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:32.200095+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 140976128 unmapped: 30277632 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 243 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 244 heartbeat osd_stat(store_statfs(0x1b66ee000/0x0/0x1bfc00000, data 0x2e9edaf/0x303d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:33.200245+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 30171136 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 244 ms_handle_reset con 0x55bf8eb93800 session 0x55bf91644f00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:34.200448+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 141082624 unmapped: 30171136 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:35.200705+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1966921 data_alloc: 285212672 data_used: 1228800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 141099008 unmapped: 30154752 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:36.200842+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8fa01c00
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 ms_handle_reset con 0x55bf8fa01c00 session 0x55bf9463c1e0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142336000 unmapped: 28917760 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.593835831s of 10.003067017s, submitted: 109
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:37.201025+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 heartbeat osd_stat(store_statfs(0x1b628c000/0x0/0x1bfc00000, data 0x2f0085e/0x30a2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 245 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142319616 unmapped: 28934144 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:38.201232+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 52
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 247 ms_handle_reset con 0x55bf91db4000 session 0x55bf90458780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142336000 unmapped: 28917760 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:39.201566+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142434304 unmapped: 28819456 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:40.201824+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1976837 data_alloc: 285212672 data_used: 1241088
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142434304 unmapped: 28819456 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:41.202016+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142434304 unmapped: 28819456 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:42.202222+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 247 heartbeat osd_stat(store_statfs(0x1b6258000/0x0/0x1bfc00000, data 0x2f3091c/0x30d6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [1,0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142450688 unmapped: 28803072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:43.202454+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142450688 unmapped: 28803072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:44.202585+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 248 heartbeat osd_stat(store_statfs(0x1b6237000/0x0/0x1bfc00000, data 0x2f4e6e2/0x30f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 248 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf91db4800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142458880 unmapped: 28794880 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 249 ms_handle_reset con 0x55bf91db4800 session 0x55bf92355680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:45.202823+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1991737 data_alloc: 285212672 data_used: 1253376
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 141492224 unmapped: 29761536 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb92000
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:46.202951+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 250 handle_osd_map epochs [250,250], i have 250, src has [1,250]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 250 ms_handle_reset con 0x55bf8eb92000 session 0x55bf94d512c0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 250 heartbeat osd_stat(store_statfs(0x1b620b000/0x0/0x1bfc00000, data 0x2f77072/0x3122000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 141500416 unmapped: 29753344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.577269554s of 10.002587318s, submitted: 124
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:47.203123+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142581760 unmapped: 28672000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:48.203340+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142761984 unmapped: 28491776 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:49.203537+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142802944 unmapped: 28450816 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:50.203974+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1992245 data_alloc: 285212672 data_used: 1253376
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142802944 unmapped: 28450816 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:51.204219+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 143024128 unmapped: 28229632 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:52.204904+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 250 heartbeat osd_stat(store_statfs(0x1b61bf000/0x0/0x1bfc00000, data 0x2fc2b62/0x316f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142671872 unmapped: 28581888 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:53.205351+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b61ae000/0x0/0x1bfc00000, data 0x2fd3e98/0x3180000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[3.f] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142671872 unmapped: 28581888 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:54.205663+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b61a7000/0x0/0x1bfc00000, data 0x2fd849a/0x3186000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142860288 unmapped: 28393472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:55.206064+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1996077 data_alloc: 285212672 data_used: 1265664
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142934016 unmapped: 28319744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:56.206455+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b618e000/0x0/0x1bfc00000, data 0x2ff3444/0x31a0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.803843498s of 10.000458717s, submitted: 49
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142950400 unmapped: 28303360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:57.206719+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b617b000/0x0/0x1bfc00000, data 0x30060f3/0x31b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142950400 unmapped: 28303360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:58.206938+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142966784 unmapped: 28286976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b615d000/0x0/0x1bfc00000, data 0x3024ab5/0x31d1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:00:59.207165+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142966784 unmapped: 28286976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:00.207349+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2001305 data_alloc: 285212672 data_used: 1265664
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 142974976 unmapped: 28278784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:01.207664+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 143130624 unmapped: 28123136 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 heartbeat osd_stat(store_statfs(0x1b612b000/0x0/0x1bfc00000, data 0x3056ebb/0x3203000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:02.207914+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 144195584 unmapped: 27058176 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:03.208200+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 144293888 unmapped: 26959872 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:04.208411+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 252 heartbeat osd_stat(store_statfs(0x1b6106000/0x0/0x1bfc00000, data 0x3079934/0x3227000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 144392192 unmapped: 26861568 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:05.208745+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2025381 data_alloc: 285212672 data_used: 1277952
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 144490496 unmapped: 26763264 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:06.208998+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.674788475s of 10.002058983s, submitted: 93
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 144646144 unmapped: 26607616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:07.209176+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 145752064 unmapped: 25501696 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:08.209527+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 145801216 unmapped: 25452544 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:09.209755+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 254 handle_osd_map epochs [253,254], i have 254, src has [1,254]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 254 heartbeat osd_stat(store_statfs(0x1b606f000/0x0/0x1bfc00000, data 0x3109eee/0x32be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 147128320 unmapped: 24125440 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:10.210109+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2045077 data_alloc: 285212672 data_used: 1302528
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 147218432 unmapped: 24035328 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:11.210227+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 147431424 unmapped: 23822336 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:12.210375+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 146980864 unmapped: 24272896 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:13.210527+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 146980864 unmapped: 24272896 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:14.210716+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4e35000/0x0/0x1bfc00000, data 0x31a62ed/0x3359000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 147152896 unmapped: 24100864 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:15.211125+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2046759 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 148291584 unmapped: 22962176 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:16.211330+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.405538559s of 10.003592491s, submitted: 156
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149348352 unmapped: 21905408 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:17.211527+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4db9000/0x0/0x1bfc00000, data 0x321eb8d/0x33d5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149454848 unmapped: 21798912 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:18.211680+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 21635072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:19.211920+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 21635072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:20.212130+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2048815 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 21635072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:21.212341+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 21635072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:22.212559+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149618688 unmapped: 21635072 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:23.212753+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4d80000/0x0/0x1bfc00000, data 0x325a630/0x340e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 149643264 unmapped: 21610496 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:24.213048+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 53
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 148963328 unmapped: 22290432 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:25.214092+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4d14000/0x0/0x1bfc00000, data 0x32c54d6/0x347a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2061951 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 148979712 unmapped: 22274048 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:26.214956+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.660450935s of 10.001140594s, submitted: 77
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150036480 unmapped: 21217280 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:27.215720+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150151168 unmapped: 21102592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:28.216339+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150396928 unmapped: 20856832 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:29.216512+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150396928 unmapped: 20856832 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:30.216948+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4c54000/0x0/0x1bfc00000, data 0x3383251/0x3537000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2074273 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150478848 unmapped: 20774912 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:31.217299+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 150478848 unmapped: 20774912 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:32.217485+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4c1f000/0x0/0x1bfc00000, data 0x33b825d/0x356e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151527424 unmapped: 19726336 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:33.217687+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151543808 unmapped: 19709952 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:34.217832+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151552000 unmapped: 19701760 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:35.217980+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2086941 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151552000 unmapped: 19701760 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:36.218199+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.533477783s of 10.002933502s, submitted: 110
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151617536 unmapped: 19636224 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:37.218365+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151617536 unmapped: 19636224 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:38.218482+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4b8e000/0x0/0x1bfc00000, data 0x344e661/0x3600000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:39.218737+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151617536 unmapped: 19636224 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3748941254' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4b8e000/0x0/0x1bfc00000, data 0x344e661/0x3600000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:40.218863+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151617536 unmapped: 19636224 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2080781 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:41.219006+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151617536 unmapped: 19636224 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:42.219208+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151617536 unmapped: 19636224 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4b8e000/0x0/0x1bfc00000, data 0x344e661/0x3600000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:43.219457+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151625728 unmapped: 19628032 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:44.219719+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 heartbeat osd_stat(store_statfs(0x1b4b8b000/0x0/0x1bfc00000, data 0x344e797/0x3602000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:45.219882+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2082595 data_alloc: 285212672 data_used: 1314816
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:46.220052+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.007639885s of 10.079208374s, submitted: 17
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:47.220283+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:48.220529+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 256 heartbeat osd_stat(store_statfs(0x1b4b89000/0x0/0x1bfc00000, data 0x3450a62/0x3604000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:49.220732+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:50.220933+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2084035 data_alloc: 285212672 data_used: 1327104
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:51.221155+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:52.221400+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:53.221530+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151633920 unmapped: 19619840 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b88000/0x0/0x1bfc00000, data 0x3450b2b/0x3605000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:54.221723+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151707648 unmapped: 19546112 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:55.221898+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151707648 unmapped: 19546112 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:56.222126+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2088053 data_alloc: 285212672 data_used: 1339392
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151707648 unmapped: 19546112 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.837702751s of 10.004058838s, submitted: 82
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:57.222337+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151732224 unmapped: 19521536 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:58.222595+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b84000/0x0/0x1bfc00000, data 0x3452d6b/0x3609000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:01:59.222923+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:00.223193+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:01.223372+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2089435 data_alloc: 285212672 data_used: 1339392
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b84000/0x0/0x1bfc00000, data 0x3452d6b/0x3609000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:02.223516+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:03.223706+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:04.223854+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b85000/0x0/0x1bfc00000, data 0x3452d6b/0x3609000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:05.224019+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151740416 unmapped: 19513344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:06.224152+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b85000/0x0/0x1bfc00000, data 0x3452d6b/0x3609000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2088069 data_alloc: 285212672 data_used: 1339392
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151748608 unmapped: 19505152 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.972807884s of 10.022805214s, submitted: 7
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:07.224295+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151748608 unmapped: 19505152 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:08.224409+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151748608 unmapped: 19505152 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b84000/0x0/0x1bfc00000, data 0x3452e06/0x360a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:09.224613+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151756800 unmapped: 19496960 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:10.224827+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151756800 unmapped: 19496960 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:11.225030+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2090571 data_alloc: 285212672 data_used: 1339392
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151756800 unmapped: 19496960 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:12.225211+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151764992 unmapped: 19488768 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b83000/0x0/0x1bfc00000, data 0x3452ea1/0x360b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:13.225359+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151764992 unmapped: 19488768 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:14.225545+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151781376 unmapped: 19472384 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:15.225918+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151781376 unmapped: 19472384 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 heartbeat osd_stat(store_statfs(0x1b4b84000/0x0/0x1bfc00000, data 0x3452e06/0x360a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:16.226080+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2090427 data_alloc: 285212672 data_used: 1339392
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151789568 unmapped: 19464192 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.931864738s of 10.006960869s, submitted: 17
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:17.226287+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151797760 unmapped: 19456000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:18.226457+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 151814144 unmapped: 19439616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 258 heartbeat osd_stat(store_statfs(0x1b4b7e000/0x0/0x1bfc00000, data 0x3455272/0x360f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 258 handle_osd_map epochs [258,259], i have 258, src has [1,259]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:19.226726+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 153968640 unmapped: 17285120 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:20.226928+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 153968640 unmapped: 17285120 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 259 heartbeat osd_stat(store_statfs(0x1b39db000/0x0/0x1bfc00000, data 0x34575d8/0x3612000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:21.227099+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2100125 data_alloc: 285212672 data_used: 1351680
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 153968640 unmapped: 17285120 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:22.227286+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 153968640 unmapped: 17285120 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:23.227503+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 153993216 unmapped: 17260544 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:24.227697+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 153993216 unmapped: 17260544 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:25.227869+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154009600 unmapped: 17244160 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 54
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 261 heartbeat osd_stat(store_statfs(0x1b39d0000/0x0/0x1bfc00000, data 0x345be18/0x361d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:26.228016+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 261 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 261 handle_osd_map epochs [262,262], i have 262, src has [1,262]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2113535 data_alloc: 285212672 data_used: 1363968
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154075136 unmapped: 17178624 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.565257072s of 10.015598297s, submitted: 230
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:27.228197+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154124288 unmapped: 17129472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:28.228355+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154124288 unmapped: 17129472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 264 heartbeat osd_stat(store_statfs(0x1b39c6000/0x0/0x1bfc00000, data 0x346276e/0x3627000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:29.228669+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154124288 unmapped: 17129472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:30.228889+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154140672 unmapped: 17113088 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:31.229106+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2116591 data_alloc: 285212672 data_used: 1363968
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154140672 unmapped: 17113088 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 264 heartbeat osd_stat(store_statfs(0x1b39c9000/0x0/0x1bfc00000, data 0x3462638/0x3625000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:32.229273+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154140672 unmapped: 17113088 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:33.229482+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154148864 unmapped: 17104896 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 266 heartbeat osd_stat(store_statfs(0x1b39c4000/0x0/0x1bfc00000, data 0x3464a19/0x3629000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 266 heartbeat osd_stat(store_statfs(0x1b39c4000/0x0/0x1bfc00000, data 0x3464a19/0x3629000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:34.229697+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154181632 unmapped: 17072128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:35.229907+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154181632 unmapped: 17072128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:36.230056+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2124467 data_alloc: 285212672 data_used: 1376256
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154181632 unmapped: 17072128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 266 heartbeat osd_stat(store_statfs(0x1b39bf000/0x0/0x1bfc00000, data 0x3466ca3/0x362d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:37.230210+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154181632 unmapped: 17072128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.616575241s of 10.760381699s, submitted: 65
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:38.230367+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154214400 unmapped: 17039360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:39.230530+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154214400 unmapped: 17039360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:40.230733+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154214400 unmapped: 17039360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:41.230947+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2128885 data_alloc: 285212672 data_used: 1388544
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154214400 unmapped: 17039360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 267 heartbeat osd_stat(store_statfs(0x1b39bb000/0x0/0x1bfc00000, data 0x3468f8c/0x3632000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:42.231137+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154255360 unmapped: 16998400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:43.231294+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154255360 unmapped: 16998400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b39b7000/0x0/0x1bfc00000, data 0x346b38d/0x3636000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:44.231462+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154255360 unmapped: 16998400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b39b8000/0x0/0x1bfc00000, data 0x346b38d/0x3636000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:45.231665+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154288128 unmapped: 16965632 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 268 heartbeat osd_stat(store_statfs(0x1b39b9000/0x0/0x1bfc00000, data 0x346b2f2/0x3635000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:46.231820+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2130477 data_alloc: 285212672 data_used: 1400832
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154296320 unmapped: 16957440 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:47.232005+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154320896 unmapped: 16932864 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:48.232164+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.398365021s of 10.552735329s, submitted: 78
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154320896 unmapped: 16932864 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 270 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 270 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:49.232419+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154320896 unmapped: 16932864 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:50.232606+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154320896 unmapped: 16932864 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:51.232803+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2135645 data_alloc: 285212672 data_used: 1413120
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154329088 unmapped: 16924672 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 270 heartbeat osd_stat(store_statfs(0x1b39b2000/0x0/0x1bfc00000, data 0x346f902/0x363c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:52.232983+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154329088 unmapped: 16924672 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:53.233153+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154337280 unmapped: 16916480 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:54.233325+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 55
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154648576 unmapped: 16605184 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:55.233503+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154648576 unmapped: 16605184 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b39ac000/0x0/0x1bfc00000, data 0x3471c0b/0x3641000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:56.233694+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2141583 data_alloc: 285212672 data_used: 1429504
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154648576 unmapped: 16605184 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b39ac000/0x0/0x1bfc00000, data 0x3471c0b/0x3641000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:57.233910+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154656768 unmapped: 16596992 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:58.234089+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154656768 unmapped: 16596992 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.158243179s of 10.281229973s, submitted: 231
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:02:59.234297+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154656768 unmapped: 16596992 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:00.234491+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154656768 unmapped: 16596992 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:01.234687+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b39ad000/0x0/0x1bfc00000, data 0x3471c0b/0x3641000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2141957 data_alloc: 285212672 data_used: 1429504
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154656768 unmapped: 16596992 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:02.234842+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154664960 unmapped: 16588800 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:03.235006+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154664960 unmapped: 16588800 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:04.235160+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154664960 unmapped: 16588800 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b39ad000/0x0/0x1bfc00000, data 0x3471c0b/0x3641000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:05.235379+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b39ad000/0x0/0x1bfc00000, data 0x3471c0b/0x3641000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154697728 unmapped: 16556032 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:06.235548+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2144467 data_alloc: 285212672 data_used: 1429504
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154697728 unmapped: 16556032 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 heartbeat osd_stat(store_statfs(0x1b39ac000/0x0/0x1bfc00000, data 0x3471ca6/0x3642000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:07.235763+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154730496 unmapped: 16523264 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 272 heartbeat osd_stat(store_statfs(0x1b39ac000/0x0/0x1bfc00000, data 0x3471ca6/0x3642000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:08.235942+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154763264 unmapped: 16490496 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.333109856s of 10.425970078s, submitted: 44
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:09.236193+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154763264 unmapped: 16490496 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:10.236360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154763264 unmapped: 16490496 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:11.236529+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2146041 data_alloc: 285212672 data_used: 1441792
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154771456 unmapped: 16482304 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 272 heartbeat osd_stat(store_statfs(0x1b39aa000/0x0/0x1bfc00000, data 0x3473f41/0x3644000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:12.236747+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154787840 unmapped: 16465920 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:13.236905+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154787840 unmapped: 16465920 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:14.237111+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154796032 unmapped: 16457728 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 273 heartbeat osd_stat(store_statfs(0x1b39a4000/0x0/0x1bfc00000, data 0x347622a/0x3649000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:15.237249+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154812416 unmapped: 16441344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:16.237438+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2152737 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154812416 unmapped: 16441344 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:17.237601+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _renew_subs
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154828800 unmapped: 16424960 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:18.237766+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154828800 unmapped: 16424960 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:19.237943+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 274 heartbeat osd_stat(store_statfs(0x1b39a1000/0x0/0x1bfc00000, data 0x3478590/0x364c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154845184 unmapped: 16408576 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 274 heartbeat osd_stat(store_statfs(0x1b39a1000/0x0/0x1bfc00000, data 0x3478590/0x364c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:20.238115+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154845184 unmapped: 16408576 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:21.238309+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2155049 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154845184 unmapped: 16408576 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.719591141s of 12.839828491s, submitted: 73
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:22.238498+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154845184 unmapped: 16408576 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:23.238725+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 274 handle_osd_map epochs [274,275], i have 274, src has [1,275]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.3] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.15] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b39a1000/0x0/0x1bfc00000, data 0x3478590/0x364c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154861568 unmapped: 16392192 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:24.238946+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154861568 unmapped: 16392192 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:25.239134+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154861568 unmapped: 16392192 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:26.239293+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2157875 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:27.239449+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:28.239614+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:29.239824+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:30.239976+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:31.240158+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2157875 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.959703445s of 10.000681877s, submitted: 15
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:32.240334+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:33.240512+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154869760 unmapped: 16384000 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:34.240720+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:35.240869+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:36.241106+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158051 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:37.241275+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:38.241433+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:39.241687+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:40.241866+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:41.242048+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158051 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154877952 unmapped: 16375808 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:42.242240+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:43.242440+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:44.242619+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:45.242844+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:46.243036+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158051 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:47.243190+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:48.243373+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:49.243605+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154886144 unmapped: 16367616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:50.243834+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:51.244029+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158051 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:52.244230+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:53.247867+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:54.248116+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:55.248327+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:56.248549+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158051 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:57.248899+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154894336 unmapped: 16359424 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:58.249101+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:03:59.249401+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:00.249830+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:01.249943+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2158051 data_alloc: 285212672 data_used: 1454080
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:02.250094+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:03.250235+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 32.129329681s of 32.138740540s, submitted: 1
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:04.250368+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a7fe/0x3650000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:05.250509+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154902528 unmapped: 16351232 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 heartbeat osd_stat(store_statfs(0x1b399d000/0x0/0x1bfc00000, data 0x347a80e/0x3651000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:06.250693+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: handle_auth_request added challenge on 0x55bf8eb93800
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2163011 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154918912 unmapped: 16334848 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:07.250840+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 276 handle_osd_map epochs [276,277], i have 276, src has [1,277]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 277 ms_handle_reset con 0x55bf8eb93800 session 0x55bf913e1a40
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154927104 unmapped: 16326656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:08.250975+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 277 heartbeat osd_stat(store_statfs(0x1b3995000/0x0/0x1bfc00000, data 0x347ef22/0x3658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154927104 unmapped: 16326656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:09.251242+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154927104 unmapped: 16326656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:10.251410+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154927104 unmapped: 16326656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:11.251584+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2164199 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154927104 unmapped: 16326656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:12.251856+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154927104 unmapped: 16326656 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:13.252007+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.e] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.a] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 scrub-queue::remove_from_osd_queue removing pg[4.5] failed. State was: not registered w/ OSD
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:14.252173+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3991000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:15.252378+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:16.252595+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2167201 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:17.252857+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:18.253031+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:19.253238+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154935296 unmapped: 16318464 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3991000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:20.253445+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154943488 unmapped: 16310272 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:21.253584+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2167201 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154943488 unmapped: 16310272 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:22.253745+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 154943488 unmapped: 16310272 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:23.253903+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.259136200s of 19.313919067s, submitted: 29
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 ms_handle_reset con 0x55bf8eef6000 session 0x55bf936ac780
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:24.254059+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Got map version 56
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.106:6810/1895764224,v1:172.18.0.106:6811/1895764224]
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:25.254325+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:26.254521+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:27.254705+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:28.254907+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:29.255149+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:30.255470+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:31.255683+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:32.255834+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:33.256058+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:34.256322+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:35.256516+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:36.256724+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:37.256861+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:38.257008+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:40.518272+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:41.518417+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:42.518711+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:43.518856+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:44.519168+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156131328 unmapped: 15122432 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:45.519574+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156164096 unmapped: 15089664 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:46.520190+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:47.520379+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:48.520686+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:49.520898+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:50.521091+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:51.521273+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:52.521425+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:53.521551+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156172288 unmapped: 15081472 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:54.521735+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:55.521966+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:56.522148+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:57.522360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:58.522537+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:04:59.522758+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:00.522968+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:01.523112+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156213248 unmapped: 15040512 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:02.523481+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:03.523705+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:04.523865+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:05.524051+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:06.524226+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:07.524378+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:08.524557+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:09.524754+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156221440 unmapped: 15032320 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:10.524949+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:11.525135+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:12.525276+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:13.525480+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:14.525710+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:15.525833+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:16.526077+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:17.526279+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156229632 unmapped: 15024128 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:18.526440+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:19.526731+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:20.526921+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:21.527104+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:22.527319+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:23.527488+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:24.527715+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156237824 unmapped: 15015936 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:25.527910+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:26.528079+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:27.528263+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:28.528453+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:29.528665+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:30.528816+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:31.529054+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:32.529224+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156246016 unmapped: 15007744 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:33.529422+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:34.529671+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:35.529867+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:36.530046+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:37.530244+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:38.530453+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:39.530852+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:40.531018+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156254208 unmapped: 14999552 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:41.531220+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:42.531363+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:43.531529+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:44.531689+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:45.531870+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:46.532055+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:47.532269+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:48.532579+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156262400 unmapped: 14991360 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:49.532872+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:50.533019+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:51.533244+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:52.533406+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:53.533631+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:54.533858+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:55.534028+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:56.534190+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156270592 unmapped: 14983168 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:57.534360+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:58.534534+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:05:59.534747+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:00.534912+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _send_mon_message to mon.np0005625202 at v2:172.18.0.103:3300/0
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:01.535085+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:02.535299+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:03.535470+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:04.535621+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156278784 unmapped: 14974976 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:05.535867+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:06.536015+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:07.536149+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:08.536271+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:09.536415+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:10.536569+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:11.536726+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:12.536919+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156286976 unmapped: 14966784 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:13.537299+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:14.537445+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:15.537612+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:16.537768+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:17.537899+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:18.538039+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:19.538192+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: osd.0 278 heartbeat osd_stat(store_statfs(0x1b3992000/0x0/0x1bfc00000, data 0x3481170/0x365c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x8c0f9b7), peers [1,2,3,4,5] op hist [])
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:20.538335+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156295168 unmapped: 14958592 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:21.538461+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: bluestore.MempoolThread(0x55bf8de29b60) _resize_shards cache_size: 4047415775 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2166497 data_alloc: 285212672 data_used: 1466368
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156303360 unmapped: 14950400 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:22.538595+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'config diff' '{prefix=config diff}'
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'config show' '{prefix=config show}'
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'counter dump' '{prefix=counter dump}'
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 156049408 unmapped: 15204352 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'counter schema' '{prefix=counter schema}'
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:23.538774+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 155885568 unmapped: 15368192 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:24.538909+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: prioritycache tune_memory target: 5709084876 mapped: 155910144 unmapped: 15343616 heap: 171253760 old mem: 4047415775 new mem: 4047415775
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: tick
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_tickets
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-20T10:06:25.539048+0000)
Feb 20 10:06:55 np0005625204.localdomain ceph-osd[32226]: do_command 'log dump' '{prefix=log dump}'
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3699056303' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1631189302' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4128328217' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4095727606' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1895372588' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2292524633' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2854468367' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3786798133' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/20682300' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4203312444' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4213092245' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1926924911' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3748941254' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/110035165' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 20 10:06:55 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/401735792' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3797019353' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain sudo[327731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 20 10:06:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:56.061 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4997-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:56.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:06:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:56.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:06:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:56.065 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:56 np0005625204.localdomain sudo[327731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:06:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:56.104 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:06:56 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:56.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:06:56 np0005625204.localdomain sudo[327731]: pam_unix(sudo:session): session closed for user root
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/504695397' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 20 10:06:56 np0005625204.localdomain sudo[327757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/a8557ee9-b55d-5519-942c-cf8f6172f1d8/cephadm.d4329ff0b58389a1c874427e6fa8cdadc2545079117c7744dd9edf4a3e4fc83f --timeout 895 gather-facts
Feb 20 10:06:56 np0005625204.localdomain sudo[327757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:06:56 np0005625204.localdomain rsyslogd[758]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/781692820' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2276765394' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:06:56 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 20 10:06:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:06:56 np0005625204.localdomain openstack_network_exporter[244414]: ERROR   10:06:56 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 20 10:06:56 np0005625204.localdomain openstack_network_exporter[244414]: 
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:06:56 np0005625204.localdomain sudo[327757]: pam_unix(sudo:session): session closed for user root
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2137022502' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/855344926' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: pgmap v746: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2340209825' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3944110738' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3797019353' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/504695397' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2751144286' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/57478616' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/980354851' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3624829222' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/781692820' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2276765394' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2137022502' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 20 10:06:56 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/855344926' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 20 10:06:57 np0005625204.localdomain sudo[327927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 20 10:06:57 np0005625204.localdomain sudo[327927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 20 10:06:57 np0005625204.localdomain sudo[327927]: pam_unix(sudo:session): session closed for user root
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1775975022' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2991465511' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.59371 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.50148 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.59377 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.50154 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.59383 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.50163 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.59389 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1775975022' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.50169 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2991465511' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.59395 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:57 np0005625204.localdomain ceph-mon[301857]: from='client.50175 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain systemd[1]: Starting Hostname Service...
Feb 20 10:06:58 np0005625204.localdomain systemd[1]: Started Hostname Service.
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.99044 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.99041 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: pgmap v747: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.59407 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.50187 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.99050 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2945838039' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2425338279' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.99065 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.50202 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.99059 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.99071 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/732875958' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 20 10:06:58 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/799342705' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 20 10:06:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 20 10:06:59 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1756905580' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 10:06:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "versions"} v 0)
Feb 20 10:06:59 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3900495989' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 20 10:06:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:59.759 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:06:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:59.790 281292 DEBUG nova.compute.manager [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Triggering sync for uuid f9924957-6cff-426e-9f03-c739820f4ff3 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Feb 20 10:06:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:59.791 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Acquiring lock "f9924957-6cff-426e-9f03-c739820f4ff3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 20 10:06:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:59.792 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 20 10:06:59 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:06:59.813 281292 DEBUG oslo_concurrency.lockutils [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Lock "f9924957-6cff-426e-9f03-c739820f4ff3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 20 10:06:59 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb 20 10:06:59 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/601954486' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.50214 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.59428 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.99083 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='mgr.44375 172.18.0.106:0/1082098019' entity='mgr.np0005625202.arwxwo' 
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.50220 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.59437 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1756905580' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/334263105' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3518226692' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.99101 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/2523254873' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3900495989' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/254070468' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/601954486' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2523455324' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:00 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:01.105 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:07:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:01.108 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 20 10:07:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:01.108 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Feb 20 10:07:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:01.109 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='client.99113 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: pgmap v748: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='client.99125 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2523455324' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1396299726' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/870806350' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 20 10:07:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:01.148 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 20 10:07:01 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:01.149 281292 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "config dump"} v 0)
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4258071969' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 20 10:07:01 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 20 10:07:02 np0005625204.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/550215609' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2330083557' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2330083557' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain podman[328640]: 2026-02-20 10:07:02.114952683 +0000 UTC m=+0.092033630 container health_status 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 20 10:07:02 np0005625204.localdomain podman[328640]: 2026-02-20 10:07:02.124667559 +0000 UTC m=+0.101748516 container exec_died 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': '6077946a4180028b328f149742d76066d281f4235c0c64d898675f3233fbaa4c-37161e50899fe4a243fd962c6b8b2b06663e1df0ab7abf060f791e8ff8fc1c67'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.59494 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.50283 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4258071969' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/931757883' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3176478051' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/4234861803' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1539358497' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/550215609' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2330083557' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.32:0/2330083557' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1165278192' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain systemd[1]: 8c5682a780a7e30fa685d876c79367b52ebfaee3d0b29d1dcefab4888dbd28e3.service: Deactivated successfully.
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "df"} v 0)
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3286986597' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 20 10:07:02 np0005625204.localdomain nova_compute[281288]: 2026-02-20 10:07:02.754 281292 DEBUG oslo_service.periodic_task [None req-383a1d2d-0e58-4d63-86dd-dc5a5989a7a2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "fs dump"} v 0)
Feb 20 10:07:02 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1711977834' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: from='client.99182 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: pgmap v749: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3168464906' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/3286986597' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/40887704' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/3478369784' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1711977834' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "fs ls"} v 0)
Feb 20 10:07:03 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4069418538' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 20 10:07:03 np0005625204.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 20 10:07:03 np0005625204.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 20 10:07:03 np0005625204.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 20 10:07:03 np0005625204.localdomain kernel: cfg80211: failed to load regulatory.db
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.59554 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.50325 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3951013327' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/4069418538' entity='client.admin' cmd={"prefix": "fs ls"} : dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/1950675391' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/3631475110' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/4178735613' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mds stat"} v 0)
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1310713204' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: mon.np0005625204@2(peon) e17 handle_command mon_command({"prefix": "mon dump"} v 0)
Feb 20 10:07:04 np0005625204.localdomain ceph-mon[301857]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2161530023' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: pgmap v750: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.99224 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.59578 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/1310713204' entity='client.admin' cmd={"prefix": "mds stat"} : dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.50343 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.107:0/1365962617' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.108:0/2161530023' entity='client.admin' cmd={"prefix": "mon dump"} : dispatch
Feb 20 10:07:05 np0005625204.localdomain ceph-mon[301857]: from='client.? 172.18.0.106:0/2898709264' entity='client.admin' cmd={"prefix": "osd blocklist ls"} : dispatch
